WO2005124687A1 - Method for marker tracking in optical motion capture system, optical motion capture method, and system - Google Patents

Method for marker tracking in optical motion capture system, optical motion capture method, and system Download PDF

Info

Publication number
WO2005124687A1
WO2005124687A1 · PCT/JP2005/010644 · JP2005010644W
Authority
WO
WIPO (PCT)
Prior art keywords
marker
position information
dimensional position
dimensional
motion capture
Prior art date
Application number
PCT/JP2005/010644
Other languages
French (fr)
Japanese (ja)
Inventor
Yoshihiko Nakamura
Katsu Yamane
Original Assignee
The University Of Tokyo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The University Of Tokyo filed Critical The University Of Tokyo
Priority to JP2006514717A priority Critical patent/JPWO2005124687A1/en
Publication of WO2005124687A1 publication Critical patent/WO2005124687A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/285Analysis of motion using a sequence of stereo image pairs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory

Definitions

  • the present invention relates to a marker tracking method in an optical motion capture system and an optical motion capture system.
  • Motion capture is a method of measuring the movements of humans and animals as numerical data, and the data obtained in this way is used in a wide range of fields such as biomechanics, humanoid motion generation, and CG animation.
  • There are various types of motion capture: for example, a wired type, in which a data-acquisition device is attached to the subject and coordinate data is recorded by transferring the data over a wire, and an optical type, in which light-reflecting markers are attached to the subject's head, neck, shoulders, elbows, wrists, and so on, and coordinate data is recorded by imaging the light reflected from the markers. From the viewpoint of imposing little restraint on the subject during motion, the so-called passive optical type is more advantageous than the wired type.
  • Another object of the present invention is, in an optical motion capture system, to obtain high-precision and high-speed data by using two types of cameras, a high-precision camera and a high-speed camera, so that each compensates for the other's shortcomings.
  • A first technical means adopted by the present invention is a marker tracking method in an optical motion capture system, comprising: a marker detection step of detecting two-dimensional position information of markers from the image acquired by each camera; a three-dimensional reconstruction step of obtaining three-dimensional position information of the markers using the two-dimensional position information (marker estimation vectors) of the markers detected by each camera; and a tracking step of searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information determined to belong to the same marker, wherein the marker detection step and/or the three-dimensional reconstruction step includes a step of judging the identity between a reference marker position carrying an ID and the two-dimensional position information (marker estimation vector) and/or the three-dimensional position information of a detected marker, and of assigning the same ID as that of the reference marker position to the two-dimensional position information (marker estimation vector) and/or the three-dimensional position information judged to be identical to the reference marker position.
  • A second technical means adopted by the present invention is an optical motion capture method comprising: a step of photographing a subject, to which markers are attached, with heterogeneous camera groups composed of cameras that differ in frame rate and resolution, and acquiring images; a marker detection step of detecting two-dimensional position information of the markers from the images acquired in each camera group; a three-dimensional reconstruction step of obtaining three-dimensional position information of the markers using the two-dimensional position information of the markers detected in each camera group; and an integration step of integrating the three-dimensional position information of the markers acquired by the respective camera groups, the integration step comprising a tracking step of searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information determined to belong to the same marker, and a step of interpolating the three-dimensional position information of markers carrying the same ID acquired by the respective camera groups.
  • A third technical means adopted by the present invention is an optical motion capture system comprising: photographing means for photographing a subject, to which markers are attached, with heterogeneous camera groups composed of cameras that differ in frame rate and resolution, and acquiring images; marker detection means for detecting two-dimensional position information of the markers from the images acquired in each camera group; three-dimensional reconstruction means for obtaining three-dimensional position information of the markers using the two-dimensional position information of the markers detected in each camera group; and integration means for integrating the three-dimensional position information of the markers acquired by the respective camera groups, the integration means comprising tracking means for searching the three-dimensional position information of the markers in time series and assigning the same ID to the three-dimensional position information of markers determined to be identical, and means for interpolating the three-dimensional position information of markers carrying the same ID acquired by the respective camera groups.
  • In one preferred aspect, the marker detection step assigns an ID by judging the identity between the two-dimensional coordinate value of a reference marker position carrying an ID and the two-dimensional position information of a detected marker.
  • Further preferably, the three-dimensional reconstruction step obtains the three-dimensional position information of a marker on the basis of two-dimensional position information to which the same ID has been assigned.
  • In another preferred aspect, the three-dimensional reconstruction step assigns an ID by judging the identity between a reference marker position carrying an ID and the three-dimensional position information of a reconstructed marker.
  • Whether markers are identical is judged from the distance between the position information of the detected marker, or of the three-dimensionally reconstructed marker, and the reference marker position.
  • In one preferred aspect, the reference marker position is a predicted marker position with the same ID, calculated on the basis of the measured marker positions with IDs obtained in the tracking step; the latest measured marker position with an ID obtained in the tracking step may also be used as the reference marker position.
  • In one preferred aspect, the three-dimensional reconstruction step includes a step of calculating the distances between a plurality of marker estimation vectors to which the same ID has been assigned in the marker detection step, and a step of judging whether a preset number or more of these vectors intersect at a distance equal to or less than a threshold value. If the judgment is affirmative, the three-dimensional position of the marker is calculated. If the judgment is negative, sets that intersect at a distance equal to or less than the threshold are searched for among the marker estimation vectors that were not used for calculating marker three-dimensional positions, and a new ID is assigned to the three-dimensional position of a marker constructed from the newly found marker estimation vectors.
  • the 3D position information of the marker with the same ID obtained by each camera group is interpolated.
  • the interpolation is for obtaining an interpolation polynomial for interpolating a marker trajectory composed of time-series three-dimensional position information of a marker to which the same ID is assigned over a plurality of frames measured by each camera group.
  • weighting is performed based on the average distance between marker estimation vectors at the time of three-dimensional reconstruction.
  • a marker position in a next frame of a camera of each camera group is predicted based on the interpolation.
  • An ID is assigned to each predicted marker position, and by providing the predicted marker position information together with its ID to the marker detection step and the three-dimensional reconstruction step, marker tracking (ID assignment) can also be performed in the marker detection step and the three-dimensional reconstruction step.
  • local tracking of markers can be performed at the stage of marker detection and three-dimensional reconstruction of marker positions.
  • the amount of calculation required for three-dimensional reconstruction can be reduced, and real-time processing becomes possible.
  • real-time labeling and joint angle calculation can be performed using the marker tracking function.
  • The spatial resolution of the high-speed cameras alone is low, but accuracy can be improved by also using the data from the high-precision cameras.
  • Low-precision data is corrected by using the distance between vectors as an accuracy index and determining the interpolation parameters on that basis; with a low-resolution camera, a one-pixel error shifts the vectors considerably, so the distance between vectors is generally large.
  • the present invention relates to a high-precision and high-speed motion capture system by mutually complementing different types of cameras.
  • The motion capture system includes a plurality of markers attached to a plurality of predetermined parts of a subject (head, neck, shoulders, elbows, wrists, and the like), imaging means for imaging the subject, and one or more computers electrically connected to the imaging means (in a state that allows data transfer); the marker image information acquired by the imaging means is transmitted to the computers, and the computers obtain the three-dimensional positions of the markers by processing the image information.
  • The computer connected to the imaging means comprises a storage unit that stores the image information acquired by the imaging means and the information calculated by the image processing unit, a display unit that displays the image information, and an image processing unit that applies image processing to the image information.
  • The imaging means of the motion capture system consists of a plurality of video cameras; by arranging the video cameras at different positions so as to surround the subject, moving images of the subject are acquired simultaneously from a plurality of angles.
  • The acquired moving images are, for example, time-series groups of still images at several tens to several hundreds of frames per second.
  • The imaging means consists of heterogeneous camera groups; in one preferred embodiment the heterogeneous camera groups consist of a high-precision camera group and a high-speed camera group.
  • Here, "high-precision camera" and "high-speed camera" refer to the two kinds of cameras when one has a higher resolution and a lower frame rate than the other: the high-resolution, low-frame-rate camera is called the high-precision camera and the low-resolution, high-frame-rate camera is called the high-speed camera.
  • In the experimental example described later, the cameras of group A have a resolution of 1004 x 1004 and a maximum frame rate of 50 fps, and the cameras of group B have a resolution of 512 x 512 and a maximum frame rate of 262 fps; the group A cameras are the high-precision cameras and the group B cameras are the high-speed cameras.
  • FIG. 1 is a diagram showing the configuration of a motion capture system using two types of camera groups (G1, G2).
  • The motion capture system according to the present invention consists of three processing blocks: marker detection processing, three-dimensional reconstruction processing, and integration processing. Each block performs the following processing.
  • In the marker detection step (C), markers are detected from the camera images (a minimal code sketch of this step is given after this list).
  • the three-dimensional reconstruction step (R) the three-dimensional position of the marker is calculated using the two-dimensional position information of the marker detected by the same type of camera group.
  • the integration step (I) data from a plurality of three-dimensional reconstruction blocks is integrated.
  • FIG. 1 shows two camera groups, but three or more camera groups may be used.
  • In the marker detection processing, markers are detected from the two-dimensional image captured by each camera.
  • Detection is performed by image-processing the images acquired by each camera, and a marker is detected as a coordinate in the two-dimensional image.
  • The position of each camera constituting each group is determined in advance, and a detected marker lies on a vector whose origin is the absolute position of the camera (the marker estimation vector).
  • the marker detection process is performed independently for each camera.
  • an intersection of a plurality of marker estimation vectors obtained from a plurality of cameras belonging to the same group at the same time is obtained, and the intersection is set as the three-dimensional position of the marker.
  • In practice, however, the marker estimation vectors do not intersect exactly because of camera pixel errors and calibration errors; therefore, sets of vectors whose mutual distance is small are searched for, and the midpoint of their common perpendicular is regarded as the intersection.
  • the three-dimensional reconstruction processing is performed independently for each camera group (G).
  • High precision data is obtained by using a high precision camera.
  • high-precision cameras usually have low frame rates, so the time interval between data is large.
  • use a high-speed camera to obtain high frame rate data.
  • However, the accuracy of the high-speed camera data is low, so using it as-is would degrade accuracy; therefore, by integrating the two kinds of data and correcting the low-precision data with an interpolation method, data that is both high-speed and high-precision can be obtained.
  • the high-speed camera plays a role of providing the data to be interpolated at a high frame rate to facilitate integration and to increase the frame rate of data finally obtained as an output of the system.
  • Integration means interpolating the marker positions measured by each camera group to obtain one continuous data stream; this makes it possible to correct the low-precision data with the high-precision data and to predict future marker positions.
  • the three-dimensional marker position data sent from the plurality of camera groups is received, the correspondence with the previously observed marker is checked, and a unique ID is assigned (tracking). Marker position data of already measured frames is divided for each ID and stored together with the interpolation formula.
  • the current predicted position is calculated using this interpolation formula, the correspondence is checked, and the corresponding ID is assigned.
  • an interpolation polynomial is calculated in consideration of the error at the time of three-dimensional reconstruction.
  • The error referred to here is the inter-vector distance of equation (7) described later; a large value means that the errors of the individual vectors are large and the reliability of the obtained three-dimensional position is low, so such data are given a small weight.
  • In this way an interpolation formula that approximates the high-accuracy data as a whole is calculated. Using this interpolation formula, the marker positions at the time a camera group performs its next measurement (next frame) are predicted, and the information is sent to the marker detection and three-dimensional reconstruction processing, so that tracking can also be performed at the detection and reconstruction stages. This enables real-time three-dimensional reconstruction and tracking, as described later.
  • Each marker (markers with the same ID) is interpolated by a polynomial in order to predict the marker position and to obtain data over a fixed time width after the end of the measurement. Specifically, when data measured at time t arrives, the coefficients of the polynomial interpolation function are determined using the data from time t − τ onward.
  • τ is a constant that determines the time width of the data used for the interpolation.
  • FIG. 3 shows an interpolation method.
  • Positions measured at times t_{m+i} (i = 0, 1, ..., n; t − τ < t_m < t_{m+1} < ... < t_{m+n} = t) are denoted p_{m+i}, and an interval polynomial f_m(t − t_m) is fitted between t_m and t_{m+1}; the interpolation conditions of equation (2) are collected into the linear system T a = u.
  • The coefficients are obtained as a = T^# u, where T^# is a weighted pseudo-inverse of T.
  • the weight w corresponding to the data at each time is calculated using the average distance d between the marker estimation vectors during 3D reconstruction.
  • In equation (7), α is a coefficient that corrects for the difference in camera frame rates and C is an arbitrary constant.
  • This reduces the weight of markers with a large average distance, that is, markers with a large error; in addition, for cameras with a high frame rate, α is generally made small to reduce the influence of individual data points.
  • The interpolation method is not limited to the one described here, nor is the method of determining the weights; any function that decreases monotonically with the distance may be used.
  • the marker tracking will be described.
  • The interpolation described above is made possible by tracking markers that can be judged to be identical across multiple frames and assigning them the same ID. Because marker tracking is performed in real time, local tracking is performed not only in the integration processing but also in the marker detection and reconstruction processing. Marker tracking is explained here on the basis of the heterogeneous-camera configuration.
  • the marker tracking method according to the present invention is effective even in a system using only one kind of camera.
  • Here the case in which a predicted marker position is adopted as the reference marker position carrying an ID is described, but the reference marker position is not limited to the predicted marker position; for example, the latest measured marker position with an ID obtained in the tracking step may be adopted as the reference marker position.
  • FIG. 2 shows a conceptual diagram of tracking using the marker estimated position (marker predicted position).
  • As shown in FIG. 2, the estimated marker positions obtained by the integration processing (white circles 1, 2 and 3, which carry IDs) are projected onto a camera image to obtain two-dimensional coordinate values. Local tracking of the detected markers is then performed by associating them with the markers actually detected. When the correspondence between a predicted marker position and an actually detected marker is confirmed, the ID is assigned to the detected marker. The detected markers (now carrying IDs) are three-dimensionally reconstructed to obtain marker positions, and the correspondence between the estimated marker positions and the three-dimensionally reconstructed marker positions is taken. These operations amount to the problem of finding the optimal assignment between the set of detected markers D and the set of estimated markers E.
  • E = φ indicates that there is no corresponding estimated marker, that is, a new marker; when such a marker appears, a new ID is assigned.
  • the score of the pair (D, E) is given as follows.
  • Here C is an arbitrary constant and d(D, E) is the distance between the detected marker D and the estimated marker E.
  • When E = φ, that is, when there is no corresponding estimated marker, d(D, E) is set to the maximum distance d_max at which a correspondence may still exist.
  • IDs are added to detected markers by three-dimensional reconstruction processing. Therefore, when D and E have the same ID, a certain score is added to the pair (D, E) so that the pair is searched preferentially.
  • The three-dimensional reconstruction will now be described. In the three-dimensional reconstruction of ordinary motion capture, pairs of close vectors must be searched for among a large number of marker estimation vectors, so the amount of computation is large.
  • In the present invention, by contrast, the markers detected by each camera have already been given IDs, so if all the IDs are correct the combinations of marker estimation vectors are known and no search is necessary.
  • In practice some IDs may be wrong, so distance calculations are used in combination to determine the sets of marker estimation vectors, and the three-dimensional positions of the markers are then calculated.
  • the search is performed as follows.
  • Step 1: For each estimated marker, check the distances between the marker estimation vectors that have the same ID. If n_min or more vectors intersect at a distance of d_max or less, a marker is assumed to exist and its three-dimensional position is computed.
  • Step 2: For the marker estimation vectors not used in Step 1, search for sets that intersect within the threshold distance, in the same manner as in ordinary motion capture. Markers found in this way are given new IDs.
  • the experimental equipment and conditions will be described.
  • a motion capture system with a total of 19 cameras was constructed.
  • The cameras belonging to group A are high-precision cameras with a resolution of 1004 x 1004 pixels and a maximum frame rate of 50 fps.
  • Group A consists of 10 high-precision cameras.
  • The cameras belonging to group B are high-speed cameras with a resolution of 512 x 512 pixels and a maximum frame rate of 262 fps.
  • Group B consists of nine high-speed cameras.
  • Marker detection is performed by one PC assigned to each camera, and one of the PCs in each camera group performs the three-dimensional reconstruction.
  • A Pentium IV 2 GHz machine was used for the integration processing.
  • the squat, walking, and kick movements were measured for each of the subjects with 10, 20, 30, and 40 markers attached, and the speed and tracking performance were evaluated.
  • the speed evaluation will be described.
  • the processing time of each step in the camera group and the integration processing is shown in Table 2 and Table 3, respectively.
  • Table 4 shows the average time taken to acquire one frame. For 10 to 30 markers, the one using group B alone is the fastest, while for 40 markers, the integrated system is the fastest. This is thought to be because the load of the integration processing becomes relatively smaller as the number of markers increases as compared to the reconstruction.
  • Table 2 Computation time for each step at the camera groups [ms].
  • Table 5 shows the percentage of markers that could be tracked to the end in each motion. In most cases the integrated system can track the most markers.
  • the movement of a human or an animal can be measured as numerical data, and the data obtained thereby can be used in a wide range of fields such as biomechanics, humanoid movement generation, and CG animation generation. Can be used.
  • FIG. 1 is a schematic diagram of a motion capture system using two types of camera groups.
  • FIG. 2 is a diagram illustrating local tracking in a marker detection process and a reconstruction process.
  • FIG. 3 is a diagram illustrating interpolation using a polynomial.
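
The marker detection step referred to in the list above can be pictured with the following minimal sketch, which thresholds a grayscale frame and reports each bright blob as a two-dimensional centroid. The threshold value and the use of scipy.ndimage are assumptions for illustration; the patent does not prescribe a particular detection algorithm.

```python
# Minimal sketch of the marker detection block (C): reflective markers appear as
# bright blobs, each reported as a 2-D centroid in image coordinates.
import numpy as np
from scipy import ndimage

def detect_marker_centroids(gray_image: np.ndarray, threshold: float = 200.0):
    """Return a list of (u, v) pixel coordinates, one per detected marker blob."""
    mask = gray_image >= threshold
    labels, n_blobs = ndimage.label(mask)
    centroids = ndimage.center_of_mass(mask, labels, range(1, n_blobs + 1))
    # center_of_mass returns (row, col); report as (u, v) = (col, row)
    return [(float(c), float(r)) for (r, c) in centroids]

# Example: a synthetic 64x64 frame with two bright markers
frame = np.zeros((64, 64))
frame[10:13, 20:23] = 255.0
frame[40:44, 50:54] = 255.0
print(detect_marker_centroids(frame))   # two centroids, near (21, 11) and (51.5, 41.5)
```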

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

In an optical motion capture system, highly accurate, high-speed data is obtained by using two types of cameras: a high-precision camera and a high-speed camera. The method includes: a step of imaging a subject carrying markers with camera groups composed of cameras that differ in frame rate and resolution, and acquiring images; a marker detection step of detecting two-dimensional position information of the markers from the images; a three-dimensional reconstruction step of acquiring three-dimensional position information of the markers using the detected two-dimensional position information; and a step of integrating the three-dimensional position information of the markers acquired by the respective camera groups. The integration step has a tracking step of searching the three-dimensional position information of the markers in time series and attaching the same ID to three-dimensional position information judged to belong to the same marker, and an interpolation step of interpolating the three-dimensional position information of markers to which the same ID is attached, acquired by the respective camera groups.

Description

Specification
Marker tracking method in optical motion capture system, optical motion capture method and system
Technical field
[0001] The present invention relates to a marker tracking method in an optical motion capture system and to an optical motion capture system.
Background art
[0002] Motion capture is a method of measuring the movements of humans and animals as numerical data, and the data obtained in this way is used in a wide range of fields such as biomechanics, humanoid motion generation, and CG animation. There are various types of motion capture: for example, a wired type, in which a data-acquisition device is attached to the subject and coordinate data is recorded by transferring the data over a wire, and an optical type, in which light-reflecting markers are attached to the subject's head, neck, shoulders, elbows, wrists, and so on, and coordinate data is recorded by imaging the light reflected from the markers. From the viewpoint of imposing little restraint on the subject during motion, the so-called passive optical type of motion capture is more advantageous than the wired type.
[0003] In passive optical motion capture, however, the amount of computation needed to obtain the three-dimensional positions of the markers from the images acquired by a plurality of cameras (three-dimensional reconstruction) is large, as is the amount of computation needed to associate each marker with a body part (labeling). In addition, because the data transfer rate from a camera to a computer is limited, a single camera cannot provide data that is both high-precision and high-speed.
Disclosure of the invention
Problems to be solved by the invention
[0004] An object of the present invention is to realize real-time marker tracking in an optical motion capture system by performing local tracking of markers at the marker detection and marker-position three-dimensional reconstruction stages.
[0005] Another object of the present invention is, in an optical motion capture system, to obtain high-precision and high-speed data by using two types of cameras, a high-precision camera and a high-speed camera, so that each compensates for the other's shortcomings.
Means for solving the problem
[0006] A first technical means adopted by the present invention is a marker tracking method in an optical motion capture system, comprising: a marker detection step of detecting two-dimensional position information of markers from the image acquired by each camera; a three-dimensional reconstruction step of obtaining three-dimensional position information of the markers using the two-dimensional position information (marker estimation vectors) of the markers detected by each camera; and a tracking step of searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information determined to belong to the same marker, wherein the marker detection step and/or the three-dimensional reconstruction step includes a step of judging the identity between a reference marker position carrying an ID and the two-dimensional position information (marker estimation vector) and/or the three-dimensional position information of a detected marker, and of assigning the same ID as that of the reference marker position to the two-dimensional position information (marker estimation vector) and/or the three-dimensional position information judged to be identical to the reference marker position.
[0007] A second technical means adopted by the present invention is an optical motion capture method comprising: a step of photographing a subject, to which markers are attached, with heterogeneous camera groups composed of cameras that differ in frame rate and resolution, and acquiring images; a marker detection step of detecting two-dimensional position information of the markers from the images acquired in each camera group; a three-dimensional reconstruction step of obtaining three-dimensional position information of the markers using the two-dimensional position information of the markers detected in each camera group; and an integration step of integrating the three-dimensional position information of the markers acquired by the respective camera groups, the integration step comprising a tracking step of searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information determined to belong to the same marker, and a step of interpolating the three-dimensional position information of markers carrying the same ID acquired by the respective camera groups.
[0008] A third technical means adopted by the present invention is an optical motion capture system comprising: photographing means for photographing a subject, to which markers are attached, with heterogeneous camera groups composed of cameras that differ in frame rate and resolution, and acquiring images; marker detection means for detecting two-dimensional position information of the markers from the images acquired in each camera group; three-dimensional reconstruction means for obtaining three-dimensional position information of the markers using the two-dimensional position information of the markers detected in each camera group; and integration means for integrating the three-dimensional position information of the markers acquired by the respective camera groups, the integration means comprising tracking means for searching the three-dimensional position information of the markers in time series and assigning the same ID to the three-dimensional position information of markers determined to be identical, and means for interpolating the three-dimensional position information of markers carrying the same ID acquired by the respective camera groups.
[0009] In one preferred aspect of the present invention, the marker detection step assigns an ID by judging the identity between the two-dimensional coordinate value of a reference marker position carrying an ID and the two-dimensional position information of a detected marker. Further preferably, the three-dimensional reconstruction step obtains the three-dimensional position information of a marker on the basis of two-dimensional position information to which the same ID has been assigned. In another preferred aspect, the three-dimensional reconstruction step assigns an ID by judging the identity between a reference marker position carrying an ID and the three-dimensional position information of a reconstructed marker.
[0010] Whether markers are identical is judged from the distance between the position information of the detected marker, or of the three-dimensionally reconstructed marker, and the reference marker position. In one preferred aspect, the reference marker position is a predicted marker position with the same ID, calculated on the basis of the measured marker positions with IDs obtained in the tracking step. The latest measured marker position with an ID obtained in the tracking step may also be used as the reference marker position.
[0011] In one preferred aspect, the three-dimensional reconstruction step includes a step of calculating the distances between a plurality of marker estimation vectors to which the same ID has been assigned in the marker detection step, and a step of judging whether a preset number or more of these vectors intersect at a distance equal to or less than a threshold value. If the judgment is affirmative, the three-dimensional position of the marker is calculated. If the judgment is negative, sets that intersect at a distance equal to or less than the threshold are searched for among the marker estimation vectors that were not used for calculating marker three-dimensional positions, and a new ID is assigned to the three-dimensional position of a marker constructed from the newly found marker estimation vectors.
[0012] In one preferred aspect, in motion capture in which a subject carrying markers is photographed by heterogeneous camera groups composed of cameras that differ in frame rate and resolution, the three-dimensional position information of markers carrying the same ID acquired by the respective camera groups is interpolated. The interpolation obtains an interpolation polynomial that interpolates a marker trajectory consisting of the time-series three-dimensional position information of a marker to which the same ID has been assigned over a plurality of frames measured by each camera group. Preferably, in calculating the coefficients of the polynomial, weighting is performed on the basis of the average distance between the marker estimation vectors at the time of three-dimensional reconstruction.
[0013] In a further preferred aspect, the marker position in the next frame of a camera of each camera group is predicted on the basis of the interpolation. An ID is assigned to each predicted marker position, and by providing the predicted marker position information together with its ID to the marker detection step and the three-dimensional reconstruction step, marker tracking (ID assignment) can also be performed in the marker detection step and the three-dimensional reconstruction step.
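As one illustration of this prediction step, the short sketch below evaluates a per-coordinate interval polynomial at the time of a camera's next frame. The function name and the use of numpy are assumptions made for illustration, not part of the patent.

```python
# Minimal sketch: predict a marker's position at the next frame time by evaluating
# the already-fitted interval polynomial of its trajectory (one polynomial per axis).
import numpy as np

def predict_position(coeffs_xyz, t_segment_start: float, t_next: float):
    """coeffs_xyz: three coefficient arrays (highest degree first) of the interval
    polynomial f_m for x, y, z; t_next is the time of the camera's next frame."""
    dt = t_next - t_segment_start
    return np.array([np.polyval(c, dt) for c in coeffs_xyz])

# Example: a marker moving on a straight line, fitted with degree-1 polynomials
coeffs = [np.array([0.5, 1.0]),    # x(dt) = 0.5*dt + 1.0
          np.array([0.0, 2.0]),    # y(dt) = 2.0
          np.array([-0.2, 0.3])]   # z(dt) = -0.2*dt + 0.3
print(predict_position(coeffs, t_segment_start=1.00, t_next=1.02))
```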
Effects of the invention
[0014] According to the present invention, local tracking of markers becomes possible at the marker detection and marker-position three-dimensional reconstruction stages. By using the tracking results obtained during marker detection, the amount of computation required for three-dimensional reconstruction can be reduced and real-time processing becomes possible. Furthermore, real-time labeling and joint-angle computation become possible by using the marker tracking function.
[0015] Integrating heterogeneous cameras has the following effects. First, with the high-precision cameras alone, real-time tracking is difficult because a marker moves a long distance between frames, but tracking becomes easy when the data from the high-speed cameras is used as well; prediction accuracy is then higher than when only the low-speed data is used, so the tracking accuracy (the probability that markers are matched correctly) improves. Second, the spatial resolution of the high-speed cameras alone is low, but accuracy can be improved by also using the data from the high-precision cameras. The low-precision data is corrected by using the distance between vectors as an accuracy index and determining the interpolation parameters on that basis; with a low-resolution camera, a one-pixel error shifts the vectors considerably, so the distance between vectors is generally large.
Best mode for carrying out the invention
[0016] [A] Overall configuration of the motion capture system
The present invention relates to a high-precision, high-speed motion capture system in which different types of cameras complement one another. The motion capture system includes a plurality of markers attached to a plurality of predetermined parts of a subject (head, neck, shoulders, elbows, wrists, and the like), imaging means for imaging the subject, and one or more computers electrically connected to the imaging means (in a state that allows data transfer); the marker image information acquired by the imaging means is transmitted to the computers, and the computers obtain the three-dimensional positions of the markers by processing the image information. The computer connected to the imaging means comprises a storage unit that stores the image information acquired by the imaging means and the information calculated by the image processing unit, a display unit that displays the image information, and an image processing unit that applies image processing to the image information.
[0017] The imaging means of the motion capture system consists of a plurality of video cameras. By arranging the video cameras at different positions so as to surround the subject, moving images of the subject are acquired simultaneously from a plurality of angles. The acquired moving images are, for example, time-series groups of still images at several tens to several hundreds of frames per second. In the present invention the imaging means consists of heterogeneous camera groups; in one preferred embodiment the heterogeneous camera groups consist of a high-precision camera group and a high-speed camera group.
[0018] Here, "high-precision camera" and "high-speed camera" refer to the two kinds of cameras when one has a higher resolution and a lower frame rate than the other: the high-resolution, low-frame-rate camera is called the high-precision camera and the low-resolution, high-frame-rate camera is called the high-speed camera. In the experimental example described later, the cameras of group A have a resolution of 1004 x 1004 and a maximum frame rate of 50 fps, and the cameras of group B have a resolution of 512 x 512 and a maximum frame rate of 262 fps; the group A cameras are the high-precision cameras and the group B cameras are the high-speed cameras.
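For illustration, the two camera groups just described could be represented as simple configuration data as in the following sketch. The dataclass and field names are assumptions; only the resolution and frame-rate values come from the text.

```python
# Minimal sketch of a camera-group configuration for the heterogeneous setup.
from dataclasses import dataclass

@dataclass(frozen=True)
class CameraGroup:
    name: str
    resolution: tuple        # (width, height) in pixels
    max_frame_rate: float    # frames per second

GROUP_A = CameraGroup("group A", (1004, 1004), 50.0)   # values from the text
GROUP_B = CameraGroup("group B", (512, 512), 262.0)

def is_high_precision(a: CameraGroup, b: CameraGroup) -> bool:
    """Per the text: the higher-resolution, lower-frame-rate group is the high-precision one."""
    more_pixels = a.resolution[0] * a.resolution[1] > b.resolution[0] * b.resolution[1]
    return more_pixels and a.max_frame_rate < b.max_frame_rate

print(is_high_precision(GROUP_A, GROUP_B))   # True: group A is the high-precision group
```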
[0019] FIG. 1 shows the configuration of a motion capture system using two types of camera groups (G1, G2). The motion capture system according to the present invention consists of three processing blocks: marker detection processing, three-dimensional reconstruction processing, and integration processing. Each block performs the following processing. In the marker detection step (C), markers are detected from the camera images. In the three-dimensional reconstruction step (R), the three-dimensional positions of the markers are calculated using the two-dimensional position information of the markers detected by the cameras of the same group. In the integration step (I), the data from the plural three-dimensional reconstruction blocks are integrated. Although FIG. 1 shows two camera groups, three or more camera groups may be used.
[0020] In the marker detection processing, markers are detected from the two-dimensional image captured by each camera. Detection is performed by image-processing the images acquired by each camera, and a marker is detected as a coordinate in the two-dimensional image. The position of each camera constituting each group is determined in advance, and a detected marker lies on a vector whose origin is the absolute position of the camera (the marker estimation vector). The marker detection processing is performed independently for each camera.
[0021] In the three-dimensional reconstruction processing, the intersection of the marker estimation vectors obtained at the same time from a plurality of cameras belonging to the same group is found, and this intersection is taken as the three-dimensional position of the marker. In practice, however, the marker estimation vectors do not intersect exactly because of camera pixel errors and calibration errors; therefore, sets of vectors whose mutual distance is small are searched for, and the midpoint of their common perpendicular is regarded as the intersection. The three-dimensional reconstruction processing is performed independently for each camera group (G).
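As an illustration of the reconstruction rule just described, the following sketch computes the midpoint of the common perpendicular of two marker estimation vectors and the distance between them. Function and variable names are assumptions, not taken from the patent.

```python
# Minimal sketch: two rays (camera center + direction through the detected 2-D
# marker) rarely intersect exactly; the midpoint of their common perpendicular is
# used as the 3-D marker position and the perpendicular's length as an error measure.
import numpy as np

def ray_midpoint(o1, d1, o2, d2):
    """o: camera center, d: unit direction of the marker estimation vector.
    Returns (midpoint, distance between the two rays)."""
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2))
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = o1 - o2
    denom = a * c - b * b
    if abs(denom) < 1e-12:           # near-parallel rays: project o1 onto ray 2
        s, t = 0.0, (d2 @ w) / c
    else:
        s = (b * (d2 @ w) - c * (d1 @ w)) / denom
        t = (a * (d2 @ w) - b * (d1 @ w)) / denom
    p1, p2 = o1 + s * d1, o2 + t * d2
    return (p1 + p2) / 2.0, float(np.linalg.norm(p1 - p2))

# Example: two rays constructed to pass (almost) through the point (0, 0, 5)
mid, dist = ray_midpoint([0, -1, 0], [0, 0.1961, 0.9806],
                         [1, 1, 0], [-0.1925, -0.1925, 0.9623])
print(mid, dist)   # midpoint near (0, 0, 5), distance near 0
```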
[0022] The integration processing is now described. High-precision data is obtained by using the high-precision cameras. However, because high-precision cameras usually have a low frame rate, the time interval between data points is large. To solve this, the high-speed cameras are used to obtain high-frame-rate data; but since the accuracy of the high-speed camera data is low, using it as-is would degrade accuracy. Therefore, by integrating the two kinds of data and correcting the low-precision data by an interpolation method, data that is both high-speed and high-precision can be obtained. The high-speed cameras play the role of providing the data on which the interpolation is based at a high frame rate, which makes the integration easier and raises the frame rate of the data finally produced as the output of the system. To integrate the three-dimensional position information obtained from the heterogeneous cameras, it is necessary to find out how the markers detected by camera group 1 correspond to the markers detected by camera group 2 (which data belong to the same marker), that is, to track the individual markers. Integration means interpolating the marker positions measured by each camera group to obtain one continuous data stream; this makes it possible to correct the low-precision data with the high-precision data and to predict future marker positions.
[0023] In the integration processing, the three-dimensional marker position data sent from the plural camera groups are received, the correspondence with previously observed markers is checked, and a unique ID is assigned (tracking). The marker position data of the frames already measured are stored separately for each ID, together with an interpolation formula. At the time of integration, the current predicted positions are calculated using these interpolation formulas, the correspondences are checked, and the corresponding IDs are assigned. For markers that could be tracked, an interpolation polynomial is calculated taking into account the error at the time of three-dimensional reconstruction. The error referred to here is the inter-vector distance of equation (7) described later; a large value means that the errors of the individual vectors are large and the reliability of the obtained three-dimensional position is low, so such data are given a small weight. In this way an interpolation formula that approximates the high-accuracy data as a whole is calculated. Using this interpolation formula, the marker positions at the time a camera group performs its next measurement (next frame) are predicted, and this information is sent to the marker detection and three-dimensional reconstruction processing, so that tracking can also be performed at the detection and reconstruction stages. This enables real-time three-dimensional reconstruction and tracking, as described later.
[0024] [B] Interpolation of marker trajectories
The interpolation and position-prediction processing that integrates the data from the heterogeneous cameras is now described. In order to predict marker positions and to obtain data over a fixed time width after the end of the measurement, the trajectory of each marker (markers carrying the same ID) is interpolated by a polynomial. Specifically, when data measured at time $t$ arrives, the coefficients of the polynomial interpolation function are determined using the data from time $t - \tau$ onward, where $\tau$ is a constant that determines the time width of the data used for the interpolation.
[0025] The interpolation method is shown in FIG. 3. Let $p_{m+i}$ denote the position measured at time $t_{m+i}$ ($i = 0, 1, \ldots, n$; $t - \tau < t_m < t_{m+1} < \cdots < t_{m+n} = t$), and find the interval polynomial $f_m(t - t_m)$ for the interval from $t_m$ to $t_{m+1}$. Requiring the marker trajectory to be once differentiable at $t_m$ gives

$$f_m(0) = p_m, \quad \dot{f}_m(0) = \dot{p}_m, \qquad \text{where } p_m = f_{m-1}(t_m - t_{m-1}), \; \dot{p}_m = \dot{f}_{m-1}(t_m - t_{m-1}) \tag{1}$$

Furthermore, since the polynomial must pass through each $p_{m+i}$ at time $t_{m+i}$, writing $\Delta t_i = t_{m+i} - t_m$,

$$f_m(\Delta t_i) = p_{m+i}, \quad i = 1, 2, \ldots, n \tag{2}$$

must be satisfied. To handle the case where the degree of the polynomial is lower than the number of points used for the interpolation, or where abnormal data has been measured, equation (2) need not be satisfied exactly. Writing $f_m(t)$ as a polynomial of degree $r$,

$$f_m(t) = a_0 t^{r} + a_1 t^{r-1} + \cdots + a_{r-1} t + a_r \tag{3}$$

From equation (1) it is clear that $a_r = p_m$ and $a_{r-1} = \dot{p}_m$, so equation (2) can be collected into a linear system in the remaining coefficients,

$$\begin{pmatrix} \Delta t_1^{r} & \Delta t_1^{r-1} & \cdots & \Delta t_1^{2} \\ \vdots & & & \vdots \\ \Delta t_n^{r} & \Delta t_n^{r-1} & \cdots & \Delta t_n^{2} \end{pmatrix} \begin{pmatrix} a_0 \\ \vdots \\ a_{r-2} \end{pmatrix} = \begin{pmatrix} p_{m+1} - \dot{p}_m \Delta t_1 - p_m \\ \vdots \\ p_{m+n} - \dot{p}_m \Delta t_n - p_m \end{pmatrix} \tag{4}$$

which can be written compactly as

$$T a = u \tag{5}$$

Solving this,

$$a = T^{\#} u \tag{6}$$

gives the polynomial coefficients, where $T^{\#}$ is a weighted pseudo-inverse of $T$. The weight $w_{m+i}$ corresponding to the data at each time is computed from the average distance $d_{m+i}$ between the marker estimation vectors at the time of three-dimensional reconstruction as

$$w_{m+i} = \frac{C}{\alpha\, d_{m+i} + 1} \tag{7}$$

where $\alpha$ is a coefficient that corrects for the difference in camera frame rates and $C$ is an arbitrary constant. In this way the weight of a marker with a large average distance, that is, a marker with a large error, becomes small; in addition, for cameras with a high frame rate, $\alpha$ is generally made small to reduce the influence of individual data points. The interpolation method is not limited to the one described here, nor is the method of determining the weights; any function that decreases monotonically with the distance may be used.
[0027] [C]マーカのトラッキング  [0027] [C] Marker tracking
マーカのトラッキングについて述べる。本システムでは、複数のフレームにわたって同 一と判断できるマーカをトラッキングし、同じ IDを付けることで前述の補間を可能にし ている。マーカトラッキングをリアルタイムで行うため、統合処理だけでなくマーカ検出 '再構成処理においてもローカルトラッキングを行う。マーカのトラッキングについて異 種カメラを用いたものに基づいて説明する力 本発明に係るマーカトラッキング方法 は、 1種類のカメラだけを用いるシステムでも有効である。また、ここでは、一つの好ま L ヽ態様として、マーカ予測位置を IDを備えた参照マーカ位置として採用する場合 について説明する力 本発明で採用され得る参照マーカ位置はマーカ予測位置に は限定されず、例えば、トラッキングステップで得られた最新の ID付き計測マーカ位 置を参照マーカ位置として採用してもよい。  The marker tracking will be described. In this system, the above-mentioned interpolation is enabled by tracking the marker that can be determined to be the same across multiple frames and assigning the same ID. Since marker tracking is performed in real time, local tracking is performed not only in integration processing but also in marker detection and reconstruction processing. Ability to Explain Marker Tracking Based on Using Different Kinds of Cameras The marker tracking method according to the present invention is effective even in a system using only one kind of camera. In addition, here, as one preferred aspect, a force that describes a case where a marker prediction position is adopted as a reference marker position having an ID is not limited to the marker prediction position. For example, the latest measurement marker position with ID obtained in the tracking step may be adopted as the reference marker position.
[0028] マーカ推定位置(マーカ予測位置)を用いたトラッキングの概念図を図 2に示す。図 2 に示すように、統合処理によって得られたマーカ推定位置(白丸 1, 2, 3であって、 I Dが付与されている)をカメラ画像に投影して二次元座標値を取得する。そして、実 際に検出されたマーカとの対応を取ることで検出されたマーカのローカルトラッキング を行う。予想されるマーカ推定位置と実際に検出されたマーカとの対応が確認される と、検出されたマーカに IDが付される。検出されたマーカ(IDが付されている)を三次 元再構成してマーカ位置を取得する。マーカ推定位置と三次元再構成されたマーカ 位置との対応を取る。これらの処理は、検出されたマーカの集合 D、 FIG. 2 shows a conceptual diagram of tracking using the marker estimated position (marker predicted position). Figure 2 As shown in (2), the marker estimation positions (white circles 1, 2, and 3 with IDs) obtained by the integration processing are projected on a camera image to obtain two-dimensional coordinate values. Then, local tracking of the detected marker is performed by associating with the actually detected marker. When the correspondence between the estimated marker estimated position and the actually detected marker is confirmed, an ID is assigned to the detected marker. The detected marker (with ID) is reconstructed in three dimensions to obtain the marker position. The correspondence between the estimated marker position and the three-dimensionally reconstructed marker position is taken. These operations consist of a set of detected markers D,
Given the set D of detected markers and the set E of estimated markers, these processes become the problem of finding the optimal assignment

[Equation 9]
{(D_1, E_{j_1}), (D_2, E_{j_2}), ..., (D_n, E_{j_n})},   E_{j_i} ∈ {φ, E_1, ..., E_m}

where E_{j_i} = φ means that there is no corresponding estimated marker, that is, a new marker; when such a marker appears, a new ID is assigned to it.
[0029] In this embodiment, the score of a pair (D_i, E_j) is given by

[Equation 10]
s(D_i, E_j) = C / ( d(D_i, E_j) + C )    (8)

where C is an arbitrary constant and d(D_i, E_j) is the distance between the detected marker D_i and the estimated marker E_j. When E_j = φ, that is, when there is no corresponding estimated marker, d(D_i, E_j) is set to d_max, the maximum distance over which a correspondence is considered possible. The sum of Equation (8) over all pairs is used as the evaluation function for the optimization. In this way, a globally optimal assignment can be searched for even when some of the markers are not detected.
[0030] In the tracking performed in the integration process, the detected markers have also already been given IDs by the three-dimensional reconstruction process. Therefore, when D_i and E_j have the same ID, a fixed score is added to the pair (D_i, E_j) so that this pair is searched preferentially.
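As an illustration of paragraphs [0029] and [0030], the following Python sketch builds a score matrix from the score C/(d + C) as reconstructed above, adds a dummy "new marker" column scored with d_max for each detection, adds a bonus for pairs that already share an ID, and solves the assignment globally with the Hungarian algorithm (SciPy's linear_sum_assignment). The use of the Hungarian algorithm and the constants C, d_max and id_bonus are assumptions of this sketch; the patent does not specify the search procedure.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_markers(detected, estimated, est_ids, det_ids=None,
                  C=0.01, d_max=0.05, id_bonus=0.5):
    """Assign detected 3D markers to estimated markers by maximizing the
    summed score; returns {detection index: matched ID or None (new marker)}."""
    nd, ne = len(detected), len(estimated)
    score = np.zeros((nd, ne + nd))
    for i, D in enumerate(detected):
        for j, E in enumerate(estimated):
            d = np.linalg.norm(np.asarray(D) - np.asarray(E))
            score[i, j] = C / (d + C)
            # Pairs that already carry the same ID are searched preferentially.
            if det_ids is not None and det_ids[i] == est_ids[j]:
                score[i, j] += id_bonus
        # Dummy column: "no corresponding estimate", scored with d_max.
        score[i, ne + i] = C / (d_max + C)
    rows, cols = linear_sum_assignment(-score)   # maximize the total score
    return {i: (est_ids[j] if j < ne else None) for i, j in zip(rows, cols)}
```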
[0031] [D] Three-dimensional reconstruction

Three-dimensional reconstruction is described next. In the three-dimensional reconstruction of ordinary motion capture, sets of marker estimation vectors that pass close to one another must be searched for among a large number of vectors, so the amount of computation is large. In the present invention, by contrast, IDs have already been assigned to the markers detected by each camera as described above, so if all IDs are correct the combinations of marker estimation vectors are known and no search is necessary. Since an incorrect ID may in practice have been assigned, the sets of marker estimation vectors are determined with the aid of the distance calculation, and the three-dimensional positions of the markers are then computed.
[0032] In a system with n cameras, three-dimensional reconstruction becomes the following problem: given the set D^k of markers detected by camera k and the set E of estimated markers, find multiple combinations

[Equation 11]
(E_j, D^1_{i_1}, D^2_{i_2}, ..., D^n_{i_n}),   E_j ∈ {E},   D^k_{i_k} ∈ {D^k}

where one marker corresponds to each combination. The search is performed as follows.
[0033] Step 1: For each estimated marker, the distances between the marker estimation vectors carrying the same ID are examined. If n_min or more vectors intersect within a distance of d_max or less, a marker is assumed to exist and its position is calculated.
[0034] Step 2: For the marker estimation vectors that were not used in Step 1, sets of vectors that intersect within a distance equal to or less than the threshold are searched for, as in ordinary motion capture. Markers found in this way are given new IDs.
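As an illustrative reading of Steps 1 and 2 (not a definitive implementation), the following Python sketch treats each marker estimation vector as a ray with a unit direction, accepts a same-ID group when at least n_min rays mutually pass within d_max of one another, and takes the marker position as the least-squares point closest to all rays. The pairwise distance test and the least-squares intersection are assumptions introduced here.

```python
import numpy as np
from itertools import combinations

def ray_distance(o1, d1, o2, d2):
    """Shortest distance between two rays given origins o and unit directions d."""
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < 1e-9:                   # nearly parallel rays
        return np.linalg.norm(np.cross(o2 - o1, d1))
    return abs(np.dot(o2 - o1, n)) / np.linalg.norm(n)

def intersect_rays(origins, dirs):
    """Point minimizing the summed squared distance to all rays."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in zip(origins, dirs):
        M = np.eye(3) - np.outer(d, d)             # projects off the ray direction
        A += M
        b += M @ o
    return np.linalg.lstsq(A, b, rcond=None)[0]

def reconstruct_marker(origins, dirs, n_min=3, d_max=0.005):
    """Step 1 for one ID: accept the marker only if enough rays nearly intersect."""
    pairs_ok = all(ray_distance(origins[i], dirs[i], origins[j], dirs[j]) <= d_max
                   for i, j in combinations(range(len(dirs)), 2))
    if len(dirs) >= n_min and pairs_ok:
        return intersect_rays(origins, dirs)
    return None   # Step 2: leave these vectors to the ordinary threshold search
```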
[0035] [E] Experiments

Experiments for evaluating the accuracy and speed of the system, and their results, are described next. The experimental setup and conditions are as follows. Using the two types of cameras shown in Table 1, a motion capture system consisting of 19 cameras in total was constructed. The cameras belonging to group A form the high-precision camera group, with a resolution of 1004 x 1004 pixels and a maximum frame rate of 50 fps; this group consists of 10 high-precision cameras. The cameras belonging to group B form the high-speed camera group, with a resolution of 532 x 512 pixels and a maximum frame rate of 262 fps; this group consists of 9 high-speed cameras. One PC for marker detection is assigned to each camera, and one PC in each camera group performs the three-dimensional reconstruction. A separate Pentium IV 2 GHz machine was used for the integration process. Squat, walking and kick motions were measured for a subject wearing 10, 20, 30 and 40 markers, and the speed and tracking performance were evaluated. In the following evaluation experiments, τ = 0.1 s and r = 4 were used as the interpolation parameters, the minimum number of vectors for three-dimensional reconstruction was n_min = 3, and the thresholds were d_max = 0.005 m and d_max = 0.008 m for groups A and B, respectively.
[Table 1] Table 1: Specifications of the cameras used for the experiments.
Speed evaluation is described first. The processing times of each step in the camera groups and in the integration process are shown in Table 2 and Table 3, respectively, and the average time required to acquire one frame is shown in Table 4. For 10 to 30 markers, the configuration using group B alone is the fastest, whereas for 40 markers the integrated system is the fastest. This is considered to be because, as the number of markers increases, the load of the integration process becomes relatively small compared with that of the reconstruction.
[Table 2] Table 2: Computation time for each step at the camera groups [ms].
[Table 3] Table 3: Computation time for each step at the integration process [ms].
[Table 4] Table 4: Average time per frame for various combinations of …
Tracking performance evaluation is described next. Table 5 shows the ratio of markers that could be tracked to the end of each motion. In most cases the integrated system tracked the largest number of markers.
[Table 5] Table 5: Ratio of markers tracked throughout the motion.
[0038] Real-time joint angle computation is described next. Since this system tracks the markers in real time, real-time labeling and joint angle computation become possible once the correspondence between marker IDs and labels has been established for the initial frame. A GUI for labeling the initial frame was developed, and real-time joint angle computation for an arbitrary marker arrangement was realized using the flexible inverse kinematics algorithm UTPoser (Yamane and Nakamura, "A cooperative structured interface for generating whole-body motions of human figures," Journal of the Robotics Society of Japan, vol. 20, no. 3, pp. 113-121, 2002). Since the inverse kinematics computation takes 70-80 ms, the human figure on the screen was observed to move with a time delay.
[0039] The experiments using the two types of cameras confirmed that high-rate data can be obtained and that the markers can be tracked with a high probability. Real-time labeling and joint angle computation were also realized by making use of the marker tracking function.
Industrial Applicability

[0040] According to the present invention, the movements of humans and animals can be measured as numerical data, and the data obtained in this way can be used in a wide range of fields such as biomechanics, humanoid motion generation, and the generation of CG animation.
Brief Description of the Drawings

[FIG. 1] A schematic diagram of a motion capture system using two types of camera groups.
[FIG. 2] A diagram showing local tracking in the marker detection process and the reconstruction process.
[FIG. 3] A diagram showing interpolation by a polynomial.

Claims

[1] A method for tracking markers in an optical motion capture system, the method comprising:
a marker detection step of detecting two-dimensional position information of markers from the images acquired by each camera;
a three-dimensional reconstruction step of acquiring three-dimensional position information of the markers using the two-dimensional position information of the markers detected by each camera; and
a tracking step of searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information of markers determined to be identical,
wherein the marker detection step and/or the three-dimensional reconstruction step includes a step of determining identity between a reference marker position provided with an ID and the two-dimensional and/or three-dimensional position information of a detected marker, and assigning the same ID as that of the reference marker position to the two-dimensional and/or three-dimensional position information determined to be identical to the reference marker position.

[2] The marker tracking method according to claim 1, wherein the marker detection step assigns the ID by determining identity between a two-dimensional coordinate value of the ID-assigned reference marker position and the two-dimensional position information of the detected marker.

[3] The marker tracking method according to claim 2, wherein the three-dimensional reconstruction step acquires the three-dimensional position information of a marker based on two-dimensional position information to which the same ID has been assigned.

[4] The marker tracking method according to any one of claims 1 to 3, wherein the three-dimensional reconstruction step assigns the ID by determining identity between the ID-assigned reference marker position and the three-dimensional position information of the reconstructed marker.

[5] The marker tracking method according to any one of claims 1 to 4, wherein whether markers are identical is determined from the distance between the position information of a detected marker or of a three-dimensionally reconstructed marker and the reference marker position.
[6] The marker tracking method according to any one of claims 1 to 5, wherein the reference marker position is a predicted marker position carrying the same ID, calculated based on the ID-assigned measured marker positions obtained in the tracking step.

[7] The marker tracking method according to any one of claims 1 to 5, wherein the reference marker position is the latest ID-assigned measured marker position obtained in the tracking step.

[8] The marker tracking method according to any one of claims 1 to 7, wherein the three-dimensional reconstruction step includes:
a step of calculating distances between a plurality of marker estimation vectors to which the same ID has been assigned in the marker detection step; and
a step of determining whether a predetermined number or more of the vectors intersect within a distance equal to or less than a threshold.

[9] The marker tracking method according to claim 8, wherein, when the determination is affirmative, the three-dimensional position of the marker is calculated.

[10] The marker tracking method according to claim 8, wherein, when the determination is negative, sets of marker estimation vectors that were not used for calculating the three-dimensional position of a marker and that intersect within a distance equal to or less than the threshold are searched for, and a new ID is assigned to the three-dimensional position of the marker formed by a newly found set of marker estimation vectors.
[11] The method according to any one of claims 1 to 10, wherein
the method acquires images by photographing a subject provided with markers using heterogeneous camera groups composed of cameras that differ in frame rate and resolution,
the marker detection step and the three-dimensional reconstruction step are performed for each camera group, and
the tracking step includes an integration step of integrating, in time series, the three-dimensional position information of the markers carrying the same ID acquired by each camera group.

[12] The marker tracking method according to claim 11, wherein the heterogeneous camera groups include a high-precision camera group and a high-speed camera group.

[13] The marker tracking method according to claim 11 or 12, wherein the integration step includes a step of obtaining an interpolation polynomial that interpolates a marker trajectory consisting of the time-series three-dimensional position information of a marker carrying the same ID over a plurality of frames measured by each camera group.

[14] The marker tracking method according to claim 10, wherein the interpolation is performed by the following polynomial:

[Equation 1]
f_m(t) = a_0 t^r + a_1 t^{r-1} + ... + a_{r-1} t + a_r

where f_m(t) is the interpolation formula for the interval from time t_m to t_{m+1}, and a_0, ..., a_r are the coefficients of the polynomial.

[15] The marker tracking method according to claim 14, wherein, in calculating the coefficients of the polynomial, weighting is performed based on the average distance between the marker estimation vectors at the time of three-dimensional reconstruction.

[16] The marker tracking method according to any one of claims 13 to 15, wherein a marker position in the next frame is predicted using the interpolation polynomial, and the predicted marker position is used as the reference marker position.
[17] An optical motion capture method comprising:
a step of acquiring images by photographing a subject provided with markers using heterogeneous camera groups composed of cameras that differ in frame rate and resolution;
a marker detection step of detecting two-dimensional position information of the markers from the images acquired by each camera group;
a three-dimensional reconstruction step of acquiring three-dimensional position information of the markers using the two-dimensional position information of the markers detected by each camera group; and
an integration step of integrating the three-dimensional position information of the markers acquired by each camera group,
wherein the integration step includes:
a tracking step of searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information of markers determined to be identical; and
a step of interpolating the three-dimensional position information of the markers carrying the same ID acquired by each camera group.

[18] The motion capture method according to claim 17, wherein the heterogeneous camera groups include a high-precision camera group and a high-speed camera group.

[19] The motion capture method according to claim 17 or 18, wherein the interpolation step obtains an interpolation polynomial that interpolates a marker trajectory consisting of the time-series three-dimensional position information of a marker carrying the same ID over a plurality of frames measured by each camera group.

[20] The motion capture method according to claim 18, wherein the interpolation is performed by the following polynomial:

[Equation 2]
f_m(t) = a_0 t^r + a_1 t^{r-1} + ... + a_{r-1} t + a_r

where f_m(t) is the interpolation formula for the interval from time t_m to t_{m+1}, and a_0, ..., a_r are the coefficients of the polynomial.

[21] The motion capture method according to claim 20, wherein, in calculating the coefficients of the polynomial, weighting is performed based on the average distance between the marker estimation vectors at the time of three-dimensional reconstruction.

[22] The motion capture method according to any one of claims 17 to 21, comprising a step of predicting, based on the interpolation, a marker position in the next frame of a camera of each camera group.
[23] The motion capture method according to claim 22, wherein an ID is assigned to the predicted marker position, and the marker detection step and/or the three-dimensional reconstruction step includes a step of determining identity between the ID-assigned predicted marker position and the two-dimensional and/or three-dimensional position information of a detected marker, and assigning the same ID as that of the predicted marker position to the two-dimensional and/or three-dimensional position information determined to be identical to the predicted marker position.

[24] The motion capture method according to claim 23, wherein the marker detection step assigns the ID by determining identity between a two-dimensional coordinate value of the ID-assigned predicted marker position and the two-dimensional position information of the detected marker.

[25] The motion capture method according to claim 24, wherein the three-dimensional reconstruction step acquires the three-dimensional position information of a marker based on two-dimensional position information to which the same ID has been assigned.

[26] The motion capture method according to any one of claims 23 to 25, wherein the three-dimensional reconstruction step assigns the ID by determining identity between the ID-assigned predicted marker position and the three-dimensional position information of the reconstructed marker.

[27] The motion capture method according to any one of claims 23 to 26, wherein whether markers are identical is determined from the distance between a detected or reconstructed marker and the predicted marker position.

[28] The motion capture method according to any one of claims 23 to 27, wherein the three-dimensional reconstruction step includes:
a step of calculating distances between a plurality of marker estimation vectors to which the same ID has been assigned in the marker detection step; and
a step of determining whether a predetermined number or more of the vectors intersect within a distance equal to or less than a threshold.

[29] The motion capture method according to claim 28, wherein, when the determination is affirmative, the three-dimensional position of the marker is calculated.

[30] The motion capture method according to claim 28, wherein, when the determination is negative, sets of marker estimation vectors that were not used for calculating the three-dimensional position of a marker and that intersect within a distance equal to or less than the threshold are searched for, and a new ID is assigned to the three-dimensional position of the marker formed by a newly found set of marker estimation vectors.

[31] The motion capture method according to any one of claims 17 to 30, wherein the IDs assigned to the markers in the tracking step are associated in advance with predetermined parts of the subject, and the method includes a step of performing real-time labeling.

[32] The motion capture method according to claim 31, wherein joint angle computation is performed in real time based on the real-time labeling.
[33] An optical motion capture system comprising:
photographing means for acquiring images by photographing a subject provided with markers using heterogeneous camera groups composed of cameras that differ in frame rate and resolution;
marker detection means for detecting two-dimensional position information of the markers from the images acquired by each camera group;
three-dimensional reconstruction means for acquiring three-dimensional position information of the markers using the two-dimensional position information of the markers detected by each camera group; and
integration means for integrating the three-dimensional position information of the markers acquired by each camera group,
wherein the integration means includes:
tracking means for searching the three-dimensional position information of the markers in time series and assigning the same ID to three-dimensional position information of markers determined to be identical; and
means for interpolating the three-dimensional position information of the markers carrying the same ID acquired by each camera group.

[34] The motion capture system according to claim 33, wherein the heterogeneous camera groups include a high-precision camera group and a high-speed camera group.

[35] The motion capture system according to claim 33 or 34, wherein the interpolation means obtains an interpolation polynomial that interpolates a marker trajectory consisting of the time-series three-dimensional position information of a marker carrying the same ID over a plurality of frames measured by each camera group.

[36] The motion capture system according to claim 35, wherein the interpolation is performed by the following polynomial:

[Equation 3]
f_m(t) = a_0 t^r + a_1 t^{r-1} + ... + a_{r-1} t + a_r

where f_m(t) is the interpolation formula for the interval from time t_m to t_{m+1}, and a_0, ..., a_r are the coefficients of the polynomial.

[37] The motion capture method according to claim 36, wherein, in calculating the coefficients of the polynomial, weighting is performed based on the average distance between the marker estimation vectors at the time of three-dimensional reconstruction.

[38] The motion capture system according to any one of claims 33 to 37, comprising means for predicting, based on the interpolation, a marker position in the next frame of a camera of each camera group.
[39] The motion capture system according to claim 38, wherein an ID is assigned to the predicted marker position, and the marker detection means and/or the three-dimensional reconstruction means is configured to determine identity between the ID-assigned predicted marker position and the two-dimensional and/or three-dimensional position information of a detected marker, and to assign the same ID as that of the predicted marker position to the two-dimensional and/or three-dimensional position information determined to be identical to the predicted marker position.

[40] The motion capture system according to claim 39, wherein the marker detection means assigns the ID by determining identity between a two-dimensional coordinate value of the ID-assigned predicted marker position and the two-dimensional position information of the detected marker.

[41] The motion capture system according to claim 40, wherein the three-dimensional reconstruction means acquires the three-dimensional position information of a marker based on two-dimensional position information to which the same ID has been assigned.

[42] The motion capture system according to any one of claims 39 to 41, wherein the three-dimensional reconstruction means assigns the ID by determining identity between the ID-assigned predicted marker position and the three-dimensional position information of the reconstructed marker.

[43] The motion capture system according to any one of claims 39 to 42, wherein whether markers are identical is determined from the distance between a detected or reconstructed marker and the predicted marker position.

[44] The motion capture system according to any one of claims 39 to 43, wherein the three-dimensional reconstruction means includes:
means for calculating distances between a plurality of marker estimation vectors to which the same ID has been assigned by the marker detection means; and
means for determining whether a predetermined number or more of the vectors intersect within a distance equal to or less than a threshold.

[45] The motion capture system according to claim 44, wherein, when the determination is affirmative, the three-dimensional position of the marker is calculated.

[46] The motion capture system according to claim 44, wherein, when the determination is negative, sets of marker estimation vectors that were not used for calculating the three-dimensional position of a marker and that intersect within a distance equal to or less than the threshold are searched for, and a new ID is assigned to the three-dimensional position of the marker formed by a newly found set of marker estimation vectors.

[47] The motion capture system according to any one of claims 39 to 46, wherein the IDs assigned to the markers by the tracking means are associated in advance with predetermined parts of the subject, and the system includes means for performing real-time labeling.

[48] The motion capture system according to claim 47, wherein joint angle computation is performed in real time based on the real-time labeling.
PCT/JP2005/010644 2004-06-16 2005-06-10 Method for marker tracking in optical motion capture system, optical motion capture method, and system WO2005124687A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006514717A JPWO2005124687A1 (en) 2004-06-16 2005-06-10 Marker tracking method in optical motion capture system, optical motion capture method and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-178001 2004-06-16
JP2004178001 2004-06-16

Publications (1)

Publication Number Publication Date
WO2005124687A1 true WO2005124687A1 (en) 2005-12-29

Family

ID=35509927

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/010644 WO2005124687A1 (en) 2004-06-16 2005-06-10 Method for marker tracking in optical motion capture system, optical motion capture method, and system

Country Status (2)

Country Link
JP (1) JPWO2005124687A1 (en)
WO (1) WO2005124687A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11361466B2 (en) 2018-11-30 2022-06-14 Casio Computer Co., Ltd. Position information acquisition device, position information acquisition method, recording medium, and position information acquisition system
JP7006714B2 (en) 2020-03-23 2022-01-24 カシオ計算機株式会社 Positioning system, position measuring device, position measuring method and program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002008043A (en) * 2000-06-16 2002-01-11 Matsushita Electric Ind Co Ltd Device and method for analyzing action


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007187524A (en) * 2006-01-12 2007-07-26 Shimadzu Corp Magnetic mapping device
JP2007315968A (en) * 2006-05-26 2007-12-06 Univ Of Tokyo Method and device for estimating kinematics parameter using motion capture
JP2008026265A (en) * 2006-07-25 2008-02-07 Shimadzu Corp Head motion tracker system
JP2008027362A (en) * 2006-07-25 2008-02-07 Shimadzu Corp Motion tracker device
JP4656016B2 (en) * 2006-07-25 2011-03-23 株式会社島津製作所 Motion tracker device
JP2011503673A (en) 2006-11-01 2011-01-27 ソニー株式会社 Segment tracking in motion pictures
US8330796B2 (en) 2006-11-22 2012-12-11 3D International Europe Gmbh Arrangement and method for the recording and display of images of a scene and/or an object
JP2008275340A (en) * 2007-04-25 2008-11-13 Canon Inc Apparatus and method for processing information
JP2010038707A (en) * 2008-08-05 2010-02-18 Shimadzu Corp Motion tracker device
ES2319087A1 (en) * 2008-11-03 2009-07-02 Universidad De Cordoba System and method of capture, processing and representation of tridimensional localization in real time of an optical signal. (Machine-translation by Google Translate, not legally binding)
JP2014127208A (en) * 2012-12-26 2014-07-07 Ricoh Co Ltd Method and apparatus for detecting object
US11308645B2 (en) * 2017-05-12 2022-04-19 The Board Of Trustees Of The Leland Stanford Junior University Apparatus and method for wide-range optical tracking during medical imaging
US11582396B2 (en) 2020-03-24 2023-02-14 Casio Computer Co., Ltd. Information processing device, information processing method, and recording medium

Also Published As

Publication number Publication date
JPWO2005124687A1 (en) 2008-04-17

Similar Documents

Publication Publication Date Title
WO2005124687A1 (en) Method for marker tracking in optical motion capture system, optical motion capture method, and system
CN108765498B (en) Monocular vision tracking, device and storage medium
US20180066934A1 (en) Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium
JP5746477B2 (en) Model generation device, three-dimensional measurement device, control method thereof, and program
US7092109B2 (en) Position/orientation measurement method, and position/orientation measurement apparatus
US7922652B2 (en) Endoscope system
JP7018566B2 (en) Image pickup device, image processing method and program
KR100855657B1 (en) System for estimating self-position of the mobile robot using monocular zoom-camara and method therefor
US20110249117A1 (en) Imaging device, distance measuring method, and non-transitory computer-readable recording medium storing a program
JP6862211B2 (en) Systems and methods for identifying joint positions based on sensor data analysis
US20130230235A1 (en) Information processing apparatus and information processing method
KR20150119337A (en) Generation of 3d models of an environment
JP2021105887A (en) Three-dimensional pose obtaining method and device
JP6078470B2 (en) Golf club deformation measuring system, measuring method and measuring apparatus
CN108171728B (en) Markless moving object posture recovery method and device based on hybrid camera system
JP2009244929A (en) Tracking processing apparatus, tracking processing method, and program
JP2007319938A (en) Robot device and method of obtaining three-dimensional shape of object
JP6922348B2 (en) Information processing equipment, methods, and programs
CN107945166B (en) Binocular vision-based method for measuring three-dimensional vibration track of object to be measured
CN110428461B (en) Monocular SLAM method and device combined with deep learning
JP6924455B1 (en) Trajectory calculation device, trajectory calculation method, trajectory calculation program
JP5901379B2 (en) Imaging apparatus calibration method and image composition apparatus
CN113272864A (en) Information processing apparatus, information processing method, and program
AU2020480103B2 (en) Object three-dimensional localizations in images or videos
US11625836B2 (en) Trajectory calculation device, trajectory calculating method, and trajectory calculating program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2006514717

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase