JP2005084817A - Three-dimensional object attitude operation method and program - Google Patents

Three-dimensional object attitude operation method and program

Info

Publication number
JP2005084817A
JP2005084817A (application number JP2003314335A)
Authority
JP
Japan
Prior art keywords
posture
dimensional
virtual work
work area
dimensional object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2003314335A
Other languages
Japanese (ja)
Other versions
JP3953450B2 (en)
Inventor
Takaichi Hiraga
高市 平賀
Kenichi Arakawa
賢一 荒川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP2003314335A priority Critical patent/JP3953450B2/en
Publication of JP2005084817A publication Critical patent/JP2005084817A/en
Application granted granted Critical
Publication of JP3953450B2 publication Critical patent/JP3953450B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Processing Or Creating Images (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PROBLEM TO BE SOLVED: To manipulate the posture of a three-dimensional object in real time at low computational cost. SOLUTION: When the three-dimensional position 100 of a fingertip, initially outside a virtual work area 130, enters the virtual work area 130, the operation mode transitions to a posture operation mode. At that moment, the unit vector 141 from a reference point 102 toward the contact point 140 between the three-dimensional position 100 of the fingertip and the boundary surface 131 of the virtual work area 130, and the posture of a three-dimensional object 210, are stored. During posture operation, the posture of the three-dimensional object 210 stored at the transition to the posture operation mode is rotated about an axis 143 by the angle indicated by 144, and the result is taken as the current posture of the three-dimensional object 210. COPYRIGHT: (C)2005,JPO&NCIPI

Description

本発明は3次元オブジェクトの姿勢を操作する方法に関する。   The present invention relates to a method for manipulating the attitude of a three-dimensional object.

上記の技術に関連する特許出願として、特許文献1があるが、この特許出願では、操作者が指示する3次元位置や方向を検出するだけであり、対象物の姿勢を操作する方法については述べられていない。また、特許文献2があるが、様々な姿勢の手先画像をテンプレートとしたマッチング方式であるため、検出される姿勢は離散的であり、かつ計算コストがかかる。
特開平10−326148号公報 特開平8−076917号公報
Patent Document 1 is a patent application related to the above technique, but it only detects the three-dimensional position and direction indicated by the operator; it does not describe a method for manipulating the posture of an object. Patent Document 2 uses template matching against hand images in various postures, so the detected postures are discrete and the computational cost is high.
JP-A-10-326148 JP-A-8-076917

掌や拳などの姿勢を検出し、この姿勢情報により3次元オブジェクトの姿勢を操作することは原理的には可能であるが、掌や拳などの姿勢を十分な精度で検出するためには計算コストがかかり実時間での処理は不可能であり、実用的ではなく、逆に実時間での処理を可能にするには、掌や拳などの姿勢検出の精度をおとさなければならないが、検出される姿勢の離散度が大きく、実用的ではない。   It is possible in principle to detect the posture of a palm or fist and use that posture information to manipulate the posture of a three-dimensional object. However, detecting such postures with sufficient accuracy is computationally expensive, making real-time processing impossible and the approach impractical; conversely, enabling real-time processing requires lowering the accuracy of the posture detection, which makes the detected postures too discrete to be practical.

本発明の目的は、計算機内の3次元オブジェクトの姿勢を実時間で操作する3次元オブジェクト姿勢操作方法およびプログラムを提供することにある。   An object of the present invention is to provide a three-dimensional object posture operation method and program for manipulating the posture of a three-dimensional object in a computer in real time.

本発明の第1の態様によれば、3次元オブジェクト姿勢操作方法は、
固定された基準点を含む、空間座標系に固定された仮想作業領域を設定し、
動作モードが、空間内の、指先もしくは簡易なマーカーを取り付けた指先などの3次元位置が仮想作業領域の外部にあるアイドルモードから、3次元位置が仮想作業領域の境界面上および内部にある姿勢操作モードに遷移したときに、3次元位置と仮想作業領域の境界面との交点の位置と、3次元オブジェクトの姿勢と、前記3次元位置と前記交点と前記基準点とを含む平面に垂直で、かつ前記基準点を通る直線とを記憶し、
動作モードが姿勢操作モードである場合に限り、前記直線を軸として、前記基準点から前記交点に向かう単位ベクトルを前記基準点から3次元位置に向かう単位ベクトルに重ね合わせる回転変換を前記3次元オブジェクトの姿勢に施し、その結果の姿勢を現在の3次元オブジェクト姿勢とする。
According to the first aspect of the present invention, a three-dimensional object posture operation method comprises:
setting a virtual work area fixed in a spatial coordinate system and containing a fixed reference point;
when the operation mode transitions from an idle mode, in which the three-dimensional position of a fingertip (or of a fingertip fitted with a simple marker) in space is outside the virtual work area, to a posture operation mode, in which the three-dimensional position is on or inside the boundary surface of the virtual work area, storing the position of the intersection of the three-dimensional position with the boundary surface of the virtual work area, the posture of the three-dimensional object, and the straight line that passes through the reference point and is perpendicular to the plane containing the three-dimensional position, the intersection, and the reference point; and
only while the operation mode is the posture operation mode, applying to the stored posture of the three-dimensional object the rotation about that straight line which superimposes the unit vector from the reference point toward the intersection onto the unit vector from the reference point toward the three-dimensional position, and taking the resulting posture as the current posture of the three-dimensional object.
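As an illustrative sketch (not part of the patent text), the rotation described above, which superimposes the unit vector toward the intersection onto the unit vector toward the current three-dimensional position about the line through the reference point, can be computed with Rodrigues' rotation formula. A minimal NumPy sketch, assuming the posture is stored as a 3x3 rotation matrix; all names are illustrative:

```python
import numpy as np

def rotation_between(u1, u2):
    """Rotation matrix that maps unit vector u1 onto unit vector u2,
    about the axis perpendicular to both (Rodrigues' formula)."""
    u1 = np.asarray(u1, float)
    u2 = np.asarray(u2, float)
    axis = np.cross(u1, u2)
    s = np.linalg.norm(axis)        # sin(theta)
    c = float(np.dot(u1, u2))       # cos(theta)
    if s < 1e-12:                   # (anti)parallel vectors: no unique axis
        return np.eye(3)
    k = axis / s                    # unit rotation axis
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + s * K + (1 - c) * (K @ K)

# While in posture operation mode the current posture would then be:
#   current_posture = rotation_between(u_intersection, u_fingertip) @ stored_posture
```

The stored posture is rotated fresh from the saved state on every update, rather than incrementally, which matches the claim wording and avoids drift.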

本発明の第2の態様によれば、3次元オブジェクト姿勢操作方法は、
前記仮想作業領域に加え、該仮想作業領域に含まれかつ前記空間に固定された内部仮想作業領域を設定し、
動作モードが、前記3次元位置が前記仮想作業領域の外部にあるアイドルモードから、3次元位置が内部仮想作業領域の内部に入って、もしくはその境界面に接して、姿勢操作モードに遷移したときに、3次元位置と内部仮想作業領域の境界面との交点の位置と、3次元オブジェクトの姿勢と、3次元位置と前記交点と前記基準点とを含む平面に垂直でかつ前記基準点を通る直線とを記憶し、
動作モードが姿勢操作モードである場合に限り、前記直線を軸として、前記基準点から前記交点に向かう単位ベクトルを前記基準点から前記3次元位置に向かう単位ベクトルに重ね合わせる回転変換を前記3次元オブジェクトの姿勢に施し、その結果の姿勢を現在の3次元オブジェクト姿勢とする。
According to the second aspect of the present invention, a three-dimensional object posture operation method comprises:
setting, in addition to the virtual work area, an internal virtual work area contained in the virtual work area and fixed in the space;
when the operation mode transitions from an idle mode, in which the three-dimensional position is outside the virtual work area, to a posture operation mode, entered when the three-dimensional position comes inside the internal virtual work area or touches its boundary surface, storing the position of the intersection of the three-dimensional position with the boundary surface of the internal virtual work area, the posture of the three-dimensional object, and the straight line that passes through the reference point and is perpendicular to the plane containing the three-dimensional position, the intersection, and the reference point; and
only while the operation mode is the posture operation mode, applying to the stored posture of the three-dimensional object the rotation about that straight line which superimposes the unit vector from the reference point toward the intersection onto the unit vector from the reference point toward the three-dimensional position, and taking the resulting posture as the current posture of the three-dimensional object.

本発明の第3の態様によれば、3次元オブジェクト姿勢操作方法は、
3次元オブジェクトの個数と同数で、前記3次元オブジェクトに一対一に対応する、前記空間に固定された仮想作業領域を設定し、
前記3次元オブジェクトの個数と同数で、前記3次元オブジェクトに一対一に対応する、前記空間に固定された基準点を設け、
それぞれの仮想作業領域に含まれかつ前記空間に固定された、前記3次元オブジェクトの個数と同数の、互いに交わりのない内部仮想作業領域を設定し、
動作モードが、3次元位置が全ての仮想作業領域の外部にあるアイドルモードから、3次元位置がひとつの内部仮想作業領域の内部に入って、もしくはその境界面に接して、姿勢操作モードに遷移したときに、3次元位置と内部仮想作業領域の境界面との交点の位置と、内部仮想作業領域に対応する3次元オブジェクトの姿勢と、3次元位置と前記交点と前記内部仮想作業領域に対応する基準点とを含む平面に垂直でかつ前記基準点を通る直線とを記憶し、
動作モードが姿勢操作モードである場合に限り、前記直線を軸として、前記基準点から前記交点に向かう単位ベクトルを前記基準点から3次元位置に向かう単位ベクトルに重ね合わせる回転変換を、3次元オブジェクトの姿勢に施し、その結果の姿勢を現在の3次元オブジェクト姿勢とする。
According to the third aspect of the present invention, a three-dimensional object posture operation method comprises:
setting as many virtual work areas, fixed in the space, as there are three-dimensional objects, in one-to-one correspondence with the three-dimensional objects;
providing as many reference points, fixed in the space, as there are three-dimensional objects, in one-to-one correspondence with the three-dimensional objects;
setting, one inside each virtual work area and fixed in the space, the same number of mutually disjoint internal virtual work areas;
when the operation mode transitions from an idle mode, in which the three-dimensional position is outside all virtual work areas, to a posture operation mode, entered when the three-dimensional position comes inside one internal virtual work area or touches its boundary surface, storing the position of the intersection of the three-dimensional position with the boundary surface of that internal virtual work area, the posture of the three-dimensional object corresponding to that internal virtual work area, and the straight line that passes through the reference point corresponding to that internal virtual work area and is perpendicular to the plane containing the three-dimensional position, the intersection, and that reference point; and
only while the operation mode is the posture operation mode, applying to the stored posture of the three-dimensional object the rotation about that straight line which superimposes the unit vector from the reference point toward the intersection onto the unit vector from the reference point toward the three-dimensional position, and taking the resulting posture as the current posture of the three-dimensional object.

本発明は、指先などの位置を動かすことにより、仮想対象物の把持およびリリースを容易に指定したり、仮想対象物の姿勢を容易に操作することができる。   According to the present invention, by moving the position of a fingertip or the like, it is possible to easily specify gripping and releasing of a virtual object or to easily operate the posture of the virtual object.

次に、本発明の実施の形態について図面を参照して説明する。
[第1の実施形態]
Next, embodiments of the present invention will be described with reference to the drawings.
[First Embodiment]

本実施形態では、指先に取り付けたマーカーを撮像したカメラ画像を解析することで、空間内での指先の3次元的動きを捉え、計算機内の3次元オブジェクトの姿勢を操作する。図1から図5に、空間内での指先の3次元位置を検出する方法を示す。   In the present embodiment, by analyzing a camera image obtained by imaging a marker attached to the fingertip, the three-dimensional movement of the fingertip in the space is captured, and the posture of the three-dimensional object in the computer is manipulated. 1 to 5 show a method for detecting the three-dimensional position of the fingertip in the space.

図1において、カメラ104およびカメラ105は空間座標系190に固定されており、それぞれのカメラ104、105の中心軸は、空間座標系190のZ軸上の基準点102で交差している。表示用モニター103は、カメラ104とカメラ105の間に置かれている。指先に取り付けられたマーカー101は蛍光赤色で塗装された球であるが、波長900ナノメートル程度の発光ダイオードをマーカーとして用いることも可能である。   In FIG. 1, the camera 104 and the camera 105 are fixed to a spatial coordinate system 190, and the central axes of the cameras 104 and 105 intersect at a reference point 102 on the Z axis of the spatial coordinate system 190. The display monitor 103 is placed between the camera 104 and the camera 105. The marker 101 attached to the fingertip is a sphere painted in fluorescent red, but a light emitting diode having a wavelength of about 900 nanometers can also be used as a marker.

図2において、画像200,201は指先マーカー101をそれぞれ撮像したカメラ104,105の赤チャネル画像全体をガウシアンフィルターにより処理した画像である。指先マーカー像202,203が輝度の高い領域として表示されている。   In FIG. 2, images 200 and 201 are images obtained by processing the entire red channel images of the cameras 104 and 105 capturing the fingertip marker 101 with a Gaussian filter. The fingertip marker images 202 and 203 are displayed as high brightness areas.
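The brightest-pixel search on the Gaussian-filtered red channel can be sketched as follows. A NumPy-only separable Gaussian is used here so the sketch is self-contained; function names and the sigma value are illustrative, not from the patent:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1-D normalized Gaussian kernel."""
    radius = radius if radius is not None else int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def marker_pixel(red_channel, sigma=3.0):
    """Smooth the red channel with a separable Gaussian, then return the
    (row, col) of the brightest pixel, i.e. the marker image position."""
    k = gaussian_kernel(sigma)
    img = red_channel.astype(float)
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    img = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, img)
    return np.unravel_index(np.argmax(img), img.shape)
```

Smoothing before the argmax suppresses isolated noisy pixels, so the peak tracks the marker blob rather than sensor noise.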

図3(a)において、カメラ座標系110は空間座標系190に固定されており、そのZ軸はカメラ104の中心軸に一致している。直線113は、画像面111上の最も高輝度な画素112の位置とカメラ座標系110の原点とを結ぶ直線である。図3(b)において、カメラ座標系120は空間座標系190に固定されており、そのZ軸はカメラ105の中心軸に一致している。直線123は、画像面121上の最も高輝度な画素122の位置とカメラ座標系120の原点とを結ぶ直線である。   In FIG. 3A, the camera coordinate system 110 is fixed to the spatial coordinate system 190, and its Z axis coincides with the central axis of the camera 104. A straight line 113 is a straight line that connects the position of the pixel 112 with the highest luminance on the image plane 111 and the origin of the camera coordinate system 110. In FIG. 3B, the camera coordinate system 120 is fixed to the spatial coordinate system 190, and its Z axis coincides with the central axis of the camera 105. A straight line 123 is a straight line that connects the position of the pixel 122 with the highest luminance on the image plane 121 and the origin of the camera coordinate system 120.

図4に、空間座標系190における直線113および直線123を示す。カメラ座標系110およびカメラ座標系120が空間座標系190に固定されているので、それぞれのカメラ座標系110,120での位置姿勢が特定された直線113および直線123は、空間座標系190での位置も特定される。   FIG. 4 shows the straight lines 113 and 123 in the spatial coordinate system 190. Since the camera coordinate systems 110 and 120 are fixed to the spatial coordinate system 190, the lines 113 and 123, whose positions and orientations are determined in the respective camera coordinate systems, are also determined in the spatial coordinate system 190.

図5に、実空間内での指先の3次元位置を検出する方法を示す。空間座標系190での位置が特定された直線113および直線123とを最短で結ぶ直線と、直線113および直線123との交点をそれぞれP1およびP2とする。P1とP2の中点PMを、実空間内での指先の3次元位置100とする。 FIG. 5 shows a method of detecting the three-dimensional position of the fingertip in real space. Let P1 and P2 be the points at which the shortest line segment connecting lines 113 and 123 (whose positions in the spatial coordinate system 190 are known) meets line 113 and line 123, respectively. The midpoint PM of P1 and P2 is taken as the three-dimensional position 100 of the fingertip in real space.
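This midpoint construction can be sketched directly: find the feet of the common perpendicular of the two viewing lines, then average them. A minimal NumPy sketch, with illustrative names; each line is given by a point and a direction vector:

```python
import numpy as np

def midpoint_of_common_perpendicular(p1, d1, p2, d2):
    """Closest points P1 on line p1 + t*d1 and P2 on line p2 + s*d2,
    and their midpoint PM (used as the fingertip position 100)."""
    p1, d1, p2, d2 = (np.asarray(v, float) for v in (p1, d1, p2, d2))
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b        # near zero only for (near-)parallel lines
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    P1 = p1 + t * d1
    P2 = p2 + s * d2
    return P1, P2, (P1 + P2) / 2
```

Because the two camera rays rarely intersect exactly in the presence of noise, taking the midpoint of the shortest connecting segment gives a robust estimate of the triangulated point.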

図6から図11に、実空間内での指先の3次元位置により計算機内の3次元オブジェクトを操作する方法を示す。   FIGS. 6 to 11 show a method of operating a three-dimensional object in the computer by the three-dimensional position of the fingertip in the real space.

図6に、基準点102を含むように空間座標系190に固定された仮想作業領域130を示す。   FIG. 6 shows a virtual work area 130 fixed to the spatial coordinate system 190 so as to include the reference point 102.

図7に、動作モードが操作開始の前であるアイドルモードの状態を示す。図7(a)はモニター画面で、操作対象の3次元オブジェクト210、指先位置を示すカーソル211、および仮想作業領域130の境界131の像212が表示されている。図7(b)に、指先の3次元位置100が仮想作業領域130の外部にある状態を示す。   FIG. 7 shows a state of the idle mode in which the operation mode is before the start of operation. FIG. 7A is a monitor screen on which a three-dimensional object 210 to be operated, a cursor 211 indicating the fingertip position, and an image 212 of the boundary 131 of the virtual work area 130 are displayed. FIG. 7B shows a state where the three-dimensional position 100 of the fingertip is outside the virtual work area 130.

図8に、仮想作業領域130の外部にある指先の3次元位置100が、仮想作業領域130に侵入することで、動作モードが姿勢操作モードに遷移した状態を示す。その際に、図10に示す、基準点102から、指先の3次元位置100と仮想作業領域130の境界面131との接点140に向かう単位ベクトル141、および3次元オブジェクト210の姿勢を記憶しておく。このとき、動作モードの遷移をオペレータに知らせる方法として、3次元オブジェクト210の表示色を変化させる、仮想作業領域130の境界131の像212やカーソル211の表示色を変化させる、もしくはチャイムを鳴らすなどの方法がある。指先の3次元位置100が仮想作業領域130の内部もしくは境界131の上にある間は、姿勢操作モードが継続される。   FIG. 8 shows the state in which the operation mode has transitioned to the posture operation mode because the three-dimensional position 100 of the fingertip, previously outside the virtual work area 130, has entered it. At that moment, the unit vector 141 (shown in FIG. 10) from the reference point 102 toward the contact point 140 between the three-dimensional position 100 of the fingertip and the boundary surface 131 of the virtual work area 130, and the posture of the three-dimensional object 210, are stored. The operator can be notified of the mode transition by, for example, changing the display color of the three-dimensional object 210, changing the display color of the image 212 of the boundary 131 of the virtual work area 130 or of the cursor 211, or sounding a chime. The posture operation mode continues while the three-dimensional position 100 of the fingertip remains inside the virtual work area 130 or on its boundary 131.

図9に、姿勢操作中の状態を示す。3次元オブジェクト210の現在の姿勢については、動作モードが姿勢操作モードに遷移した際に記憶しておいた3次元オブジェクト210の姿勢を、図10(b)に示す軸143の周りに144で示す角度だけ回転させた姿勢を以って、その現在姿勢とする。ここで、軸143は、図10(a)に示す単位ベクトル141、および基準点102から指先の3次元位置100に向かう単位ベクトル142の両方を含む平面に垂直で、かつ基準点102を通る直線である。また、144で示す角度は、単位ベクトル142が単位ベクトル141に対してなす角度である。   FIG. 9 shows the state during posture operation. The current posture of the three-dimensional object 210 is obtained by rotating the posture stored at the transition to the posture operation mode about the axis 143 by the angle indicated by 144 in FIG. 10(b). Here, the axis 143 is the straight line that passes through the reference point 102 and is perpendicular to the plane containing both the unit vector 141 shown in FIG. 10(a) and the unit vector 142 from the reference point 102 toward the three-dimensional position 100 of the fingertip. The angle 144 is the angle that the unit vector 142 makes with the unit vector 141.

図11に、指先の3次元位置100が仮想作業領域130から出ることにより、動作モードが姿勢操作モードからアイドルモードに遷移し、姿勢操作が完了した状態を示す。この状態は、図7に示す状態と同じである。
[第2の実施形態]
FIG. 11 shows a state where the posture operation is completed when the three-dimensional position 100 of the fingertip comes out of the virtual work area 130 and the operation mode changes from the posture operation mode to the idle mode. This state is the same as the state shown in FIG.
[Second Embodiment]

第1の実施形態における、図8で説明したアイドルモードから姿勢操作モードへの動作モードの動作遷移の際や、図11で説明した姿勢操作モードからアイドルモードへの動作モードの遷移の際には、検出された指先の3次元位置100にノイズなどが畳重することで、チャタリングが生じることがある。   In the first embodiment, at the time of the operation mode transition from the idle mode to the posture operation mode described in FIG. 8 or at the time of the operation mode transition from the posture operation mode to the idle mode described in FIG. Chattering may occur when noise or the like overlaps with the detected three-dimensional position 100 of the fingertip.

第2の実施形態として、図12から図16に、チャタリングを防止しつつ、実空間内での指先の3次元位置により計算機内の3次元オブジェクトを操作する方法を示す。   As a second embodiment, FIGS. 12 to 16 show a method of operating a three-dimensional object in a computer based on the three-dimensional position of a fingertip in real space while preventing chattering.

図12に動作モードがアイドルモードの状態を示す。図12(b)に示すように、図7に示す仮想作業領域130に加え、これに含まれる内部仮想作業領域150を用いる。両仮想作業領域130,150の境界面131と境界面151との距離の最小値は、指先の3次元位置100に畳重しているノイズのレベルと比較し十分に大きいとする。図12(a)に示す表示用モニター103には、仮想作業領域130の境界面131の像215は表示されないか、もしくは淡色で表示される一方、内部仮想作業領域150の境界面151の像213は濃色で表示され、傾注すべき境界面をオペレータに知らせている。   FIG. 12 shows the idle mode. As shown in FIG. 12(b), in addition to the virtual work area 130 of FIG. 7, an internal virtual work area 150 contained in it is used. The minimum distance between the boundary surfaces 131 and 151 of the two work areas is assumed to be sufficiently large compared with the level of the noise superimposed on the three-dimensional position 100 of the fingertip. On the display monitor 103 shown in FIG. 12(a), the image 215 of the boundary surface 131 of the virtual work area 130 is either not displayed or displayed in a light color, while the image 213 of the boundary surface 151 of the internal virtual work area 150 is displayed in a dark color, telling the operator which boundary surface to attend to.

図13に動作モードのアイドルモードから姿勢操作モードへの遷移の様子を示す。アイドルモード時に限り、内部仮想作業領域150の外部にある指先の3次元位置100が、内部仮想作業領域150に侵入する、もしくはその境界面151に接触すると、動作モードは姿勢操作モードに遷移する。その際に、図15に示す、基準点102から、指先の3次元位置100と仮想作業領域130の境界面131との接点140に向かう単位ベクトル141、および3次元オブジェクト210の姿勢を記憶しておく。このとき、カーソル211の表示色を変化させる、チャイムを鳴らす、もしくは、内部仮想作業領域150の境界面151の像213を消去、もしくは淡色表示に変更する一方で、仮想作業領域130の境界面131の像215を濃色表示に変更する、などすることで、動作モードの遷移をオペレータに知らせるとともに、傾注すべき境界面も知らせる。指先の3次元位置100が仮想作業領域130の内部にある間は、姿勢操作モードが継続される。   FIG. 13 shows the transition from the idle mode to the posture operation mode. Only while in the idle mode, when the three-dimensional position 100 of the fingertip, outside the internal virtual work area 150, enters it or touches its boundary surface 151, the operation mode transitions to the posture operation mode. At that moment, the unit vector 141 (shown in FIG. 15) from the reference point 102 toward the contact point 140 between the three-dimensional position 100 of the fingertip and the boundary surface 131 of the virtual work area 130, and the posture of the three-dimensional object 210, are stored. The operator is notified of the mode transition, and of the boundary surface to attend to, by, for example, changing the display color of the cursor 211, sounding a chime, or erasing the image 213 of the boundary surface 151 of the internal virtual work area 150 (or changing it to a light color) while changing the image 215 of the boundary surface 131 of the virtual work area 130 to a dark color. The posture operation mode continues while the three-dimensional position 100 of the fingertip is inside the virtual work area 130.

図14に姿勢操作中の状態を示す。3次元オブジェクト210の現在の姿勢については、動作モードが姿勢操作モードに遷移した際に記憶しておいた3次元オブジェクト210の姿勢を、図15(b)に示す軸143の周りに144で示す角度だけ回転させた姿勢を以って、その現在姿勢とする。ここで、軸143は、図15(a)に示す単位ベクトル141、および基準点102から指先の3次元位置100に向かう単位ベクトル142の両方を含む平面に垂直で、かつ基準点102を通る直線である。また、144で示す角度は、単位ベクトル142が単位ベクトル141に対してなす角度である。   FIG. 14 shows the state during posture operation. The current posture of the three-dimensional object 210 is obtained by rotating the posture stored at the transition to the posture operation mode about the axis 143 by the angle indicated by 144 in FIG. 15(b). Here, the axis 143 is the straight line that passes through the reference point 102 and is perpendicular to the plane containing both the unit vector 141 shown in FIG. 15(a) and the unit vector 142 from the reference point 102 toward the three-dimensional position 100 of the fingertip. The angle 144 is the angle that the unit vector 142 makes with the unit vector 141.

図16に、姿勢操作モードからアイドルモードへの動作モードの遷移の様子を示す。姿勢操作モード時に限り、指先の3次元位置100が仮想作業領域130の外部に出るか、もしくはその境界面131に接触するとアイドルモードに遷移する。
[第3の実施形態]
FIG. 16 shows a transition of the operation mode from the posture operation mode to the idle mode. Only in the posture operation mode, when the three-dimensional position 100 of the fingertip comes out of the virtual work area 130 or comes into contact with the boundary surface 131, the mode changes to the idle mode.
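The second embodiment's entry-at-the-inner-surface, exit-at-the-outer-surface behaviour is a hysteresis band. A minimal sketch, assuming purely for illustration that both work areas are concentric spheres around the reference point (the patent allows arbitrarily shaped nested regions; the class and names are hypothetical):

```python
import numpy as np

class ModeSwitch:
    """Hysteresis between an inner entry surface (boundary 151) and an outer
    exit surface (boundary 131), modelled here as concentric spheres."""

    def __init__(self, center, r_inner, r_outer):
        assert r_inner < r_outer        # the gap must exceed the noise level
        self.center = np.asarray(center, float)
        self.r_inner = r_inner
        self.r_outer = r_outer
        self.mode = "idle"

    def update(self, fingertip):
        """Advance the state machine with a new fingertip position."""
        dist = np.linalg.norm(np.asarray(fingertip, float) - self.center)
        if self.mode == "idle" and dist <= self.r_inner:
            self.mode = "posture"       # entered via the inner surface
        elif self.mode == "posture" and dist >= self.r_outer:
            self.mode = "idle"          # left via the outer surface
        return self.mode
```

A fingertip jittering anywhere between the two surfaces cannot toggle the mode, which is exactly the chattering prevention described above.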
[Third Embodiment]

図17に示すように、第2の実施形態における内部仮想作業領域150の代わりに、3次元オブジェクト210と同一形状の領域である内部仮想作業領域160を用いる。
[第4の実施形態]
As shown in FIG. 17, instead of the internal virtual work area 150 in the second embodiment, an internal virtual work area 160 that is an area having the same shape as the three-dimensional object 210 is used.
[Fourth Embodiment]

第4の実施形態では、第1の実施形態の図7に示す仮想作業領域と同様の仮想作業領域が2つあり、かつこれらの仮想作業領域が交わっている場合について説明する。   In the fourth embodiment, a case will be described in which there are two virtual work areas similar to the virtual work area shown in FIG. 7 of the first embodiment, and these virtual work areas intersect.

図18に示すように、2つの内部仮想作業領域の交わり領域を、それぞれの内部仮想作業領域から取り除いた領域を、それぞれ内部仮想作業領域150と内部仮想作業領域550とすることで、第1の実施形態と同様の方法により、3次元オブジェクトの姿勢を操作する。
[第5の実施形態]
As shown in FIG. 18, the regions obtained by removing the intersection of the two internal virtual work areas from each of them are used as the internal virtual work areas 150 and 550, and the posture of each three-dimensional object is manipulated by the same method as in the first embodiment.
[Fifth Embodiment]

第1の実施形態では、指先の3次元位置を検出する手段として、マーカーを用いる方式の検出エンジンを利用していたが、このエンジンの代わりに、何もつけない指先の位置検出を行うエンジンを利用することもできる。このような指先の3次元位置検出エンジンとしては、「Robust Finger Tracking with Multiple Cameras」 Cullen Jennings、1999、などがある。   In the first embodiment, a marker-based detection engine was used to detect the three-dimensional position of the fingertip. Instead of this engine, an engine that detects the position of a bare fingertip, with nothing attached to it, can also be used. One example of such a fingertip three-dimensional position detection engine is "Robust Finger Tracking with Multiple Cameras", Cullen Jennings, 1999.

図19は以上説明した第1から第5の実施形態の3次元オブジェクト姿勢操作方法を実施する装置のブロック図である。   FIG. 19 is a block diagram of an apparatus that implements the three-dimensional object posture operation methods of the first to fifth embodiments described above.

指先もしくはマーカー位置検出部301は指先または指先に取り付けられたマーカー101の3次元位置100を検出する。領域判定部302は、指先またはマーカー101の3次元位置100が仮想作業領域130(第2の実施形態ではさらに内部仮想作業領域150)の内部、仮想作業領域130の外部のいずれにあるかを判定する。動作モード判定部304は領域判定部302の判定結果から動作モード(アイドルモード、姿勢操作モード)を判定し、判定した動作モードを動作モード記憶部303に記憶するとともに、動作モードがアイドルモードから姿勢操作モードに遷移したときに、単位ベクトル141と3次元オブジェクト210の姿勢を位置情報記憶部305に記憶する。姿勢算出部306は動作モードが姿勢操作モードのとき位置情報記憶部305に記憶した情報を用いて3次元オブジェクト210の姿勢を軸143の周りに角度144だけ回転させる計算を行なう。   The fingertip or marker position detection unit 301 detects the three-dimensional position 100 of the fingertip or of the marker 101 attached to it. The area determination unit 302 determines whether the three-dimensional position 100 is inside the virtual work area 130 (and, in the second embodiment, the internal virtual work area 150) or outside the virtual work area 130. The operation mode determination unit 304 determines the operation mode (idle mode or posture operation mode) from the determination result of the area determination unit 302, stores the determined mode in the operation mode storage unit 303, and, when the operation mode transitions from the idle mode to the posture operation mode, stores the unit vector 141 and the posture of the three-dimensional object 210 in the position information storage unit 305. When the operation mode is the posture operation mode, the posture calculation unit 306 uses the information stored in the position information storage unit 305 to compute the posture of the three-dimensional object 210 rotated about the axis 143 by the angle 144.
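A minimal, hypothetical wiring of this block diagram in Python/NumPy, assuming a spherical virtual work area and a rotation-matrix posture; the class and method names are illustrative, not from the patent:

```python
import numpy as np

def _align(u1, u2):
    """Rotation matrix mapping unit vector u1 onto unit vector u2 (Rodrigues)."""
    axis = np.cross(u1, u2)
    s, c = np.linalg.norm(axis), float(np.dot(u1, u2))
    if s < 1e-12:
        return np.eye(3)
    k = axis / s
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + s * K + (1 - c) * (K @ K)

class PostureController:
    """Wiring of units 301-306 for a spherical work area around the reference point."""

    def __init__(self, reference, radius):
        self.ref = np.asarray(reference, float)
        self.radius = radius
        self.mode = "idle"        # operation mode storage unit 303
        self.posture = np.eye(3)  # current posture of the 3-D object 210
        self.u_entry = None       # unit vector 141 (position information storage 305)
        self.stored = None        # posture stored on entering posture mode

    def step(self, fingertip):
        """One cycle: region test (302), mode decision (304), posture update (306)."""
        v = np.asarray(fingertip, float) - self.ref
        inside = np.linalg.norm(v) <= self.radius
        if self.mode == "idle" and inside:
            self.mode = "posture"
            self.u_entry = v / np.linalg.norm(v)
            self.stored = self.posture.copy()
        elif self.mode == "posture" and not inside:
            self.mode = "idle"
        if self.mode == "posture":
            u_now = v / np.linalg.norm(v)
            self.posture = _align(self.u_entry, u_now) @ self.stored
        return self.mode, self.posture
```

Each posture update is computed from the posture stored at mode entry, matching the description above, so jitter in the fingertip position does not accumulate into the object's orientation.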

なお、以上説明した本発明の3次元オブジェクト姿勢操作方法は、その機能を実現するためのプログラムを、コンピュータ読み取り可能な記録媒体に記録して、この記録媒体に記録されたプログラムをコンピュータシステムに読み込ませ、実行することができる。コンピュータ読み取り可能な記録媒体とは、フロッピーディスク、光磁気ディスク、CD−ROM等の記録媒体、コンピュータシステムに内蔵されるハードディスク装置等の記憶装置を指す。さらに、コンピュータ読み取り可能な記録媒体は、インターネットを介してプログラムを送信する場合のように、短時間の間、動的にプログラムを保持するもの(伝送媒体もしくは伝送波)、その場合のサーバとなるコンピュータシステム内部の揮発性メモリのように、一定時間プログラムを保持しているものも含む。   The three-dimensional object posture operation method of the present invention described above can be implemented by recording a program that realizes its functions on a computer-readable recording medium, loading the program recorded on the medium into a computer system, and executing it. A computer-readable recording medium means a recording medium such as a floppy disk, a magneto-optical disk, or a CD-ROM, or a storage device such as a hard disk built into the computer system. It further includes media that hold the program dynamically for a short time (a transmission medium or transmission wave), as when the program is sent over the Internet, and media that hold the program for a certain period, such as the volatile memory inside the computer system serving as the server in that case.

指先マーカー101を2台のカメラ104,105で撮影する様子を示す図である。 FIG. 1 shows the fingertip marker 101 being imaged by the two cameras 104 and 105.
指先マーカー101を撮影したカメラ104,105の赤チャネル画像全体をガウシアンフィルターにより処理した画像を示す図である。 FIG. 2 shows the images obtained by processing the entire red-channel images of the cameras 104 and 105, which captured the fingertip marker 101, with a Gaussian filter.
画像面111,121上の最も高輝度な画像112,122とカメラ座標系120の原点とを結ぶ直線113,123を示す図である。 FIG. 3 shows the straight lines 113 and 123 connecting the brightest pixels 112 and 122 on the image planes 111 and 121 with the origins of the respective camera coordinate systems.
空間座標系190における直線113,123を示す図である。 FIG. 4 shows the straight lines 113 and 123 in the spatial coordinate system 190.
実空間内での指先の3次元位置を検出する方法を示す図である。 FIG. 5 shows a method of detecting the three-dimensional position of the fingertip in real space.
基準点102を含む仮想作業領域130を示す図である。 FIG. 6 shows the virtual work area 130 containing the reference point 102.
動作モードがアイドルモードの状態を示す図である。 FIG. 7 shows the state in which the operation mode is the idle mode.
動作モードが姿勢操作モードに遷移した状態を示す図である。 FIG. 8 shows the state in which the operation mode has transitioned to the posture operation mode.
姿勢操作中の状態を示す図である。 FIG. 9 shows the state during posture operation.
3次元オブジェクト210の姿勢の変化を示す図である。 FIG. 10 shows the change in posture of the three-dimensional object 210.
動作モードが姿勢操作モードからアイドルモードに遷移し、姿勢操作が完了した状態を示す図である。 FIG. 11 shows the state in which the operation mode has transitioned from the posture operation mode to the idle mode and the posture operation is complete.
第2の実施形態におけるアイドルモードを示す図である。 FIG. 12 shows the idle mode in the second embodiment.
第2の実施形態におけるアイドルモードから姿勢操作モードへの動作モードの遷移の様子を示す図である。 FIG. 13 shows the transition from the idle mode to the posture operation mode in the second embodiment.
第2の実施形態における姿勢操作中の状態を示す図である。 FIG. 14 shows the state during posture operation in the second embodiment.
第2の実施形態における3次元オブジェクト210の姿勢の変化を示す図である。 FIG. 15 shows the change in posture of the three-dimensional object 210 in the second embodiment.
第2の実施形態における姿勢操作モードからアイドルモードへの動作モードの遷移の様子を示す図である。 FIG. 16 shows the transition from the posture operation mode to the idle mode in the second embodiment.
第3の実施形態で用いる内部仮想作業領域を示す図である。 FIG. 17 shows the internal virtual work area used in the third embodiment.
第4の実施形態で用いる仮想作業領域を示す図である。 FIG. 18 shows the virtual work areas used in the fourth embodiment.
第1から第5の実施形態の3次元オブジェクト姿勢操作方法を実施する装置のブロック図である。 FIG. 19 is a block diagram of an apparatus implementing the three-dimensional object posture operation methods of the first to fifth embodiments.

符号の説明 Explanation of Symbols

100 実空間内での推定されたマーカー位置
101 実空間内でのマーカー位置
102 実空間座標系のZ軸上の姿勢操作基準点
103 表示用モニター
104 カメラ
105 カメラ
190 実空間座標系
200 カメラ画像
201 カメラ画像
202 カメラ画像上でのマーカー位置
203 カメラ画像上でのマーカー位置
110 カメラ座標系
111 カメラ撮像面
112 カメラ撮像面上でのマーカー位置
113 カメラ座標系の原点とカメラ撮像面上でのマーカー位置とを結ぶ直線
120 カメラ座標系
121 カメラ撮像面
122 カメラ撮像面上でのマーカー位置
123 カメラ座標系の原点とカメラ撮像面上でのマーカー位置とを結ぶ直線
130 仮想作業領域
131 仮想作業領域の境界面
132 仮想作業領域
151 内部仮想作業領域の境界面
210 モニターに表示された3次元オブジェクト
211 マーカー位置のカーソル
212 モニターに表示された仮想作業領域
140 推定されたマーカー位置と仮想作業領域の境界面との接点
141 姿勢操作基準点から接点に向かう単位ベクトル
142 姿勢操作基準点から推定されたマーカー位置に向かう単位ベクトル
143 2つの単位ベクトルを含む平面に垂直で姿勢操作基準点を通る直線
144 2つの単位ベクトルがなす角
108 3次元オブジェクト
109 3次元オブジェクト
301 指先もしくはマーカー位置検出部
302 領域判定部
303 動作モード記憶部
304 動作モード判定部
305 位置情報記憶部
306 姿勢算出部
100 Estimated marker position in real space
101 Marker position in real space
102 Posture operation reference point on the Z axis of the real-space coordinate system
103 Display monitor
104 Camera
105 Camera
190 Real-space coordinate system
200 Camera image
201 Camera image
202 Marker position on the camera image
203 Marker position on the camera image
110 Camera coordinate system
111 Camera imaging plane
112 Marker position on the camera imaging plane
113 Straight line connecting the origin of the camera coordinate system and the marker position on the camera imaging plane
120 Camera coordinate system
121 Camera imaging plane
122 Marker position on the camera imaging plane
123 Straight line connecting the origin of the camera coordinate system and the marker position on the camera imaging plane
130 Virtual work area
131 Boundary surface of the virtual work area
132 Virtual work area
151 Boundary surface of the internal virtual work area
210 Three-dimensional object displayed on the monitor
211 Cursor at the marker position
212 Virtual work area displayed on the monitor
140 Contact point between the estimated marker position and the boundary surface of the virtual work area
141 Unit vector from the posture operation reference point toward the contact point
142 Unit vector from the posture operation reference point toward the estimated marker position
143 Straight line perpendicular to the plane containing the two unit vectors and passing through the posture operation reference point
144 Angle formed by the two unit vectors
108 Three-dimensional object
109 Three-dimensional object
301 Fingertip or marker position detection unit
302 Area determination unit
303 Operation mode storage unit
304 Operation mode determination unit
305 Position information storage unit
306 Posture calculation unit

Claims (10)

1. A method of manipulating the posture of a three-dimensional object based on the three-dimensional position of a predetermined object in space, the method comprising:
setting a virtual work area that is fixed in the spatial coordinate system and contains a fixed reference point;
when the operation mode transitions from an idle mode, in which the three-dimensional position is outside the virtual work area, to a posture operation mode, in which the three-dimensional position is on or inside the boundary surface of the virtual work area, storing the position of the intersection of the three-dimensional position with the boundary surface of the virtual work area, the posture of the three-dimensional object, and the straight line that passes through the reference point and is perpendicular to the plane containing the three-dimensional position, the intersection, and the reference point; and
only while the operation mode is the posture operation mode, applying to the posture of the three-dimensional object the rotation, about that straight line as axis, that superimposes the unit vector from the reference point toward the intersection onto the unit vector from the reference point toward the three-dimensional position, and taking the resulting posture as the current posture of the three-dimensional object.
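The rotation in claim 1 is fully determined by three points: the reference point, the stored boundary intersection, and the current three-dimensional position. The axis is the stored perpendicular line (numeral 143) and the angle is the angle between the two unit vectors (numerals 141, 142, 144). A minimal sketch in Python, not taken from the patent; the function names are illustrative, and Rodrigues' formula is used to build the rotation matrix:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def posture_rotation(reference, contact, position):
    """Rotation about the line through `reference` perpendicular to the plane
    of the two rays, mapping the unit vector reference->contact onto the
    unit vector reference->position."""
    u1 = normalize(tuple(c - r for c, r in zip(contact, reference)))   # numeral 141
    u2 = normalize(tuple(p - r for p, r in zip(position, reference)))  # numeral 142
    axis = cross(u1, u2)             # direction of the axis line (numeral 143)
    s = math.sqrt(dot(axis, axis))   # sine of the angle (numeral 144)
    if s < 1e-12:
        # Parallel rays: no rotation (the antiparallel case is degenerate,
        # since any axis in the perpendicular plane would work).
        return [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    angle = math.atan2(s, dot(u1, u2))
    x, y, z = (a / s for a in axis)
    c, sn = math.cos(angle), math.sin(angle)
    C = 1.0 - c
    # Rodrigues' rotation formula: R = I + sin(t)*K + (1 - cos(t))*K^2.
    return [[c + x * x * C,      x * y * C - z * sn, x * z * C + y * sn],
            [y * x * C + z * sn, c + y * y * C,      y * z * C - x * sn],
            [z * x * C - y * sn, z * y * C + x * sn, c + z * z * C]]
```

Composing this matrix with the posture stored at the mode transition, rather than with the current posture, keeps the manipulation drift-free as the marker moves.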
2. A method of manipulating the posture of a three-dimensional object based on the three-dimensional position of a predetermined object in space, the method comprising:
setting a virtual work area that is fixed in the spatial coordinate system and contains a fixed reference point, and an internal virtual work area that is contained in the virtual work area and fixed in space;
when the operation mode transitions from an idle mode, in which the three-dimensional position is outside the virtual work area, to a posture operation mode, entered when the three-dimensional position moves into the internal virtual work area or touches its boundary surface, storing the intersection of the three-dimensional position with the boundary surface of the internal virtual work area, the posture of the three-dimensional object, and the straight line that passes through the reference point and is perpendicular to the plane containing the three-dimensional position, the intersection, and the reference point; and
only while the operation mode is the posture operation mode, applying to the posture of the three-dimensional object the rotation, about that straight line as axis, that superimposes the unit vector from the reference point toward the intersection onto the unit vector from the reference point toward the three-dimensional position, and taking the resulting posture as the current posture of the three-dimensional object.
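The idle/posture-operation mode transitions of claims 1 and 2 amount to a small state machine driven by the detected position. The following sketch is illustrative only: it assumes a spherical work area centred on the reference point (claim 6 equally allows cubic and rectangular-parallelepiped areas), and approximates the boundary contact point by radial projection:

```python
import math

class PostureController:
    """Minimal mode machine: idle outside the work area, posture
    operation mode on or inside its boundary surface."""

    def __init__(self, reference, radius):
        self.reference = reference  # fixed reference point in space
        self.radius = radius        # radius of the assumed spherical area
        self.mode = "idle"
        self.contact = None         # stored intersection with the boundary

    def update(self, position):
        d = math.dist(position, self.reference)
        inside = d <= self.radius
        if self.mode == "idle" and inside:
            # Entering the area: record where the boundary was crossed,
            # approximated here by projecting the position onto the sphere.
            scale = self.radius / d
            self.contact = tuple(r + (p - r) * scale
                                 for p, r in zip(position, self.reference))
            self.mode = "posture"
        elif self.mode == "posture" and not inside:
            # Leaving the area: drop back to idle and forget the contact.
            self.mode = "idle"
            self.contact = None
        return self.mode
```

While `mode == "posture"`, each new position would be fed, together with the stored contact point and reference point, into the rotation computation of the claims.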
3. The posture operation method according to claim 2, wherein a region formed by the shape of the three-dimensional object is used as the internal virtual work area.

4. A method of selecting one of two or more three-dimensional objects based on the three-dimensional position of a predetermined object in space and manipulating the posture of the selected three-dimensional object, the method comprising:
setting as many virtual work areas, fixed in space, as there are three-dimensional objects, in one-to-one correspondence with the three-dimensional objects;
providing as many reference points, fixed in space, as there are three-dimensional objects, in one-to-one correspondence with the three-dimensional objects;
setting as many mutually disjoint internal virtual work areas, each contained in its corresponding virtual work area and fixed in space, as there are three-dimensional objects;
when the operation mode transitions from an idle mode, in which the three-dimensional position is outside all of the virtual work areas, to a posture operation mode, entered when the three-dimensional position moves into one internal virtual work area or touches its boundary surface, storing the position of the intersection of the three-dimensional position with the boundary surface of that internal virtual work area, the posture of the three-dimensional object corresponding to that internal virtual work area, and the straight line that passes through the reference point corresponding to that internal virtual work area and is perpendicular to the plane containing the three-dimensional position, the intersection, and that reference point; and
only while the operation mode is the posture operation mode, applying to the posture of the three-dimensional object the rotation, about that straight line as axis, that superimposes the unit vector from the reference point toward the intersection onto the unit vector from the reference point toward the three-dimensional position, and taking the resulting posture as the current posture of the three-dimensional object.
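Because the internal virtual work areas of claim 4 are mutually disjoint, object selection reduces to testing which area, if any, contains the measured position. A hypothetical sketch, with the areas modelled as disjoint spheres; the sphere model is an illustrative assumption, not the patent's general definition:

```python
import math

def select_object(position, internal_areas):
    """Return the index of the object whose internal virtual work area
    contains `position`, or None while in idle mode.

    `internal_areas` is a list of (centre, radius) spheres, assumed
    mutually disjoint as required by claim 4.
    """
    for index, (centre, radius) in enumerate(internal_areas):
        # On or inside the boundary surface triggers posture operation mode.
        if math.dist(position, centre) <= radius:
            return index
    return None
```

Disjointness guarantees the loop can stop at the first hit: at most one area can contain the position, so the selected object is unambiguous.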
5. The method according to claim 4, wherein the internal virtual work area for a given three-dimensional object is the region formed by the shape of that object from which the intersections with the regions formed by the shapes of all the other three-dimensional objects have been removed.

6. The posture operation method according to claim 1, 2, or 4, wherein a cubic, rectangular-parallelepiped, or spherical region is used as the virtual work area.

7. The posture operation method according to any one of claims 1 to 6, wherein the operator is notified that the posture operation mode is active by changing the display color of the cursor indicating the three-dimensional position and/or of the figure indicating the virtual work area on the display monitor, by sounding a chime, or by vibrating a marker attached to the fingertip.

8. The posture operation method according to claim 7, wherein the position in space of a marker attached to the fingertip is used as the position in space.

9. The posture operation method according to claim 7, wherein the position in space of the fingertip is used as the position in space.

10. A three-dimensional object posture operation program for causing a computer to execute the three-dimensional object posture operation method according to any one of claims 1 to 9.
JP2003314335A 2003-09-05 2003-09-05 3D object posture operation method and program Expired - Fee Related JP3953450B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003314335A JP3953450B2 (en) 2003-09-05 2003-09-05 3D object posture operation method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003314335A JP3953450B2 (en) 2003-09-05 2003-09-05 3D object posture operation method and program

Publications (2)

Publication Number Publication Date
JP2005084817A true JP2005084817A (en) 2005-03-31
JP3953450B2 JP3953450B2 (en) 2007-08-08

Family

ID=34414982

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003314335A Expired - Fee Related JP3953450B2 (en) 2003-09-05 2003-09-05 3D object posture operation method and program

Country Status (1)

Country Link
JP (1) JP3953450B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010086367A (en) * 2008-10-01 2010-04-15 Sony Corp Positional information inputting device, positional information inputting method, program, information processing system, and electronic equipment
EP2383612A1 (en) 2010-04-30 2011-11-02 Fujifilm Corporation Lithographic printing plate precursor and plate making method thereof
JP2012208705A (en) * 2011-03-29 2012-10-25 Nec Casio Mobile Communications Ltd Image operation apparatus, image operation method and program
US8402393B2 (en) 2008-10-23 2013-03-19 Samsung Electronics Co., Ltd. Apparatus and method for manipulating virtual object
JP2015108870A (en) * 2013-12-03 2015-06-11 富士通株式会社 Operation input device, operation input program, and operation input method
WO2023238663A1 * 2022-06-07 2023-12-14 Sony Group Corp Information processing device and information processing method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4938617B2 (en) * 2007-10-18 2012-05-23 幸輝郎 村井 Object operating device and method for specifying marker from digital image frame data


Also Published As

Publication number Publication date
JP3953450B2 (en) 2007-08-08

Similar Documents

Publication Publication Date Title
JP4768196B2 (en) Apparatus and method for pointing a target by image processing without performing three-dimensional modeling
JP3859574B2 (en) 3D visual sensor
JP5122887B2 (en) Control method, apparatus, and medium for live-action base mobile device
US8660362B2 (en) Combined depth filtering and super resolution
JP5275978B2 (en) Controlling data processing
JP2019515407A (en) System and method for initializing a robot-learned route to travel autonomously
JP5182229B2 (en) Image processing apparatus, image processing method, and program
JP7162079B2 (en) A recording medium for recording a method, system and computer program for remotely controlling a display device via head gestures
JP3953450B2 (en) 3D object posture operation method and program
JP4694624B2 (en) Image correction apparatus and method, and computer program
JPWO2005096129A1 (en) Method and apparatus for detecting designated position of imaging apparatus, and program for detecting designated position of imaging apparatus
WO2005096130A1 (en) Method and device for detecting directed position of image pickup device and program for detecting directed position of image pickup device
JP2001166881A (en) Pointing device and its method
JP4041060B2 (en) Image processing apparatus and image processing method
JP2009134677A (en) Gesture interface system, wand for gesture input, application control method, camera calibration method, and control program
JP2009134677A6 (en) Gesture interface system, gesture input wand, application control method, camera calibration method, and control program
JP6425299B2 (en) Finger motion detection device, finger motion detection method, finger motion detection program, and virtual object processing system
Lee et al. A stereo-vision approach for a natural 3D hand interaction with an AR object
JP2002157079A (en) Method of discriminating intention
JP3763409B2 (en) 3D position input device
Radkowski et al. A hybrid tracking solution to enhance natural interaction in marker-based augmented reality applications
Fabian et al. One-point visual odometry using a RGB-depth camera pair
JP2005266471A (en) Image projection method and apparatus with pointing function, and program
US20230326056A1 (en) Information processing apparatus, information processing method, and program
WO2023062792A1 (en) Image processing device, image processing method, and storage medium

Legal Events

Date Code Title Description
RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20050621

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050722

RD03 Notification of appointment of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7423

Effective date: 20050722

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070124

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20070201

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070316

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20070411

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20070424

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100511

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110511

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120511

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130511

Year of fee payment: 6

LAPS Cancellation because of no payment of annual fees