JPS61196317A - Information input system - Google Patents

Information input system

Info

Publication number
JPS61196317A
JPS61196317A JP60036580A JP3658085A
Authority
JP
Japan
Prior art keywords
input device
information
information input
camera
reference point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP60036580A
Other languages
Japanese (ja)
Inventor
Kenji Mase
健二 間瀬
Yasuhito Suenaga
末永 康仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP60036580A priority Critical patent/JPS61196317A/en
Publication of JPS61196317A publication Critical patent/JPS61196317A/en
Pending legal-status Critical Current


Abstract

PURPOSE: To detect three-dimensional position information by setting up plural TV cameras with different viewing angles and processing their image pickup signals. CONSTITUTION: The X, Y and Z coordinates are detected from the finger-pointing positions seen by both cameras 1 and 2, and the results of detection are delivered to an information processor 4. An information input device 3 calculates the coefficients of the straight line connecting a reference point B, given from a reference point input device 5, and a finger-pointing position P. These coefficients are delivered to the processor 4, and the coordinates of an object on that straight line are then determined.

Description

[Detailed Description of the Invention]

[Industrial Field of Application]

The present invention relates to an inexpensive and simple information input system that directly captures the movement of a movable object, such as a human hand, as an image and uses image processing to convey to a computer the human intention expressed by that movement.

[Prior Art]

Displays with keyboards, light pens, tablets, mice, joysticks, and the like have long been widely used as devices for inputting information into computers.

On the other hand, in all ages and places, human gestures have played an important role as the most basic means of communicating intentions. Hand and head movements in particular are a means of conveying information that humans come to use before words as they grow from infancy. If human hand and head movements could be understood directly by a computer, this would be highly significant as a way to supplement input from devices such as keyboards and to realize a more natural man-machine interface.

Patent No. 1144011 (Suenaga and Yamagishi, "Information input system", Japanese Examined Patent Publication No. 57-1009, Japanese Patent Application No. 51-10408) partially solves the above problems by detecting the position of a fingertip or pointer from the image input from a TV camera, but it has the drawback that its range of use is limited to inputting two-dimensional position information within the field of view of the TV camera.

[Problems to Be Solved by the Invention]

The problem the present invention seeks to solve is therefore to make it possible to detect the position of a fingertip or pointer as three-dimensional position information from the images input from TV cameras and to input it into a computer.

Accordingly, it is an object of the present invention to provide an information input system that makes the above possible.

[Means and Operation for Solving the Problems]

In the present invention, a plurality of TV cameras are arranged with different viewing angles around a movable object such as a human fingertip, and the imaging signals from each camera are processed, making it possible to detect three-dimensional position information such as the position of the fingertip.

[Embodiment]

The present invention will now be described in detail with reference to the drawings.

Fig. 1 is an explanatory diagram showing an embodiment of the present invention using two TV cameras. In the figure, 1 is TV camera #1, which captures a plan-view image; 2 is TV camera #2, which captures an elevation image; 3 is an information input device that processes the captured images and outputs the three-dimensional coordinates of the target point; 4 is an information processing device or the like that uses the three-dimensional coordinates; and 5 is a reference point input device.

Note that the reference point input device 5 may also be incorporated into the information input device 3.

In the figure, the numbers 6 to 9 are used to explain the operation: 6 shows an example of a reference point, here the center of the person's face; 7 is the hand indicating a position; and 8 and 9 are fingertip positions.

Next, the operation will be explained. First, with the arrangement shown in Fig. 1, images of the hand 7 are input from the two TV cameras 1 and 2 placed above and to the side. By processing these images in the information input device 3, the three-dimensional coordinate position P = (X, Y, Z) of the fingertip (8 and 9 in the figure) is detected, and the result is output to the information processing device 4.
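The overall flow — one fingertip detection in each of the two views combined into a single 3D point — can be sketched under a simplifying orthographic assumption (the patent itself applies a perspective correction, described later with reference to Fig. 2). The function name, axis assignment, and tolerance below are our illustrative assumptions, not the patent's:

```python
# Sketch (not from the patent text): a minimal orthographic combination of the
# two fingertip detections.  Assume camera #1 looks straight down (its image
# gives X and Y) and camera #2 looks from the side (its image gives Y and Z);
# the shared Y coordinate lets the two detections be matched.
def combine_views(top_xy, side_yz, tolerance=5.0):
    """Merge a top-view (X, Y) detection and a side-view (Y, Z) detection
    into one 3D point P = (X, Y, Z).  The two Y values must agree to within
    `tolerance`, otherwise the detections belong to different points."""
    (x, y_top) = top_xy
    (y_side, z) = side_yz
    if abs(y_top - y_side) > tolerance:
        raise ValueError("detections do not match")
    return (x, (y_top + y_side) / 2.0, z)
```

For example, a top-view detection at (10, 20) and a side-view detection at (21, 5) merge into the 3D point (10, 20.5, 5).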

If the information input device 3 is then made to calculate the coefficients of the straight line connecting the point P and the reference point B (6 in the figure) given by the reference point input device 5, and to output them to the information processing device 4, the coordinates of an object or point intersecting that straight line can be determined.
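As an illustrative sketch (the names and the choice of surface are ours, not the patent's), the "coefficients" of the line through B and P can be represented as the point B and the direction vector P − B; intersecting that line with a known surface — here the plane z = 0 as a stand-in for a screen — yields the pointed-at coordinates that device 4 would use:

```python
def line_through(b, p):
    """Return (origin, direction) for the line from reference point B through P."""
    d = tuple(pi - bi for pi, bi in zip(p, b))
    return b, d

def intersect_with_plane_z0(origin, direction):
    """Point where the line origin + t*direction crosses the plane z = 0."""
    if direction[2] == 0:
        raise ValueError("line is parallel to the plane")
    t = -origin[2] / direction[2]
    return tuple(o + t * d for o, d in zip(origin, direction))
```

For instance, with the face center B = (0, 0, 100) and fingertip P = (10, 0, 50), the line crosses z = 0 at (20, 0, 0).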

As for the reference point B, the purpose can be served simply by fixing B at a suitable point in space. If, however, the center position of the face is held fixed and its coordinates are entered through the reference point input device 5, the system can be used even more naturally when inputting human gestures. In the future, it should also be possible to recognize the center position of the face with a TV camera or the like and use that as the reference point B.

Next, the information input device 3 first internally computes a pair of two-dimensional coordinates. For this, it uses as its concrete processing method a method such as the one described in the specification of the aforementioned Patent No. 1144011. In that patent's method, however, the object's distance from the TV camera varies little, so correcting positions in the image to positions in real space to account for variations in camera distance was almost unnecessary (or very simple).

To use that method directly, the TV cameras would need lenses with long focal lengths. The information input device 3 therefore performs a position correction so that detection is possible even with ordinary lenses and cameras placed relatively close. This correction operation is explained with reference to Fig. 2.

Fig. 2 shows the arrangement of Fig. 1 viewed from the origin O along the Y-axis. First, the equations of two three-dimensional straight lines 10 and 11 are computed from the absolute position coordinates of the two TV cameras and the coordinates of the indicated point P (8) in each image. These straight lines are the lines of sight (10, 11) of cameras #1 and #2 toward the indicated point P (8). Next, the intersection of these two equations is computed, and this gives the sought three-dimensional coordinates of the indicated point P (8).
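The Fig. 2 computation can be sketched as follows. With noisy measurements the two viewing rays rarely intersect exactly, so a common practical variant (our choice here, not stated in the patent) returns the midpoint of the shortest segment between the two rays as the estimated position P:

```python
import numpy as np

# Hedged sketch: each camera's position and its image detection define a 3D
# viewing ray.  We solve the standard closest-approach problem between the
# rays o1 + s*d1 and o2 + t*d2 and return the midpoint of the connecting
# segment as the triangulated point.
def triangulate(o1, d1, o2, d2):
    o1, d1, o2, d2 = map(np.asarray, (o1, d1, o2, d2))
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2   # squared lengths and dot product
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b                  # zero when the rays are parallel
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel")
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return (o1 + s * d1 + o2 + t * d2) / 2.0
```

When the two rays do intersect, as for ideal measurements, the midpoint coincides with the exact intersection: rays from (0, 0, 0) along the X-axis and from (5, 5, 0) along the Y-axis meet at (5, 0, 0).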

Although two TV cameras were used in the example above, three or more TV cameras can also be used simultaneously with similar processing. In that case the processing method becomes somewhat more complicated, but the principle is the same, and the detection accuracy when the hand or pointer is turned in various directions can be improved.

[Effects of the Invention]

According to the present invention, by providing two or more TV cameras, an information input device, and, depending on the case, a reference point input device, a three-dimensional video tablet, a virtual pointing stick in three-dimensional space, a wireless remote-control switch, and the like can be realized. Moreover, human hand movements are inherently used with a variety of meanings, and by developing various methods for recognizing the sequences of three-dimensional points input with the system of the present invention, the invention can be expected to serve a wide range of uses such as entering numerical values and entering commands.

As explained above, the present invention makes it possible to input position and direction information in three-dimensional space without using expensive special equipment, and therefore has the advantage that an inexpensive and convenient three-dimensional information device can be realized.

[Brief Description of the Drawings]

Fig. 1 is an explanatory diagram showing an embodiment of the present invention; Fig. 2 is a layout diagram of the TV cameras in Fig. 1 viewed from the origin O along the Y-axis.

Description of symbols: 1, 2: TV cameras; 3: information input device; 4: information processing device, etc.; 5: reference point input device; 6: reference point; 7: hand; 8, 9: fingertip positions; 10, 11: camera lines of sight (three-dimensional straight lines).

Agents: patent attorneys Akio Namiki and Kiyoshi Matsuzaki.

Claims (1)

[Claims]

1) An information input system comprising: a plurality of television cameras arranged at different viewing angles with respect to a movable object; and an information input device that receives and processes the imaging signals of the movable object from each of the cameras and determines and outputs the object's coordinate position in three-dimensional space.

2) An information input system comprising a plurality of television cameras arranged at different viewing angles with respect to a movable object and an information input device that receives and processes the imaging signals of the movable object from each of the cameras and determines and outputs the object's coordinate position in three-dimensional space, the system further comprising a reference input device for inputting into the information input device the coordinate position of a reference point located at a position different from the coordinate position of the movable object, wherein the information input device uses the two sets of coordinate positions to determine and output the coordinate position of a specific point on the straight line connecting them.
JP60036580A 1985-02-27 1985-02-27 Information input system Pending JPS61196317A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP60036580A JPS61196317A (en) 1985-02-27 1985-02-27 Information input system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP60036580A JPS61196317A (en) 1985-02-27 1985-02-27 Information input system

Publications (1)

Publication Number Publication Date
JPS61196317A true JPS61196317A (en) 1986-08-30

Family

ID=12473707

Family Applications (1)

Application Number Title Priority Date Filing Date
JP60036580A Pending JPS61196317A (en) 1985-02-27 1985-02-27 Information input system

Country Status (1)

Country Link
JP (1) JPS61196317A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0683512A (en) * 1992-02-07 1994-03-25 Internatl Business Mach Corp <Ibm> Method and device for inputting command and data
EP0571702A3 (en) * 1992-05-26 1994-10-12 Takenaka Corp Hand pointing type input unit and wall computer module.
JP2000181601A (en) * 1998-12-18 2000-06-30 Fujitsu General Ltd Information display system
US6674424B1 (en) 1999-10-29 2004-01-06 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
JP2004246814A (en) * 2003-02-17 2004-09-02 Takenaka Komuten Co Ltd Indication movement recognition device
JP2009134677A (en) * 2007-02-28 2009-06-18 Fuji Xerox Co Ltd Gesture interface system, wand for gesture input, application control method, camera calibration method, and control program
WO2009157792A1 (en) * 2008-06-24 2009-12-30 Rurin Oleg Stanislavovich Method for producing an effect on virtual objects
US7893920B2 (en) 2004-05-06 2011-02-22 Alpine Electronics, Inc. Operation input device and method of operation input
CH702146A1 (en) * 2009-11-04 2011-05-13 Ininet Solutions Gmbh A method for three-dimensional support of the manual operation of graphical user interfaces.
USRE42794E1 (en) 1999-12-27 2011-10-04 Smart Technologies Ulc Information-inputting device inputting contact point of object on recording surfaces as information
FR2971864A1 (en) * 2011-02-22 2012-08-24 Peugeot Citroen Automobiles Sa Virtual reality equipment i.e. immersive virtual reality environment equipment, for virtual reality interaction with human-machine interface car, has contact device with touch pad positioned at point where interface is intended to appear
JP2012198608A (en) * 2011-03-18 2012-10-18 Nec Personal Computers Ltd Input device and input method
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5870108A (en) * 1981-10-22 1983-04-26 Toray Ind Inc Measuring system for position or shape
JPS59142094A (en) * 1983-02-02 1984-08-15 工業技術院長 Three-dimensional environment display device
JPS59218972A (en) * 1983-01-28 1984-12-10 Mitsubishi Electric Corp Measuring device of three-dimensional position

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5870108A (en) * 1981-10-22 1983-04-26 Toray Ind Inc Measuring system for position or shape
JPS59218972A (en) * 1983-01-28 1984-12-10 Mitsubishi Electric Corp Measuring device of three-dimensional position
JPS59142094A (en) * 1983-02-02 1984-08-15 工業技術院長 Three-dimensional environment display device

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0683512A (en) * 1992-02-07 1994-03-25 Internatl Business Mach Corp <Ibm> Method and device for inputting command and data
EP0571702A3 (en) * 1992-05-26 1994-10-12 Takenaka Corp Hand pointing type input unit and wall computer module.
EP0829799A3 (en) * 1992-05-26 1998-08-26 Takenaka Corporation Wall computer module
JP2000181601A (en) * 1998-12-18 2000-06-30 Fujitsu General Ltd Information display system
USRE43084E1 (en) 1999-10-29 2012-01-10 Smart Technologies Ulc Method and apparatus for inputting information including coordinate data
US7342574B1 (en) 1999-10-29 2008-03-11 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
US6674424B1 (en) 1999-10-29 2004-01-06 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
USRE42794E1 (en) 1999-12-27 2011-10-04 Smart Technologies Ulc Information-inputting device inputting contact point of object on recording surfaces as information
JP2004246814A (en) * 2003-02-17 2004-09-02 Takenaka Komuten Co Ltd Indication movement recognition device
US7893920B2 (en) 2004-05-06 2011-02-22 Alpine Electronics, Inc. Operation input device and method of operation input
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
JP2009134677A (en) * 2007-02-28 2009-06-18 Fuji Xerox Co Ltd Gesture interface system, wand for gesture input, application control method, camera calibration method, and control program
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
WO2009157792A1 (en) * 2008-06-24 2009-12-30 Rurin Oleg Stanislavovich Method for producing an effect on virtual objects
CH702146A1 (en) * 2009-11-04 2011-05-13 Ininet Solutions Gmbh A method for three-dimensional support of the manual operation of graphical user interfaces.
FR2971864A1 (en) * 2011-02-22 2012-08-24 Peugeot Citroen Automobiles Sa Virtual reality equipment i.e. immersive virtual reality environment equipment, for virtual reality interaction with human-machine interface car, has contact device with touch pad positioned at point where interface is intended to appear
JP2012198608A (en) * 2011-03-18 2012-10-18 Nec Personal Computers Ltd Input device and input method

Similar Documents

Publication Publication Date Title
JP3114813B2 (en) Information input method
US8571258B2 (en) Method of tracking the position of the head in real time in a video image stream
WO2000007148A1 (en) Method and apparatus for three-dimensional input entry
JPS61196317A (en) Information input system
KR100361462B1 (en) Method for Acquisition of Motion Capture Data
EP2755115A1 (en) Method for detecting motion of input body and input device using same
JP6344530B2 (en) Input device, input method, and program
CN109243575B (en) Virtual acupuncture method and system based on mobile interaction and augmented reality
WO2013162236A1 (en) Transparent display virtual touch apparatus not displaying pointer
US11507192B2 (en) Gesture acquisition system
CN111353930A (en) Data processing method and device, electronic equipment and storage medium
KR100532525B1 (en) 3 dimensional pointing apparatus using camera
Kuno et al. Vision-based human interface with user-centered frame
JPH1080886A (en) Vision control robot
JPS61199178A (en) Information input system
JPH07160412A (en) Pointed position detecting method
CN107247424A (en) A kind of AR virtual switches and its method based on laser distance sensor
CN104536568B (en) Detect the dynamic control system of user&#39;s head and its control method
JPH08129449A (en) Signal input device
Miwa et al. Four-dimensional viewing direction control by principal vanishing points operation and its application to four-dimensional fly-through experience
JPS5856152B2 (en) 3D figure reading display device
TWI476678B (en) Interactive simulated-globe display system
JP2015106255A (en) Display device and program
US20240112421A1 (en) System and method of object tracking for extended reality environment
CN112000219B (en) Movable gesture interaction method for augmented reality game