JPS59139433A - Information processing device - Google Patents

Information processing device

Info

Publication number
JPS59139433A
JPS59139433A JP58011548A JP1154883A
Authority
JP
Japan
Prior art keywords
information
operator
character
correction
yama
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP58011548A
Other languages
Japanese (ja)
Inventor
Hirohide Endo
遠藤 裕英
Kunihiro Okada
邦弘 岡田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP58011548A priority Critical patent/JPS59139433A/en
Publication of JPS59139433A publication Critical patent/JPS59139433A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements

Abstract

PURPOSE: To shorten the time needed to correct characters entered through an optical character reader or the like, by letting the operator designate the correction position merely by looking at the information shown on the display screen, without key operations.

CONSTITUTION: Suppose, for example, that the kanji ''yama'' (''mountain'' in English) is displayed on a display device 14 and is to be corrected. When the operator 11 looks at the character ''yama'', the position of the operator's eye is detected by a television camera 13. The eye position is converted to coordinates and made to correspond to the character ''yama''. When the position of the character ''yama'' has been detected, a cursor automatically moves to that position to notify the operator. The operator speaks the correction into a microphone 12, and it is recognized by the speech recognition section of a processor 15. The processor replaces the character ''yama'' with the information from the operator and displays it on the display device. Correction information can also be entered by keyboard; in that case the detected eye-position information must not be lost.

Description

[Detailed Description of the Invention]

[Field of Application of the Invention]

The present invention relates to a device that can correct information displayed on a display device by simple means, and in particular to an information processing device suitable for Japanese word processors, image processors, and the like.

[Prior Art]

The problems of the prior art are described below, taking a conventional Japanese word processor as an example.

Conventionally the usual means of input to a word processor has been the keyboard; other conceivable means include voice input, online character recognition on a tablet, and optical character readers. All of these input methods are subject to input errors, so the entered information must be checked. The most common correction method is for the operator to move the cursor by hand to the character to be corrected. In that case, moving the cursor to the character position with shift keys and the like takes time, and the correction work becomes slow.

[Object of the Invention]

The object of the present invention is to provide a device that carries out the correction work described above without key operations by the operator: the operator merely looks at the location to be corrected (hereinafter called gazing), the device detects that position, and the correction is then made.

[Summary of the Invention]

The procedure for correcting information shown on a display device consists of repeating three steps: (1) look at the displayed information, (2) decide whether to correct it, (3) correct it. Of these, (3) is the correction work itself and (2) is recognition of the information, so the only step preceding the correction work is (1), and it is always performed. The present invention focuses on this point: from the position of the eyes when the displayed information is viewed, it determines which point on the display device is being gazed at, locates the information to be corrected on the basis of this, and then proceeds with the work by entering the correction, for example from a voice input device. This is the characteristic feature of the invention.

[Embodiment of the Invention]

FIG. 1 shows an embodiment of the present invention: FIG. 1(1) is an overview and FIG. 1(2) is the system configuration corresponding to FIG. 1(1). First, the operation of the present invention is outlined with reference to FIG. 1. In FIG. 1, 15 is a processing device such as a Japanese word processor and 14 is a display device. Suppose that the character "yama" ("mountain") is displayed on 14 and is to be corrected. When the operator 11 looks at the "yama" on 14, the television camera 13 detects the position of the operator's eyes. The eye image captured by 13 is converted by the processing device 15 into coordinates on 14 and associated with the character "yama".

When the position of "yama" has been detected, the cursor automatically moves to that position (alternatively, means such as brightness-modulating the "yama" may be used) to notify 11. Once the eye position has been detected, 11 speaks the correction into the microphone 12; the speech recognition section, also contained in 15, recognizes the information from 11 and sends the correction to the Japanese-language processing section in 15.

The Japanese-language processing section replaces "yama" with the information from 11 and displays the result on 14. The correction work is thereby completed.
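The gaze-then-speak correction flow of FIG. 1 can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: the `Display` class, the coordinate arguments, and the idea of passing already-recognized speech text are all stand-ins for the camera, speech recognition, and Japanese-language processing sections the patent describes.

```python
# Minimal sketch of the gaze-directed correction flow of FIG. 1.
# Gaze events arrive as (row, col) screen positions; speech arrives as
# already-recognized text. All names here are hypothetical stand-ins.

class Display:
    """A text screen holding one character per cell, plus a cursor."""
    def __init__(self, lines):
        self.lines = [list(line) for line in lines]
        self.cursor = (0, 0)

    def move_cursor(self, row, col):      # notify the operator of detected gaze
        self.cursor = (row, col)

    def replace_char(self, replacement):  # swap in the spoken correction
        row, col = self.cursor
        self.lines[row][col] = replacement

    def text(self):
        return ["".join(line) for line in self.lines]


def correct_by_gaze(display, gaze_position, spoken_correction):
    """One correction: gaze fixes the position, voice supplies the text."""
    display.move_cursor(*gaze_position)
    display.replace_char(spoken_correction)


d = Display(["東京は山", "晴れです"])
correct_by_gaze(d, (0, 3), "川")   # operator gazes at "山" and says "kawa"
print(d.text())                    # → ['東京は川', '晴れです']
```

The same `correct_by_gaze` call would serve the keyboard fallback mentioned below, as long as the gaze position detected earlier is retained.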

Of course, the correction may instead be entered by other means, for example a keyboard. In that case it is necessary to ensure that the detected eye-position information is not lost.

Next, the way the correspondence is established between the position of the operator's eyes and a position on the display device, which is a characteristic feature of the present invention, is explained with reference to FIGS. 2 and 3.

First, a means is described that measures the range over which the operator's eyes move and calibrates positions on the display device on the basis of this value. FIG. 2 shows calibration symbols 21, 22, 23, 24 displayed on the display device 14.

When the operator 11 instructs the processing device 15 to perform calibration, a calibration symbol is displayed at position 21 on 14. When 11 looks at this symbol and so informs 15, the eye is detected at position 31 on the image pickup tube of the television camera 13, as shown in FIG. 3(2). Once 31 has been detected, 15 displays the symbol at position 22.

11 then moves the eyes to position 22 and the same operation is performed. Continuing in the same way, the camera positions 31, 32, 33, 34 shown in FIG. 3(2) are detected, corresponding to 21, 22, 23, 24 shown in FIG. 2. Here the eye position is defined as the center of the pupil. The method of extracting the pupil center is not described here, but it can easily be obtained with ordinary image processing techniques.

However, if the operator's pupil position fluctuates over a small range, means for filtering this out must be provided. FIG. 3 shows the correspondence between 14 and 13. Let the distance between 21 and 22 be XL and the distance between 21 and 24 be YL, and let the correspondingly detected distances on the camera be X0 and Y0.

FIG. 4 is an explanatory diagram for obtaining an arbitrary point P on 14, shown in FIG. 3(1), from the point P' on 13.

The method for obtaining P is described below.

In FIG. 4, AA' represents the horizontal distance XL between 21 and 22 on 14. Let the eye position when looking at 21 be 31 and the eye position when looking at 22 be 32; the eye movement distance (the displacement of the pupil center) is then E1 + E2. Q corresponds to the retina, and C is the distance between the crystalline lens and the retina.

The distance H1 between 14 and Q is obtained from equation (1); the formula itself is carried only in the figure, but from the similar triangles of FIG. 4 it is presumably

H1 = XL · C / (E1 + E2)   … (1)

Here XL and C are given, and E1 + E2 can be measured at calibration time.

For a point P at a distance X from the center of the eye, equation (2) (presumably X = Ex · XL / (E1 + E2), by the same geometry) shows that X on the display device 14 can be obtained, independently of H1 and C, simply by measuring the deviation Ex on the television camera.
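Under the inferred form of equation (2) above (the patent's own formula is carried only in the figure, so this form is an assumption derived from the geometry), the gaze point is computed from camera quantities alone:

```python
# Sketch of the gaze computation of FIG. 4, assuming equation (2) has the
# similar-triangle form X = Ex * XL / (E1 + E2); this form is inferred
# from the surrounding description, not quoted from the patent.

def gaze_x(Ex, XL, E1_plus_E2):
    """Screen offset X of the gazed-at point from the camera deviation Ex.

    Independent of the viewing distance H1 and the lens-retina distance C:
    both cancel because Ex and E1+E2 are measured at the same distance.
    """
    return Ex * XL / E1_plus_E2

# Calibration measured a pupil displacement of E1+E2 = 16 camera units
# across a marker separation of XL = 320 screen units; a deviation of
# Ex = 4 units then places the gaze 80 screen units from the reference.
print(gaze_x(4, 320, 16))   # → 80.0
```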

Next, the case where 11 moves horizontally by E0 after calibration is described. Let Q' be the eye position after moving by E0, and let the eye movement distances when looking at 21 and 22 from Q' be E4 and E3. Then

E3 + E4 = E1 + E2   … (5)

That is, the eye movement distance is unchanged when the operator moves horizontally, so the value E1 + E2 obtained at calibration can still be used.

When point P is viewed from Q', the corresponding relation is obtained from the deviation Ex' on the television camera; the formula is carried only in the figure, but from the geometry it is presumably X = E0 + Ex' · XL / (E1 + E2). Here E0 is obtained as the movement distance of the eye's center, as described later. Therefore, even if the operator moves after calibration, which point on 14 is being viewed can be obtained by detecting the eye position on the television camera. Movement in the Y direction is handled in the same way.
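Because equation (5) keeps the calibration span E1 + E2 valid after a lateral move, only the reference point shifts by E0. A sketch under the inferred form X = E0 + Ex' · XL / (E1 + E2) (the exact equation is carried only in the figure):

```python
# Sketch of the lateral-movement compensation: equation (5) says the span
# E1+E2 measured at calibration is unchanged when the operator moves
# sideways by E0, so only the reference shifts. The compensated form
# X = E0 + Ex_prime * XL / (E1 + E2) is inferred from the geometry.

def gaze_x_after_move(Ex_prime, E0, XL, E1_plus_E2):
    """Gaze point on the display after the operator moves sideways by E0."""
    return E0 + Ex_prime * XL / E1_plus_E2

# With the calibration values XL = 320, E1+E2 = 16: the operator has
# moved 40 screen units sideways (E0 = 40) and the camera reads a
# deviation Ex' = 2, placing the gaze 40 units past the shifted center.
print(gaze_x_after_move(2, 40, 320, 16))   # → 80.0
```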

FIG. 5 is an explanatory diagram for obtaining the movement distance E0. For simplicity, the case of one eye is described. Suppose the eye is at position C and its contour is detected by image processing. When the eye likewise moves to position C', the movement distances E0x and E0y between the contour at that time and the contour at C are obtained. Knowing E0x and E0y, the movement of the eye, that is, the movement of the operator 11, can be detected.
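The patent leaves the contour-detection method to "ordinary image processing"; one minimal realization of the FIG. 5 idea is to compare the centroids of the eye region in the two frames. The binary masks below are toy stand-ins for camera frames.

```python
# Sketch of FIG. 5: detect the eye region in two frames and take the
# centroid shift as the operator's movement (E0x, E0y). A binary mask
# (list of rows of 0/1) stands in for the camera image; the patent does
# not prescribe this particular method.

def centroid(mask):
    """Centroid (x, y) of the set pixels of a binary image."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    return xs / n, ys / n

def eye_shift(mask_before, mask_after):
    """(E0x, E0y): how far the eye region moved between the two frames."""
    cx0, cy0 = centroid(mask_before)
    cx1, cy1 = centroid(mask_after)
    return cx1 - cx0, cy1 - cy0

before = [[0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
after  = [[0, 0, 0, 0],
          [0, 0, 1, 1],
          [0, 0, 1, 1]]
print(eye_shift(before, after))   # → (1.0, 1.0)
```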

Next, compensation for the case where the operator 11 moves forward or backward after calibration is described.

Let H1 be the operator-camera distance at calibration time and E1 + E2 the eye movement distance measured then, and let H2 be the distance after moving and Ew the eye movement distance measured there. The relation between them is carried only in the figure, but since the measured deviation scales inversely with distance it is presumably H1 · (E1 + E2) = H2 · Ew. H2 itself can be obtained from equation (10) if the camera magnification m is held constant:

H2 = f (1 + m)   … (10)

where f is the focal length of the lens.
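The depth compensation can be sketched as follows. The inverse-distance relation H1 · (E1 + E2) = H2 · Ew is an assumption inferred from the earlier geometry (only equation (10) itself appears in the text); the numbers are illustrative.

```python
# Sketch of the depth compensation: the calibration span E1+E2 measured
# at distance H1 reads Ew at the new distance H2. Assuming the deviation
# scales inversely with distance (consistent with H1 = XL*C/(E1+E2)),
# H2 = H1 * (E1 + E2) / Ew. Equation (10), H2 = f * (1 + m), gives the
# distance from the lens focal length f and magnification m instead.

def distance_after_move(H1, E1_plus_E2, Ew):
    """New operator-camera distance from the changed deviation span."""
    return H1 * E1_plus_E2 / Ew

def distance_from_magnification(f, m):
    """Equation (10): H2 = f * (1 + m)."""
    return f * (1 + m)

# Calibrated at H1 = 600 mm with a span of 16 units; the span now reads
# 12 units, so the operator has backed away to 800 mm.
print(distance_after_move(600, 16, 12))   # → 800.0
```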

Next, as a means of informing the operator 11 that 15 has confirmed which position on the display device 14 the operator is looking at: once the position on 14 has been obtained by the means described above, this can be realized by moving the cursor to the corresponding position on 14, or by brightness-modulating the information displayed at that position.

[Effect of the Invention]

As explained above, according to the present invention the position of a correction can be designated merely by looking at the information displayed on the screen, for example when correcting characters entered through an optical character reader, so the efficiency of correction work is markedly improved.

[Brief Description of the Drawings]

FIG. 1 shows an embodiment of the present invention; FIG. 2 shows an example of the display of calibration symbols on the display device; FIG. 3 shows the correspondence between the positions of the calibration symbols on the display device and their positions on the image pickup tube; FIG. 4 is an explanatory diagram for obtaining the position of a point on the display device from the position of the point on the image pickup tube; FIG. 5 shows the movement distance of the eye. 13: television camera.

Claims (1)

[Claims] An information processing device characterized by comprising: display means for displaying information; detection means for detecting which position on the display means an operator is looking at; and changing means for changing the information displayed at the detected position.
JP58011548A 1983-01-28 1983-01-28 Information processing device Pending JPS59139433A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP58011548A JPS59139433A (en) 1983-01-28 1983-01-28 Information processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP58011548A JPS59139433A (en) 1983-01-28 1983-01-28 Information processing device

Publications (1)

Publication Number Publication Date
JPS59139433A true JPS59139433A (en) 1984-08-10

Family

ID=11781010

Family Applications (1)

Application Number Title Priority Date Filing Date
JP58011548A Pending JPS59139433A (en) 1983-01-28 1983-01-28 Information processing device

Country Status (1)

Country Link
JP (1) JPS59139433A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6394232A (en) * 1986-10-08 1988-04-25 Canon Inc Control device for camera
US4768028A (en) * 1985-03-29 1988-08-30 Ferranti Plc Display control apparatus having a cursor
US6351273B1 (en) * 1997-04-30 2002-02-26 Jerome H. Lemelson System and methods for controlling automatic scrolling of information on a display or screen
US6437794B1 (en) * 1995-05-15 2002-08-20 Canon Kabushiki Kaisha Interactive image generation method and apparatus utilizing a determination of the visual point position of an operator
US6603491B2 (en) 2000-05-26 2003-08-05 Jerome H. Lemelson System and methods for controlling automatic scrolling of information on a display or screen


Similar Documents

Publication Publication Date Title
US6345111B1 (en) Multi-modal interface apparatus and method
US10930010B2 (en) Method and apparatus for detecting living body, system, electronic device, and storage medium
JP2758952B2 (en) Display Method for Japanese Document Reading and Translation System at Correction
US20170124386A1 (en) Method, device and computer-readable medium for region recognition
US20170124412A1 (en) Method, apparatus, and computer-readable medium for area recognition
US7224834B2 (en) Computer system for relieving fatigue
JPWO2008012905A1 (en) Authentication apparatus and authentication image display method
JP2006107048A (en) Controller and control method associated with line-of-sight
US11429200B2 (en) Glasses-type terminal
JP3382045B2 (en) Image projection system
JPS59139433A (en) Information processing device
CN114299587A (en) Eye state determination method and apparatus, electronic device, and storage medium
JP3307075B2 (en) Imaging equipment
CN114281236B (en) Text processing method, apparatus, device, medium, and program product
KR20160133335A (en) System for making dynamic digital image by voice recognition
EP3812951A1 (en) Augmenting biligual training corpora by replacing named entities
JPH0643851A (en) Device for displaying image
US20240089362A1 (en) Terminal Device
JP7031112B1 (en) Glasses type terminal
JPH06290301A (en) Character/graphic recognizing device
CN116229536A (en) Face image recognition method, device, equipment and storage medium
CN110969161B (en) Image processing method, circuit, vision-impaired assisting device, electronic device, and medium
JPH05204526A (en) View point monitoring input device
JPH0239307A (en) Plant monitor device
CN111506195A (en) Focal region highlighting system based on eyeball tracking equipment