WO2011038527A1 - A gaze point detection method and apparatus - Google Patents
A gaze point detection method and apparatus
- Publication number
- WO2011038527A1 (PCT/CN2009/001105)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- camera
- reference table
- gaze point
- screen
- Prior art date
Links
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
Definitions
- Embodiments of the present invention relate to the field of image processing, and in particular to a gaze point detection method and apparatus based on face detection and image measurement.
Background Art
- For some users, hand movement is limited for reasons such as physical disability or trauma, so the cursor movement described above can become difficult or even impossible.
- Even when hand function is normal, in some special cases it is desirable to perform the cursor movement described above without using a hand, or to shorten the distance the hand must move as much as possible.
- An aspect of the present invention provides a gaze point detecting apparatus for calculating a user's gaze point on a screen, the apparatus comprising: a camera for capturing a face image of the user; a reference table obtaining unit, configured to acquire a reference table describing the relationship between reference face images and the user's line-of-sight direction; and a calculation unit that performs image measurement on the face image captured by the camera and searches the reference table in the reference table obtaining unit to calculate the user's gaze point on the screen.
- the reference table obtaining unit includes at least one of the following: a reference table construction unit, which constructs the reference table from at least one reference face image of the user captured by the camera; and a reference table storage unit, in which an already constructed reference table is stored.
- the calculation unit includes: a line-of-sight direction calculation unit that, based on the position of the camera, measures the distance between the camera and the midpoint between the user's two pupils in the face image, and calculates the user's line-of-sight direction by searching the reference table;
- the fixation point calculation unit calculates the gaze point of the user on the screen according to the position of the camera, the distance between the midpoint between the user's two pupils and the camera, and the direction of the user's line of sight.
- the gaze point detecting device further comprises: a cursor moving unit, after the gaze point is calculated, if the gaze point is located within the screen, the cursor moving unit moves the cursor on the screen to the gaze point.
- the cursor movement unit does not move the cursor if the distance between the fixation point and the current cursor is less than a predefined value.
- the gaze point detecting device further comprises: an accessory unit for performing operations at the cursor position.
- the accessory unit includes at least one of a mouse, a keyboard, a touch pad, a handle, and a remote controller.
- Another aspect of the present invention provides a gaze point detecting method for calculating a user's gaze point on a screen, the method comprising the following steps: a reference table obtaining step, acquiring a reference table of the relationship between reference face images and the user's line-of-sight direction; and a gaze point calculation step, capturing the user's face image with the camera, performing image measurements, and looking up the reference table to calculate the user's gaze point on the screen.
- the reference table obtaining step comprises: acquiring at least one reference face image of the user using the camera to construct a reference table including a relationship between the reference face image and the direction of the user's line of sight; or directly acquiring the already constructed reference table.
- the gaze point calculation step comprises: measuring, based on the position of the camera, the distance between the camera and the midpoint between the user's two pupils in the face image, and calculating the user's line-of-sight direction by searching the reference table; and calculating the user's gaze point on the screen from the position of the camera, that distance, and the user's line-of-sight direction.
- the gaze point detecting method further comprises: after calculating the gaze point, if the gaze point is located within the screen, moving the cursor on the screen to the gaze point.
- the distance between the fixation point and the current cursor is less than a predefined value, the cursor is not moved.
- the predefined value can be set as needed.
- Yet another aspect of the present invention provides a multi-screen computer having a plurality of screens surrounding a user, the multi-screen computer including the gaze point detecting device of the present invention.
- FIG. 1 is a block diagram showing an embodiment of a gaze point detecting apparatus according to the present invention
- FIG. 2b is a flowchart showing sub-steps of the fixation point detection method of FIG. 2a
- FIG. 3 is a schematic diagram showing a reference face image in an exemplary coordinate system
- FIG. 4 is a schematic diagram of an exemplary facial image
- Figure 5a is a schematic diagram showing different face directions
- Figure 5b is a coded diagram showing different face directions
- Figure 6a is a schematic diagram showing an eyeball model in different directions
- Figure 6b is a schematic diagram showing the relationship between the vertical angle and the horizontal angle of the eyeball model in the exemplary coordinate system
- Figure 7 is a schematic diagram showing the relationship between the projection circle radius and the cone apex angle
- Figure 8 is a diagram showing the angle a' between the line connecting the projections of the camera and the user in the image (A₀'B') and the X axis (A₀'C')
- Figure 9 is a schematic diagram showing gaze point detection in accordance with the present invention.
- Figure 10 is a block diagram showing an example of an eye direction table
- Fig. 11 is a block diagram showing one example of a projected circle radius-cone apex angle table.
Detailed Description
- FIG. 1 is a block diagram showing an embodiment of a fixation point detecting apparatus 100 according to the present invention.
- the fixation point detecting apparatus 100 includes a camera 102, a reference table acquisition unit 104, and a calculation unit 106.
- the camera 102 can be a conventional camera in the art for capturing a facial image of a user.
- the reference table acquisition unit 104 is configured to acquire a reference table including a relationship between the reference face image and the direction of the user's line of sight.
- the calculation unit 106 can calculate the line of sight direction of the user through the reference table, and then calculate the gaze point of the user on the screen 108.
- a 3-axis coordinate system as shown in Fig. 3 can be established, with its origin in the upper left corner of the screen. From the perspective of the computer user, the axis extending from left to right along the upper edge of the screen is the X axis; the axis extending from top to bottom along the left edge of the screen is the Y axis; and the axis perpendicular to the screen, extending from far (the screen) to near (the user), is the Z axis.
- the camera 102 is mounted at a point A with coordinates (x₁, y₁, 0). As shown in Figure 4, point B is the midpoint between the two pupils of the user.
- the AB distance is the distance between point A (the position of the camera) and point B.
- the pupil distance is the distance between the centers of the two pupils of the user in the image.
- the plane Pb refers to a plane perpendicular to the straight line AB where the point B is located.
- the Yb axis is the line of intersection between the vertical plane containing the line AB and the plane Pb.
- the Xb axis is a line perpendicular to the Yb axis in the plane Pb.
- the distance between point A and point B can be detected according to the size of the face image or the related component distance.
- a reference face image is introduced. As shown in FIG. 3, a reference face image is an image captured by the camera when the user's face is directly facing the camera and the AB distance (the distance between the camera and the midpoint of the two pupils) is a known value such as D₀. Since there may be relative error, the more reference face images there are, the smaller the relative error and the more accurate the detection result. For example, two reference face images may be introduced, one with an AB distance of D₀ and the other with an AB distance of D₁.
- the center of each pupil can be located, so that the distance P between the centers of the two pupils can be obtained, as shown in FIG. 4. If the user's face image is the reference face image at distance D₀, the distance between the centers of the two pupils is the reference pupil distance P₀. If the user's face image is the reference face image at distance D₁, the distance between the centers of the two pupils is the reference pupil distance P₁.
- the reference table includes an eye direction table and a projected circle radius-cone apex angle table, which are described in detail below with reference to Figs. 10 and 11.
- Figure 5a shows possible face directions. According to its orientation, the face direction is roughly divided into nine directions, and each direction is encoded, as shown in Fig. 5b.
- the contour of the user's eye pupil can be determined simultaneously.
- the user's eyeball can be regarded as a sphere, and the pupil can be regarded as a circle on the surface of the eyeball. Also, the pupil will point directly to the gaze point on the screen.
- Figure 6a shows an eyeball model in two different eyeball directions. As shown in Figure 6a, when the user looks in a different direction, the pupil changes direction with the eyeball. In the image obtained by the camera, the contour of the pupil changes from one ellipse to another. Based on the contour of the pupil and the direction of the face, the rotation angles of each eyeball can be obtained, including:
- θVer, the angle between the pupil direction and the Yb axis, and θHor, the angle between the pupil direction and the Xb axis
- the table includes at least five columns of information: the first column is the index; the second column is the vertical rotation angle θVer; the third column is the horizontal rotation angle θHor; the fourth column shows the corresponding approximate face direction; and the fifth column contains images of the contour of the eye (pupil) after the vertical and horizontal rotations.
- the values of the second column (θVer) and the third column (θHor) vary between 0.0 and 180.0 degrees. As shown in Figure 6b, the values of θVer and θHor must correspond to a point O on the spherical surface.
- the range covered by the eye direction table corresponds to sample points of θVer and θHor on the side of the spherical surface in Figure 6b that faces the camera (i.e., the negative direction of the Z axis), for which the camera can see the contour of the pupil.
- the default angle increment is 0.1 degrees.
- Figure 10 only shows the table contents for the pupil at points M, N, Q and Q' (in an actual implementation the index column holds incrementing integer values such as 1, 2, 3; for convenience of writing they appear here as IM, IN, IQ, etc.).
- the table is used as follows: after obtaining the eye image, extract the contour of the left (or right) pupil and find the best-matching contour in the table to obtain the angles θVer-L and θHor-L (or θVer-R and θHor-R). As can be seen from the table, points symmetric about the center of the sphere in Figure 6b, such as point Q and point Q', present the same pupil contour to the camera; this ambiguity is resolved using the face direction. In actual operation, the interpolation density can be increased for the ranges of θVer and θHor corresponding to the user's usual position relative to the camera 102 and to the size of the screen, which helps improve the accuracy of the results.
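The geometry behind the eye direction table can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the eyeball is modeled as a unit sphere, the pupil as a circle on its surface, and the camera sees its orthographic projection along the Z axis; all function names are hypothetical.

```python
import math

def pupil_direction(theta_ver_deg, theta_hor_deg):
    """Unit vector of the pupil direction given its angles to the Yb and Xb axes.

    The Z component is chosen negative so the pupil faces the camera; the
    constraint cos^2(theta_hor) + cos^2(theta_ver) <= 1 is the Fig. 6b
    requirement that the angles correspond to a point on the sphere.
    """
    cy = math.cos(math.radians(theta_ver_deg))  # component along the Yb axis
    cx = math.cos(math.radians(theta_hor_deg))  # component along the Xb axis
    cz2 = 1.0 - cx * cx - cy * cy
    if cz2 < 0:
        return None  # no such point on the spherical surface
    return (cx, cy, -math.sqrt(cz2))

def projected_pupil_axes(direction, pupil_radius=1.0):
    """Semi-axes of the ellipse the camera sees for a pupil circle with the
    given normal, under orthographic projection along the Z axis."""
    nz = abs(direction[2])
    return pupil_radius, pupil_radius * nz  # (semi-major, semi-minor)

# Looking straight at the camera (both angles 90 degrees): contour is a circle.
d = pupil_direction(90.0, 90.0)
major, minor = projected_pupil_axes(d)
```

A table builder would sample (θVer, θHor) over the camera-facing hemisphere at the chosen angular increment and store the resulting ellipse for each row.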
- FIG. 11 shows the relationship between all possible cone apex angles and projected circle radii for a certain type of camera.
- the distance units used in the table are pixels, which can be converted into other units.
- the radius of the projected circle ranges from 0 to RMAX.
- RMAX is the farthest distance from the center of the image to a corner of the image.
- the contents of the table can be set according to different cameras, because different cameras have different resolutions, focal lengths and wide angles.
- the recommended projected circle radius increment is 5 pixels. The finer the granularity, the more accurate the result, but the more time the calculations and comparisons take.
- the projected circle radius-cone apex angle table shown in Fig. 11 uses a radius increment of 10 pixels, an image width of 200 pixels, and a maximum camera viewing angle of 40 degrees (20 degrees to each side).
- the interpolation density is increased for the cone apex angles corresponding to positions where the user is usually located, which helps improve the accuracy of the result.
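Under a simple pinhole-camera assumption, such a table can be generated and searched as below. This is a sketch only: the focal length in pixels is derived from the example figures of a 200-pixel image width and a 20-degree half view angle, which may differ for a real camera, and the function names are illustrative.

```python
import math

def build_radius_apex_table(r_max, step_px, half_width_px, half_angle_deg):
    """Map projected-circle radius (pixels) -> cone apex angle (degrees) for a
    pinhole camera: theta = atan(r / f), with f = half_width / tan(half_angle)."""
    f = half_width_px / math.tan(math.radians(half_angle_deg))
    return [(r, math.degrees(math.atan(r / f)))
            for r in range(0, r_max + 1, step_px)]

def lookup_apex(table, radius_px):
    """Return the apex angle of the row whose radius best matches radius_px."""
    return min(table, key=lambda row: abs(row[0] - radius_px))[1]

# 10-pixel increments, 200-pixel-wide image, 20-degree half view angle,
# matching the example figures quoted for Fig. 11.
table = build_radius_apex_table(r_max=140, step_px=10,
                                half_width_px=100, half_angle_deg=20.0)
```

At the image edge (radius 100 pixels) the lookup recovers the 20-degree half view angle, as the example figures require.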
- the reference table acquisition unit 104 includes a reference table construction unit 1042, which constructs the eye direction table and the projected circle radius-cone apex angle table described above from the reference face images captured by the camera 102 at distances D₀ and D₁.
- the reference table acquisition unit 104 may further include a reference table storage unit 1044. If the reference table has been constructed and saved in the reference table storage unit 1044, the reference table acquisition unit 104 can directly read therefrom. Further, the reference table constructed by the reference table construction unit 1042 can be stored in the reference table storage unit 1044.
- the calculation unit 106 may include a line-of-sight direction calculation unit 1062 and a gaze point calculation unit 1064.
- the line-of-sight direction calculation unit 1062 measures, based on the position of the camera, the distance between the camera and the midpoint between the user's two pupils in the face image, and calculates the user's line-of-sight direction by searching the reference table. Specifically, the line-of-sight direction calculation unit 1062 detects the general direction of the user's face, the contours of the user's eyes and pupils, and the pupil distance P using a mature face detection/recognition library such as OpenCV, and then uses the measured distance P and the reference values P₀ and D₀ to calculate the AB distance L.
- the distance and the image size are inversely proportional: L × P ≈ D₀ × P₀, so the AB distance is L = D₀ × P₀ / P.
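The inverse-proportionality step above can be sketched directly (a minimal illustration; the parameter names and example values are not from the patent):

```python
def ab_distance(pupil_px, ref_distance, ref_pupil_px):
    """Estimate the camera-to-face distance L from the measured pupil
    spacing P, using the inverse proportionality L * P = D0 * P0."""
    return ref_distance * ref_pupil_px / pupil_px

# At the reference pupil spacing the estimate reproduces the reference
# distance; at half the spacing the face is twice as far away.
L = ab_distance(pupil_px=80.0, ref_distance=60.0, ref_pupil_px=80.0)
```

With two reference images (D₀, P₀) and (D₁, P₁), the two estimates can be averaged to reduce the relative error, as the description suggests.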
- the line-of-sight direction calculation unit 1062 further calculates the angles a and θ. Specifically, a is the angle, in plane P2, between the line A₀B and the X axis, where A₀ is the vertical projection of point A onto plane P2 and B is the midpoint between the two pupils (as shown in Figure 9). Since plane P2 is parallel to plane P1, this angle a is the same as the projection angle a' in the camera image.
- Figure 8 shows the points A₀', B', C' and the angle a' within the image; they satisfy:
- A₀'B' × sin(a') = B'C' (8)
- the line-of-sight direction calculation unit 1062 can search the projected circle radius-cone apex angle table for the row whose projected circle radius best matches the length A₀'B'; the cone apex angle of that row is the angle θ. The line-of-sight direction calculation unit 1062 then calculates the coordinates of point B. Using the previously obtained results, when point B is at the lower left of point A₀ (viewing the image from the front, as shown in Fig. 9, likewise below), the coordinates (x₃, y₃, z₃) of point B can be calculated according to the following equation:
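One plausible reading of this step can be sketched as follows (the patent's own equation and its sign conventions for the Y-down coordinate system are not reproduced in this text, so this is an assumption-laden illustration only): point B lies on a cone of apex angle θ around the camera axis, at in-image angle a, at distance L from the camera.

```python
import math

def point_b(camera_xy, L, apex_deg, a_deg):
    """Coordinates of B given camera position A = (x1, y1, 0) on the screen
    plane, the AB distance L, the cone apex angle theta, and the in-plane
    angle a. The camera axis is taken along +Z (toward the user)."""
    x1, y1 = camera_xy
    theta = math.radians(apex_deg)
    a = math.radians(a_deg)
    return (x1 + L * math.sin(theta) * math.cos(a),
            y1 + L * math.sin(theta) * math.sin(a),
            L * math.cos(theta))

# A user on the camera axis (apex angle 0) sits directly in front of A.
b = point_b((10.0, 5.0), 50.0, 0.0, 0.0)
```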
- the line-of-sight direction calculation unit 1062 calculates the eyeball rotation angles. Specifically, the contour of the left pupil is detected in the camera image and the best-matching contour is found in the eye direction table; combined with the face direction, this yields the eyeball's vertical rotation angle θVer-L relative to the Yb axis and horizontal rotation angle θHor-L relative to the Xb axis. The right-eye angles θVer-R and θHor-R can be obtained by the same procedure.
- the line-of-sight direction calculating unit 1062 calculates the direction of the user's line of sight:
- the direction of the above line of sight is relative to the Xb axis and the Yb axis in the plane Pb, and needs to be further converted into an angle with respect to the X axis and the Y axis.
- the line-of-sight direction calculation unit 1062 calculates the angle between the horizontal axis Xb of plane Pb and the horizontal axis X of plane P1, and the angle between the Yb axis and the vertical axis Y of plane P1. As shown in Figure 9, they satisfy:
- the line-of-sight direction calculation unit 1062 can then calculate the final angles θVer-Final and θHor-Final.
- the gaze point calculation unit 1064 calculates the user's gaze point on the screen 108 from the position of the camera, the distance between the camera and the midpoint between the user's two pupils, and the user's line-of-sight direction. Specifically, using the coordinates of point B and the angles θVer-Final and θHor-Final calculated by the line-of-sight direction calculation unit 1062, the gaze point calculation unit 1064 calculates the coordinates of the gaze point D on the screen 108 according to the following equation:
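This computation amounts to extending a ray from B along the line-of-sight direction until it meets the screen plane z = 0. A sketch under the same convention as the eye direction table (the direction's X and Y components are cosines of its angles to the X and Y axes, Z component toward the screen; the patent's exact equation is not reproduced here and all names are illustrative):

```python
import math

def gaze_point(b, theta_ver_deg, theta_hor_deg):
    """Intersect the line of sight from B with the screen plane z = 0.

    theta_hor_deg / theta_ver_deg are the final angles of the line of sight
    to the X and Y axes; the Z component is negative (toward the screen).
    """
    dx = math.cos(math.radians(theta_hor_deg))
    dy = math.cos(math.radians(theta_ver_deg))
    dz = -math.sqrt(max(0.0, 1.0 - dx * dx - dy * dy))
    if dz == 0.0:
        return None  # line of sight parallel to the screen plane
    t = -b[2] / dz                      # ray parameter at z = 0
    return (b[0] + t * dx, b[1] + t * dy, 0.0)

# Looking straight at the screen from (30, 20, 50) hits (30, 20, 0).
d_point = gaze_point((30.0, 20.0, 50.0), 90.0, 90.0)
```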
- the fixation point detecting device 100 may further include a cursor moving unit 112.
- the cursor moving unit 112 judges whether or not the cursor needs to be moved. Move the cursor to the fixation point if needed. Otherwise, the cursor is not moved.
- there may be some deviation between the true gaze point and the calculated gaze point D due to the accuracy of the calculation and other factors.
- the concept of a gaze area is introduced here: a circular area on the screen centered on point D (the calculated gaze point) with a predefined length G as its radius. Whenever a new gaze point D is obtained, the cursor is not moved if the gaze point is outside the displayable range of the screen. Likewise, the cursor is not moved as long as the distance between the current cursor and point D is less than the predefined value G. Otherwise, the cursor is moved to the gaze point D.
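The cursor-moving rule can be sketched directly from the two conditions above (a minimal illustration; the screen size and the value of G are hypothetical parameters):

```python
import math

def maybe_move_cursor(gaze, cursor, screen_w, screen_h, g_radius):
    """Return the new cursor position given a freshly calculated gaze point D.

    The cursor stays put when D is off-screen, or when the current cursor is
    already within the gaze area of radius G around D."""
    x, y = gaze
    if not (0 <= x <= screen_w and 0 <= y <= screen_h):
        return cursor  # gaze point outside the displayable range
    if math.hypot(x - cursor[0], y - cursor[1]) < g_radius:
        return cursor  # within the gaze area: avoid jittery movement
    return gaze        # move the cursor to the gaze point D

cur = maybe_move_cursor((500.0, 300.0), (100.0, 100.0), 1920, 1080, 40.0)
```

A larger G trades positioning precision for fewer cursor updates, which is the load-reduction trick mentioned later in the description.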
- the fixation point detecting device 100 may further include an accessory unit 110.
- the user can perform operations at the cursor using one or more of the attached units, such as the mouse, keyboard, touch pad, handle, and remote control. For example, the user can click or double-click with the mouse, or use the handle or remote control to perform various key operations.
- the method begins at S20.
- preparatory work is performed.
- the preparation work includes: collecting reference face images with the camera; in this embodiment, reference face images at distances D₀ and D₁ are obtained.
- the reference face image is crucial for the user's face detection/recognition.
- the distance between the centers of the two pupils in each reference face image is obtained as the reference pupil distance P₀ (or P₁, respectively).
- the location of the camera, i.e., the coordinates (x₁, y₁, 0) of point A, is determined.
- at step S24, gaze point detection is performed.
- the specific steps of gaze point detection are shown in Figure 2b. Specifically, at S241, the user's face, the pupil contours, and the pupil distance P are detected. At S243, the measured distance P and the reference values P₀ and D₀ are used to calculate the AB distance L. At step S245, the angles θ and a are obtained. At step S247, the coordinates of point B are calculated. Thereafter, at step S249, the eyeball rotation angles are calculated.
- the contour of the left pupil is detected and the best-matching contour is found in the eye direction table; combined with the face direction, this yields the eyeball's vertical rotation angle θVer-L relative to the Yb axis and horizontal rotation angle θHor-L relative to the Xb axis. The right-eye angles θVer-R and θHor-R can be obtained following the same steps.
- the direction of the user's line of sight is then calculated.
- at step S251, the coordinates (x₄, y₄, 0) of the gaze point D on the screen 108 are calculated based on the computed direction of the user's line of sight.
- at step S26, it is optionally judged whether the cursor needs to be moved. If so, the cursor is moved to the gaze point at step S28; otherwise, the cursor is not moved. Thereafter, the method flow may return to step S24 to perform gaze point detection cyclically, or, if terminated, end at S30.
- the present invention provides a gaze point detection method and apparatus based on face detection and image measurement.
- a user's gaze point on the screen is calculated, and the cursor can be moved to this area.
- the possible gaze area can be calculated and the cursor is moved into the area, after which the user manually moves the cursor to the desired precise position, which greatly shortens the actual moving distance of the user and reduces the time.
- the calculation load of the gaze point detecting device can be reduced by intentionally setting a large predefined radius G according to the actual accuracy of the device.
- the detection method and apparatus of the present invention can also be applied to a multi-screen computer having a plurality of screens surrounding a user.
- the specific implementation is: when there are multiple screens, determine the orientation of each screen and its angular relationship with the plane in which the camera lies.
- using the principle of the present invention described above, the gaze point is finally obtained by calculating the intersection of the extended line of sight with the relevant screen plane.
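For the multi-screen case, the same gaze ray can be tested against each screen's plane in turn. A sketch in which each screen is described by an origin corner, two unit edge vectors, and its width and height (this screen representation and all names are assumptions, not the patent's data structures):

```python
def ray_screen_hit(b, d, origin, u, v, w, h):
    """Intersect the gaze ray B + t*d with one screen's plane and report the
    in-screen coordinates (s, r) if the hit lies inside the w x h rectangle.

    origin is one corner of the screen; u and v are unit vectors along its
    horizontal and vertical edges; the plane normal is u x v."""
    n = (u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0])
    denom = sum(n[i] * d[i] for i in range(3))
    if abs(denom) < 1e-12:
        return None  # ray parallel to this screen's plane
    t = sum(n[i] * (origin[i] - b[i]) for i in range(3)) / denom
    if t <= 0:
        return None  # this screen lies behind the user
    p = tuple(b[i] + t * d[i] for i in range(3))
    rel = tuple(p[i] - origin[i] for i in range(3))
    s = sum(rel[i] * u[i] for i in range(3))  # horizontal in-screen coordinate
    r = sum(rel[i] * v[i] for i in range(3))  # vertical in-screen coordinate
    return (s, r) if 0 <= s <= w and 0 <= r <= h else None

# Front screen lying in the z = 0 plane; user at z = 50 looking straight ahead.
hit = ray_screen_hit((30.0, 20.0, 50.0), (0.0, 0.0, -1.0),
                     (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
                     1920, 1080)
```

Testing every screen and keeping the hit with the smallest positive t selects the screen the user is actually looking at.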
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Ophthalmology & Optometry (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Image Processing (AREA)
- Eye Examination Apparatus (AREA)
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200980159898.4A CN102473033B (zh) | 2009-09-29 | 2009-09-29 | 一种注视点检测方法及其装置 |
JP2012531204A JP5474202B2 (ja) | 2009-09-29 | 2009-09-29 | 顔検出および画像測定に基づいて注視点を検出する方法および装置 |
KR1020127010876A KR101394719B1 (ko) | 2009-09-29 | 2009-09-29 | 주시점을 검출하는 방법, 장치 및 다중 스크린 컴퓨터 |
EP20090849935 EP2485118A4 (en) | 2009-09-29 | 2009-09-29 | METHOD FOR DETECTING VISUALIZATION POINTS AND CORRESPONDING APPARATUS |
US13/496,565 US20120169596A1 (en) | 2009-09-29 | 2009-09-29 | Method and apparatus for detecting a fixation point based on face detection and image measurement |
PCT/CN2009/001105 WO2011038527A1 (zh) | 2009-09-29 | 2009-09-29 | 一种注视点检测方法及其装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2009/001105 WO2011038527A1 (zh) | 2009-09-29 | 2009-09-29 | 一种注视点检测方法及其装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011038527A1 true WO2011038527A1 (zh) | 2011-04-07 |
Family
ID=43825476
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2009/001105 WO2011038527A1 (zh) | 2009-09-29 | 2009-09-29 | 一种注视点检测方法及其装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20120169596A1 (zh) |
EP (1) | EP2485118A4 (zh) |
JP (1) | JP5474202B2 (zh) |
KR (1) | KR101394719B1 (zh) |
CN (1) | CN102473033B (zh) |
WO (1) | WO2011038527A1 (zh) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013210742A (ja) * | 2012-03-30 | 2013-10-10 | Fujitsu Ltd | 情報処理装置、情報処理方法、および情報処理プログラム |
CN103455298A (zh) * | 2013-09-06 | 2013-12-18 | 深圳市中兴移动通信有限公司 | 一种外来数据显示方法和外来数据显示设备 |
CN103870097A (zh) * | 2012-12-12 | 2014-06-18 | 联想(北京)有限公司 | 信息处理的方法及电子设备 |
CN104461005A (zh) * | 2014-12-15 | 2015-03-25 | 东风汽车公司 | 一种车载屏幕开关控制方法 |
WO2016115872A1 (zh) * | 2015-01-21 | 2016-07-28 | 成都理想境界科技有限公司 | 双目ar头戴显示设备及其信息显示方法 |
CN108986766A (zh) * | 2014-01-15 | 2018-12-11 | 麦克赛尔株式会社 | 信息显示终端以及信息显示方法 |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140053115A1 (en) * | 2009-10-13 | 2014-02-20 | Pointgrab Ltd. | Computer vision gesture based control of a device |
KR101699922B1 (ko) * | 2010-08-12 | 2017-01-25 | 삼성전자주식회사 | 하이브리드 사용자 추적 센서를 이용한 디스플레이 시스템 및 방법 |
US8433710B2 (en) | 2010-09-09 | 2013-04-30 | Ebay Inc. | Sizing content recommendation system |
KR101231510B1 (ko) * | 2010-10-11 | 2013-02-07 | 현대자동차주식회사 | 운전자 주시방향 연동 전방충돌 위험경보 시스템, 그 방법 및 그를 이용한 차량 |
US20140111452A1 (en) * | 2012-10-23 | 2014-04-24 | Electronics And Telecommunications Research Institute | Terminal and method of controlling touch operations in the terminal |
JP6157165B2 (ja) * | 2013-03-22 | 2017-07-05 | キヤノン株式会社 | 視線検出装置及び撮像装置 |
CN105190515A (zh) | 2013-05-08 | 2015-12-23 | 富士通株式会社 | 输入装置以及输入程序 |
CN103413467A (zh) * | 2013-08-01 | 2013-11-27 | 袁苗达 | 可控强制引导型自主学习系统 |
JP6260255B2 (ja) * | 2013-12-18 | 2018-01-17 | 株式会社デンソー | 表示制御装置およびプログラム |
EP4250738A3 (en) * | 2014-04-22 | 2023-10-11 | Snap-Aid Patents Ltd. | Method for controlling a camera based on processing an image captured by other camera |
JP6346018B2 (ja) * | 2014-07-18 | 2018-06-20 | 国立大学法人静岡大学 | 眼球計測システム、視線検出システム、眼球計測方法、眼球計測プログラム、視線検出方法、および視線検出プログラム |
CN105183169B (zh) * | 2015-09-22 | 2018-09-25 | 小米科技有限责任公司 | 视线方向识别方法及装置 |
US9830708B1 (en) * | 2015-10-15 | 2017-11-28 | Snap Inc. | Image segmentation of a video stream |
CN106123819B (zh) * | 2016-06-29 | 2018-07-24 | 华中科技大学 | 一种注意力焦点测量方法 |
CN106325505B (zh) * | 2016-08-17 | 2019-11-05 | 传线网络科技(上海)有限公司 | 基于视点跟踪的控制方法和装置 |
EP3305176A1 (en) | 2016-10-04 | 2018-04-11 | Essilor International | Method for determining a geometrical parameter of an eye of a subject |
CN106444404A (zh) * | 2016-10-29 | 2017-02-22 | 深圳智乐信息科技有限公司 | 一种控制方法及系统 |
CN106569467A (zh) * | 2016-10-29 | 2017-04-19 | 深圳智乐信息科技有限公司 | 基于移动终端选择场景的方法及系统 |
CN106444403A (zh) * | 2016-10-29 | 2017-02-22 | 深圳智乐信息科技有限公司 | 一种智能家居场景设置和控制的方法及系统 |
CN107003744B (zh) * | 2016-12-01 | 2019-05-10 | 深圳前海达闼云端智能科技有限公司 | 视点确定方法、装置和电子设备 |
CN106791794A (zh) * | 2016-12-30 | 2017-05-31 | 重庆卓美华视光电有限公司 | 一种显示设备、图像处理方法及装置 |
CN107392120B (zh) * | 2017-07-06 | 2020-04-14 | 电子科技大学 | 一种基于视线估计的注意力智能监督方法 |
CN109993030A (zh) * | 2017-12-29 | 2019-07-09 | 上海聚虹光电科技有限公司 | 基于数据统计的注视点预测模型建立方法 |
CN108874127A (zh) * | 2018-05-30 | 2018-11-23 | 北京小度信息科技有限公司 | 信息交互方法、装置、电子设备及计算机可读存储介质 |
CN109947253B (zh) * | 2019-03-25 | 2020-06-19 | 京东方科技集团股份有限公司 | 眼球追踪的模型建立方法、眼球追踪方法、设备、介质 |
WO2021029117A1 (ja) * | 2019-08-09 | 2021-02-18 | 富士フイルム株式会社 | 内視鏡装置、制御方法、制御プログラム、及び内視鏡システム |
CN112445328A (zh) * | 2019-09-03 | 2021-03-05 | 北京七鑫易维信息技术有限公司 | 映射控制方法及装置 |
CN111736698A (zh) * | 2020-06-23 | 2020-10-02 | 中国人民解放军63919部队 | 一种手动辅助定位的视线指点方法 |
CN112541400A (zh) | 2020-11-20 | 2021-03-23 | 小米科技(武汉)有限公司 | 基于视线估计的行为识别方法及装置、电子设备、存储介质 |
CN112434595A (zh) * | 2020-11-20 | 2021-03-02 | 小米科技(武汉)有限公司 | 行为识别方法及装置、电子设备、存储介质 |
CN112804504B (zh) * | 2020-12-31 | 2022-10-04 | 成都极米科技股份有限公司 | 画质调整方法、装置、投影仪及计算机可读存储介质 |
TWI768704B (zh) | 2021-02-05 | 2022-06-21 | 宏碁股份有限公司 | 計算關注焦點的方法及電腦程式產品 |
CN113627256B (zh) * | 2021-07-09 | 2023-08-18 | 武汉大学 | 基于眨眼同步及双目移动检测的伪造视频检验方法及系统 |
CN117017235A (zh) * | 2023-10-09 | 2023-11-10 | 湖南爱尔眼视光研究所 | 一种视觉认知检测方法、装置及设备 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19819961A1 (de) * | 1998-05-05 | 1999-11-11 | Dirk Kukulenz | Automatic gaze point analysis using image recognition methods for computer control |
US20050200806A1 (en) * | 2004-03-12 | 2005-09-15 | Honda Motor Co., Ltd. | Line-of-sight detection method and apparatus therefor |
CN101311882A (zh) * | 2007-05-23 | 2008-11-26 | 华为技术有限公司 | Gaze tracking human-computer interaction method and apparatus |
CN101326546A (zh) * | 2005-12-27 | 2008-12-17 | 松下电器产业株式会社 | Image processing apparatus |
JP2009042956A (ja) * | 2007-08-08 | 2009-02-26 | Hitachi Ltd | Merchandise sales apparatus, merchandise sales management system, merchandise sales management method, and program |
CN101419672A (zh) * | 2008-12-03 | 2009-04-29 | 中国科学院计算技术研究所 | Apparatus and method for synchronously acquiring face images and gaze angles |
CN101489467A (zh) * | 2006-07-14 | 2009-07-22 | 松下电器产业株式会社 | Gaze direction detection apparatus and gaze direction detection method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09251342A (ja) * | 1996-03-15 | 1997-09-22 | Toshiba Corp | Gaze location estimation apparatus and method, and information display apparatus and method using the same |
US6351273B1 (en) * | 1997-04-30 | 2002-02-26 | Jerome H. Lemelson | System and methods for controlling automatic scrolling of information on a display or screen |
JP3361980B2 (ja) * | 1997-12-12 | 2003-01-07 | 株式会社東芝 | Gaze detection apparatus and method |
AU2211799A (en) * | 1998-01-06 | 1999-07-26 | Video Mouse Group, The | Human motion following computer mouse and game controller |
JP2000089905A (ja) * | 1998-09-14 | 2000-03-31 | Sony Corp | Pointing device |
JP2008129775A (ja) * | 2006-11-20 | 2008-06-05 | Ntt Docomo Inc | Display control apparatus, display apparatus, and display control method |
2009
- 2009-09-29 CN CN200980159898.4A patent/CN102473033B/zh not_active Expired - Fee Related
- 2009-09-29 US US13/496,565 patent/US20120169596A1/en not_active Abandoned
- 2009-09-29 KR KR1020127010876A patent/KR101394719B1/ko not_active IP Right Cessation
- 2009-09-29 EP EP20090849935 patent/EP2485118A4/en not_active Withdrawn
- 2009-09-29 WO PCT/CN2009/001105 patent/WO2011038527A1/zh active Application Filing
- 2009-09-29 JP JP2012531204A patent/JP5474202B2/ja not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP2485118A4 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013210742A (ja) * | 2012-03-30 | 2013-10-10 | Fujitsu Ltd | Information processing apparatus, information processing method, and information processing program |
CN103870097A (zh) * | 2012-12-12 | 2014-06-18 | 联想(北京)有限公司 | Information processing method and electronic device |
CN103455298A (zh) * | 2013-09-06 | 2013-12-18 | 深圳市中兴移动通信有限公司 | External data display method and external data display device |
CN108986766A (zh) * | 2014-01-15 | 2018-12-11 | 麦克赛尔株式会社 | Information display terminal and information display method |
CN108986766B (zh) * | 2014-01-15 | 2021-08-17 | 麦克赛尔株式会社 | Information display terminal and information display method |
CN104461005A (zh) * | 2014-12-15 | 2015-03-25 | 东风汽车公司 | Vehicle-mounted screen switch control method |
CN104461005B (zh) * | 2014-12-15 | 2018-01-02 | 东风汽车公司 | Vehicle-mounted screen switch control method |
WO2016115872A1 (zh) * | 2015-01-21 | 2016-07-28 | 成都理想境界科技有限公司 | Binocular AR head-mounted display device and information display method thereof |
Also Published As
Publication number | Publication date |
---|---|
EP2485118A4 (en) | 2014-05-14 |
EP2485118A1 (en) | 2012-08-08 |
CN102473033A (zh) | 2012-05-23 |
KR101394719B1 (ko) | 2014-05-15 |
CN102473033B (zh) | 2015-05-27 |
KR20120080215A (ko) | 2012-07-16 |
JP2013506209A (ja) | 2013-02-21 |
US20120169596A1 (en) | 2012-07-05 |
JP5474202B2 (ja) | 2014-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011038527A1 (zh) | Gaze point detection method and device thereof | |
US9600078B2 (en) | Method and system enabling natural user interface gestures with an electronic system | |
JP5728009B2 (ja) | Instruction input device, instruction input method, program, recording medium, and integrated circuit |
US9778748B2 (en) | Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program | |
EP2950180B1 (en) | Method for determining screen display mode and terminal device | |
US11042732B2 (en) | Gesture recognition based on transformation between a coordinate system of a user and a coordinate system of a camera | |
US20120206333A1 (en) | Virtual touch apparatus and method without pointer on screen | |
US20030004678A1 (en) | System and method for providing a mobile input device | |
US9280204B2 (en) | Method and apparatus for tracking user's gaze point using mobile terminal | |
US20170316582A1 (en) | Robust Head Pose Estimation with a Depth Camera | |
JP5798183B2 (ja) | Pointing control device, integrated circuit thereof, and pointing control method |
US20150009119A1 (en) | Built-in design of camera system for imaging and gesture processing applications | |
JP2017188766A (ja) | Electronic device provided with camera, method for correcting captured video, and storage medium |
WO2014029229A1 (zh) | Display control method, apparatus and terminal | |
WO2017147748A1 (zh) | Gesture control method for wearable system, and wearable system | |
WO2019033322A1 (zh) | Handheld controller, tracking and positioning method, and system | |
JP2012238293A (ja) | Input device |
TW201430720A (zh) | Gesture recognition module and gesture recognition method | |
TWI499938B (zh) | Touch system | |
WO2018171363A1 (zh) | Position information determination method, projection device, and computer storage medium | |
JP6124862B2 (ja) | Method for taking action in response to pointing gesture, conference support system, and computer program |
JP6643825B2 (ja) | Apparatus and method |
JP2015149036A (ja) | Method for improving operation accuracy on touch screen, electronic device, and computer program |
JP6124863B2 (ja) | Method, computer, and computer program for recognizing pointing gesture position |
US11734940B2 (en) | Display apparatus, display method, and non-transitory recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980159898.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09849935 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13496565 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012531204 Country of ref document: JP |
|
ENP | Entry into the national phase |
Ref document number: 20127010876 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009849935 Country of ref document: EP |