WO2004057450A1 - Hand pointing apparatus - Google Patents

Hand pointing apparatus

Info

Publication number
WO2004057450A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
pointing
hand
cameras
location
Prior art date
Application number
PCT/EP2002/014739
Other languages
English (en)
French (fr)
Inventor
Alberto Del Bimbo
Alessandro Valli
Carlo Colombo
Original Assignee
Universita' Degli Studi Di Firenze
Priority date
Filing date
Publication date
Application filed by Universita' Degli Studi Di Firenze filed Critical Universita' Degli Studi Di Firenze
Priority to EP02796729A (EP1579304A1)
Priority to AU2002361212A (AU2002361212A1)
Priority to PCT/EP2002/014739 (WO2004057450A1)
Publication of WO2004057450A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Definitions

  • This invention refers to a hand pointing detection apparatus for determining a specific location pointed at by the user.
  • Human-machine interfaces enabling the transfer of information between the user and the system represent a field of growing importance.
  • Human-machine interfaces enable bi-directional communication; on the one side input devices allow users to send commands to the system, on the other side output devices provide users with both responses to commands and feedback about user actions.
  • Keyboard, mouse and touch screens are typical input devices, while displays, loudspeakers and printers are output devices.
  • An important drawback of the most common input devices derives from the physical contact of the user with some of their mechanical parts, which ends up wearing the device out.
  • Moreover, these kinds of input devices need to be close to the PC, making it difficult for the user to input data when distant from the computer.
  • Finally, a certain degree of training and familiarity with the device is required of the user for its efficient use.
  • Vision-based hand pointing systems appear to be particularly promising. These systems are typically based on a certain number of cameras, a video projector, a screen and a data processing system such as a personal computer. The cameras are located so as to have the user and the screen in view; the system output is displayed by the projector onto the screen, whose locations can be pointed at by the user. The presence of the screen is not strictly necessary: the user's pointing action can be detected even when it is directed at objects located in a closed space (e.g. appliances in a room) or in an open one (e.g. a landscape).
  • The present invention overcomes the above drawbacks by introducing a method and an apparatus for the detection of the hand pointing of a user based on standard, low-cost hardware equipment.
  • This method and apparatus are independent of the number of cameras used, the minimum number being two, and no constraints are set on camera placement, save that the user must be in view of at least two cameras.
  • the user is allowed to move freely while pointing and the system is independent of environmental changes and user position.
  • users of the apparatus described in the present invention are not requested to calibrate the system before interacting with it, since self-calibration at run time ensures adaptation to user characteristics such as physical dimensions and pointing style.
  • Fig. 1 is an overview of a typical embodiment of the present invention.
  • Fig. 2 shows how the location of the point P pointed at by the user is calculated as the intersection of the screen plane and the line L described by the user's pointing arm.
  • Fig. 3 is a block diagram of the algorithm followed by the data processing unit to detect the hand pointing action.
  • Fig. 4 is the flowchart of the "Background Learning" step of the algorithm.
  • Fig. 5 is the flowchart of the "Calibration" step of the algorithm.
  • Fig. 6 is the flowchart of the "User Detection" step of the algorithm.
  • Fig. 7 is the flowchart of the "Lighting Adaptation" step of the algorithm.
  • Fig. 8 is the flowchart of the "User Localization" step of the algorithm.
  • Fig. 9 is the flowchart of the "Re-mapping" step of the algorithm.
  • Fig. 10 is the flowchart of the "Selection" step of the algorithm.
  • Fig. 11 is the flowchart of the "Adaptation" step of the algorithm.
  • A preferred embodiment of the present invention is depicted in Fig. 1, where we can see the system's components:
  • a personal computer (23) that processes the data received from the cameras and turns them into interaction parameters and then into commands for its graphical interface;
  • an image projector (24) driven by the graphical interface of the personal computer;
  • the projector illuminates the screen (22) pointed at by the user (21).
  • Graphic Interface operation is based on both spatial and temporal analysis of user action.
  • The screen location P currently pointed at by the user is continuously evaluated as the intersection of the pointing direction with the screen plane. From each acquisition of the system's cameras, the positions of the head and of the pointing arm of the user are detected and input to the next processing phase, which is based on a stereo triangulation algorithm.
  • The system also monitors persistency: when point P remains within a limited portion of the screen for an appropriate amount of time, a discrete event similar to a mouse click, i.e. a selection action, is generated for the interface.
  • The overall interaction system behavior is that of a one-button mouse, whose "drags" and "clicks" reflect respectively changes and fixations of interest, as communicated by the user through his natural hand pointing actions.
  • The operation of the hand pointing system described in the present invention can be sketched as in Fig. 3. After the cameras have acquired the images of the user, said images are transferred to the PC, which processes them following three distinct operational steps: Initialization (200), Feature Extraction (201) and Runtime (202). Feature Extraction is used by both the other two phases, since it is the procedure that determines where, in the images, the head and the arm of the user are located.
  • The Initialization is composed of two sub-steps: a phase of Background Learning (A) and a phase of Calibration (B).
  • the Background Learning is described in Fig. 4.
  • a number N of frames are chosen for background modeling.
  • the N frames acquired by the cameras are input to the PC (100) and then, for each chromatic channel, the mean value and the variance are calculated at each pixel (101).
  • in this way the mean value and the variance of the three color channels at each pixel of the background images are obtained (103).
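  • As a rough illustration of the Background Learning step, the per-pixel, per-channel statistics can be accumulated as in the sketch below (the function name, the use of numpy and the HxWx3 RGB frame layout are assumptions made for the example, not details given in the patent):

```python
import numpy as np

def learn_background(frames):
    """Per-pixel, per-channel background statistics from N frames of the empty scene.

    frames: iterable of HxWx3 uint8 images grabbed while no user is in view.
    Returns (mean, variance), both HxWx3 float64 arrays.
    """
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)  # N x H x W x 3
    mean = stack.mean(axis=0)  # mean of each color channel at each pixel (101)
    var = stack.var(axis=0)    # variance of each color channel at each pixel (101)
    return mean, var
```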
  • the Initialization phase proceeds with the Calibration step (Fig. 3 - B) that will be described later on.
  • the next operational step is called Feature Extraction (201); at its end the system will have acquired the information regarding the possible presence of a user in the cameras' field of view and his possible pointing action.
  • the Feature Extraction starts with the phase of User Detection (Fig. 6).
  • the current frame is acquired by the cameras (100), then the background image previously learned is subtracted from the acquired frame (104).
  • the resulting difference image (current frame minus background image) is then normalized by the pixel variance (105).
  • the calculated value is then compared to an appropriate threshold value X to decide if the pixel under consideration belongs to the background (calculated difference is less than X) or to the foreground (calculated difference is greater than X).
  • the system then updates its parameters based on the light level of the current frame acquired by the cameras (Lighting Adaptation, Fig. 7).
  • the statistics of the background pixels are thus recalculated in terms of mean value and variance (107).
  • the system updates the thresholds used for the image binarization during the previous step.
  • the number of isolated foreground pixels is computed (108), in order to estimate the noise level of the CCD (charge coupled device) sensor of the camera and consequently update the threshold values (109) used to binarize the image, so as to dynamically adjust the system sensitivity.
  • the updated parameters will be used by the system to binarize the next acquired frame; the binary mask computed at the current acquisition cycle is then refined by topological filters (Fig. 6 - 110).
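  • The detection and lighting-adaptation loop can be pictured with the following sketch; the variance-normalized distance, the 8-neighbour test for isolated pixels and the multiplicative update of the threshold X are plausible choices made for illustration only, not rules stated in the patent:

```python
import numpy as np

def foreground_mask(frame, bg_mean, bg_var, threshold):
    """Binarize a frame against the learned background model.

    frame: HxWx3 uint8 image; bg_mean, bg_var: HxWx3 float arrays from learning.
    A pixel is foreground when its variance-normalized color distance from the
    background exceeds the threshold X, background otherwise.
    """
    diff = frame.astype(np.float64) - bg_mean
    dist = np.sum(diff * diff / (bg_var + 1e-6), axis=2)  # per-pixel distance
    return dist > threshold                               # True = foreground

def isolated_pixel_count(mask):
    """Count foreground pixels with no foreground pixel among their 8 neighbours
    (a rough estimate of the CCD noise level, step 108)."""
    m = mask.astype(np.int32)
    padded = np.pad(m, 1, mode="constant")
    h, w = m.shape
    neighbours = sum(
        padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)
    )
    return int(np.count_nonzero(mask & (neighbours == 0)))

def adapt_threshold(threshold, mask, target_noise=50, gain=0.05):
    """Nudge the binarization threshold so the noise level stays near a target,
    dynamically adjusting the system sensitivity (step 109)."""
    if isolated_pixel_count(mask) > target_noise:
        return threshold * (1.0 + gain)   # too much speckle: be less sensitive
    return threshold * (1.0 - gain)       # little speckle: be more sensitive
```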
  • the presence of a user is then classified by shape and detected (112).
  • the User Localization step (Fig. 8) is then started and carried out through two different, parallel processes.
  • in the first, shape-based process, the silhouette of the user is analysed to estimate the positions of the user's head and arm (115), through the use of the binary mask previously computed and of geometrical heuristics.
  • in the second, color-based process, the system analyses the color of the detected user shape to determine the zones of exposed skin within it. This process runs through several sub-steps: first, the foreground is split up into skin and non-skin parts (116) by applying the binary mask previously computed and a skin color model to the image acquired by the cameras.
  • the detected skin parts are then aggregated into connected blobs (117) and the user's head and arm are again identified by means of geometrical heuristics (115).
  • the results of the above-described estimation are filtered by a smoothing filter (118) and a predictor (119) to reach the final estimate of the color-based user localization step.
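  • A minimal sketch of this color-based branch, assuming a crude rule-of-thumb skin model in RGB and scipy's connected-component labelling (neither of which is mandated by the patent), could look as follows:

```python
import numpy as np
from scipy import ndimage

def skin_blobs(frame, mask, min_area=80):
    """Split the foreground into skin and non-skin parts (116) and aggregate the
    skin pixels into connected blobs (117).

    frame: HxWx3 uint8 RGB image; mask: HxW boolean foreground mask.
    The crude rule "red clearly dominates green and blue" stands in for the skin
    color model; any calibrated model could be plugged in instead.
    Returns a list of (centroid_row, centroid_col, area) tuples, largest first.
    """
    rgb = frame.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    skin = mask & (r > 95) & (g > 40) & (b > 20) & (r > g + 15) & (r > b + 15)

    labels, n = ndimage.label(skin)  # aggregate skin pixels into connected blobs
    blobs = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)
        if ys.size >= min_area:
            blobs.append((ys.mean(), xs.mean(), ys.size))
    return sorted(blobs, key=lambda blob: -blob[2])
```
  • For instance, the two largest blobs could be taken as candidate face and pointing hand, to be confirmed by the geometrical heuristics mentioned above.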
  • the shape-based estimate and the color-based estimate are then combined (120) and the coefficients of the image line in each acquired image are finally determined (121), where the image line is the line ideally connecting the head and the hand of the user and represents the pointing direction.
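  • In homogeneous coordinates the image line of step (121) is simply the cross product of the head and hand image points; the fixed 50/50 blending of the shape-based and color-based estimates below is an illustrative assumption, since the patent does not specify how the two estimates are weighted:

```python
import numpy as np

def pointing_image_line(head_shape, hand_shape, head_color, hand_color, w=0.5):
    """Combine the shape-based and color-based localizations (120) and return the
    coefficients (a, b, c) of the image line a*x + b*y + c = 0 joining head and hand (121)."""
    head = w * np.asarray(head_shape, float) + (1 - w) * np.asarray(head_color, float)
    hand = w * np.asarray(hand_shape, float) + (1 - w) * np.asarray(hand_color, float)
    line = np.cross(np.append(head, 1.0), np.append(hand, 1.0))  # line through the two points
    return line / np.linalg.norm(line[:2])                       # normalize (a, b) to unit length
```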
  • the next step, once the pointing direction for every single frame is determined, is called runtime processing (Fig. 3 - 202).
  • the first sub-step of this phase is called Remapping (Fig. 9).
  • the system described in the present invention determines, in the way described above, as many image lines as there are cameras employed (li,dx; li,sx). Each of these lines, together with the center of the camera it comes from (Cdx, Csx), determines a plane in the real 3D space (πp,dx; πp,sx). Each of these planes determines in turn a screen line (lp,dx; lp,sx) as its intersection with the plane of the screen (Π) pointed at by the user. The point to be determined is thus the intersection P of these screen lines.
  • the remapping phase starts with the computation of the screen lines described above (122), one per iteration (123). Once all the screen lines are determined, the location the user is pointing at is computed as the pseudo-intersection of the screen lines (124).
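  • One concrete reading of the re-mapping geometry is sketched below: each image line is back-projected through its camera's 3x4 projection matrix into a 3D plane, that plane is intersected with the screen plane (parameterized here by an origin and two in-plane axes), and the resulting screen lines are pseudo-intersected in the least-squares sense. The projection matrices and the screen parameterization are assumptions introduced for the sketch; the patent does not spell out this particular formulation.

```python
import numpy as np

def screen_line(P_cam, image_line, screen_origin, screen_u, screen_v):
    """Back-project an image line through one camera and intersect the resulting
    3D plane with the screen plane, expressed in screen coordinates (x, y).

    P_cam: 3x4 camera projection matrix; image_line: (a, b, c) in that camera's image;
    the screen plane is parameterized as screen_origin + x*screen_u + y*screen_v.
    Returns (A, B, C) such that A*x + B*y + C = 0 on the screen.
    """
    plane = P_cam.T @ np.asarray(image_line, float)  # back-projected plane [n | d]
    n, d = plane[:3], plane[3]
    return np.array([n @ screen_u, n @ screen_v, n @ screen_origin + d])

def pseudo_intersection(screen_lines):
    """Least-squares point closest to all screen lines: the pointed location P (124)."""
    L = np.asarray(screen_lines, float)
    L = L / np.linalg.norm(L[:, :2], axis=1, keepdims=True)  # unit line normals
    A, c = L[:, :2], -L[:, 2]
    point, *_ = np.linalg.lstsq(A, c, rcond=None)
    return point  # (x, y) in screen coordinates
```
  • With exactly two cameras the pseudo-intersection reduces to the ordinary intersection of the two screen lines; with more cameras it averages out their disagreement.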
  • After remapping, the system enters the phase of Selection (Fig. 3 - G). With reference to Fig. 10, the screen point detected at the end of the previous phase is recorded (125); its position is then periodically checked against a certain radius R (126, 127, 128 and 129) to determine whether the point maintains the same position for a time long enough to indicate a deliberate pointing action, in which case the system performs a "clicking" action in response to the persisting pointing by the user.
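  • The persistency check can be pictured as a small dwell detector such as the one below; the radius, the dwell time and the re-arming behaviour after a click are illustrative choices, not values given in the patent:

```python
import time
import numpy as np

class DwellSelector:
    """Emit a discrete selection ("click") when the pointed screen location stays
    inside a circle of radius R for at least dwell_time seconds."""

    def __init__(self, radius=30.0, dwell_time=1.5):
        self.radius = radius          # R, in screen units (e.g. pixels)
        self.dwell_time = dwell_time  # seconds of persistency required
        self.anchor = None            # centre of the current dwell candidate
        self.since = None             # time the candidate was first seen

    def update(self, point, now=None):
        now = time.monotonic() if now is None else now
        point = np.asarray(point, float)
        if self.anchor is None or np.linalg.norm(point - self.anchor) > self.radius:
            self.anchor, self.since = point, now  # pointer moved: restart the dwell
            return False
        if now - self.since >= self.dwell_time:
            self.since = now                      # fire once, then re-arm
            return True                           # behave like a mouse click
        return False
```
  • A typical use would be one call per processed frame, e.g. `if selector.update(P): handle_click(P)`, where `handle_click` stands for whatever selection handler the graphical interface exposes.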
  • the current screen point (130) represents the input datum for the following phase of Adaptation (Fig. 11), by which the system is trained to work with different users.
  • the system calibration parameters are recomputed by optimization (132).
  • the Calibration phase is displayed in detail in Fig. 5.
  • the PC drives the projector to show on the screen the calibration points (133) that have to be pointed at by the user; the image line coefficients coming from the User Localization phase are then recorded (134), and these steps are repeated for each of the K points chosen for the calibration (135).
  • a new set of optimised system calibration parameters is estimated (136).
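  • One way to picture the estimation of the optimized calibration parameters (136) is a least-squares fit mapping the raw screen estimates collected while the user points at the K targets onto the known target positions; the homography parameterization below is an assumption made for the sketch, not necessarily the parameterization used by the system:

```python
import numpy as np

def fit_correction_homography(estimated, target):
    """Least-squares 2D homography (plain DLT) mapping raw screen estimates onto
    the K known calibration targets; needs K >= 4 point pairs."""
    est = np.asarray(estimated, float)  # K x 2 points the pipeline reported
    tgt = np.asarray(target, float)     # K x 2 points the user was asked to point at
    rows = []
    for (x, y), (u, v) in zip(est, tgt):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, float))
    return vt[-1].reshape(3, 3)         # homography H, defined up to scale

def apply_correction(H, point):
    """Map a raw screen estimate through the fitted correction."""
    p = H @ np.array([point[0], point[1], 1.0])
    return p[:2] / p[2]
```
  • The Adaptation phase described above could then refit this correction at run time as new point/target pairs accumulate, which is what allows the system to adjust to each user's physical dimensions and pointing style.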
  • the above-described system can be implemented with any kind of actuator and any kind of interface driven by the data processing unit.
  • the present invention can be applied to home automation systems where the target of the user's pointing action might be a set of appliances and the computer interface might simply be a control board for switching the appliances on and off.
  • the target of the user's pointing action could be represented by the landscape in front of the user.
  • the computer interface, in this case, can be just a driver for an audio playback system providing, for example, information regarding the monuments and locations pointed at by the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP02796729A EP1579304A1 (de) 2002-12-23 2002-12-23 Handzeigegerät (Hand pointing apparatus)
AU2002361212A AU2002361212A1 (en) 2002-12-23 2002-12-23 Hand pointing apparatus
PCT/EP2002/014739 WO2004057450A1 (en) 2002-12-23 2002-12-23 Hand pointing apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2002/014739 WO2004057450A1 (en) 2002-12-23 2002-12-23 Hand pointing apparatus

Publications (1)

Publication Number Publication Date
WO2004057450A1 (en) 2004-07-08

Family

ID=32668686

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2002/014739 WO2004057450A1 (en) 2002-12-23 2002-12-23 Hand pointing apparatus

Country Status (3)

Country Link
EP (1) EP1579304A1 (de)
AU (1) AU2002361212A1 (de)
WO (1) WO2004057450A1 (de)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6442416B1 (en) * 1993-04-22 2002-08-27 Image Guided Technologies, Inc. Determination of the position and orientation of at least one object in space
US6101268A (en) * 1996-04-22 2000-08-08 Gilliland; Malcolm T. Method and apparatus for determining the configuration of a workpiece
US6198485B1 (en) * 1998-07-29 2001-03-06 Intel Corporation Method and apparatus for three-dimensional input entry
US6147678A (en) * 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JENNINGS C: "Robust finger tracking with multiple cameras", RECOGNITION, ANALYSIS, AND TRACKING OF FACES AND GESTURES IN REAL-TIME SYSTEMS, 1999. PROCEEDINGS. INTERNATIONAL WORKSHOP ON CORFU, GREECE 26-27 SEPT. 1999, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 26 September 1999 (1999-09-26), pages 152 - 160, XP002229564, ISBN: 0-7695-0378-0 *
JOJIC N ET AL: "Detection and estimation of pointing gestures in dense disparity maps", AUTOMATIC FACE AND GESTURE RECOGNITION, 2000. PROCEEDINGS. FOURTH IEEE INTERNATIONAL CONFERENCE ON GRENOBLE, FRANCE 28-30 MARCH 2000, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 28 March 2000 (2000-03-28), pages 468 - 475, XP010378301, ISBN: 0-7695-0580-5 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013206569A1 (de) * 2013-04-12 2014-10-16 Siemens Aktiengesellschaft Gestensteuerung mit automatisierter Kalibrierung
US9880670B2 (en) 2013-04-12 2018-01-30 Siemens Aktiengesellschaft Gesture control having automated calibration
DE102013206569B4 (de) 2013-04-12 2020-08-06 Siemens Healthcare Gmbh Gestensteuerung mit automatisierter Kalibrierung

Also Published As

Publication number Publication date
EP1579304A1 (de) 2005-09-28
AU2002361212A1 (en) 2004-07-14

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2002796729

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2002796729

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP