WO2008126069A2 - An apparatus system and method for human-machine-interface - Google Patents

An apparatus system and method for human-machine-interface Download PDF

Info

Publication number
WO2008126069A2
Authority
WO
WIPO (PCT)
Prior art keywords
module
present
machine interface
segmented data
dimensional
Prior art date
Application number
PCT/IL2007/000475
Other languages
English (en)
French (fr)
Other versions
WO2008126069A3 (en)
Inventor
Dor Givon
Original Assignee
Xtr - Extreme Reality
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xtr - Extreme Reality filed Critical Xtr - Extreme Reality
Priority to US12/517,210 priority Critical patent/US8432390B2/en
Priority to JP2010502633A priority patent/JP5147933B2/ja
Priority to KR1020097023103A priority patent/KR101379074B1/ko
Priority to CA2684020A priority patent/CA2684020C/en
Priority to EP07736215A priority patent/EP2147393A4/en
Priority to PCT/IL2007/000475 priority patent/WO2008126069A2/en
Publication of WO2008126069A2 publication Critical patent/WO2008126069A2/en
Publication of WO2008126069A3 publication Critical patent/WO2008126069A3/en
Priority to IL201514A priority patent/IL201514A/he
Priority to US13/355,643 priority patent/US8928654B2/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103Static body considered as a whole, e.g. static pedestrian or occupant recognition

Definitions

  • the present invention generally relates to user interfaces and more particularly to methods and systems of 3D Human-Machine-Interface.
  • the job would generate a printout, containing final results or (all too often) an abort notice with an attached error log. Successful runs might also write a result on magnetic tape or generate some data cards to be used in later computation.
  • the turnaround time for a single job often spanned entire days. If one were very lucky, it might be hours; real-time response was unheard of. But there were worse fates than the card queue; some computers actually required an even more tedious and error-prone process of toggling in programs in binary code using console switches. The very earliest machines actually had to be partly rewired to incorporate program logic into themselves, using devices known as plugboards.
  • Command-line interfaces were closely associated with the rise of timesharing computers.
  • the concept of timesharing dates back to the 1950s; the most influential early experiment was the MULTICS operating system after 1965; and by far the most influential of present-day command-line interfaces is that of Unix itself, which dates from 1969 and has exerted a shaping influence on most of what came after it.
  • VDTs (video-display terminals)
  • the PDP-1 console display had been descended from the radar display tubes of World War II, twenty years earlier, reflecting the fact that some key pioneers of minicomputing at MIT's Lincoln Labs were former radar technicians. Across the continent in that same year of 1962, another former radar technician was beginning to blaze a different trail at Stanford Research Institute. His name was Doug Engelbart. He had been inspired by both his personal experiences with these very early graphical displays and by Vannevar Bush's seminal essay As We May Think, which had presented in 1945 a vision of what we would today call hypertext.
  • VR can confuse the human proprioceptive system; VR motion at even moderate speeds can induce dizziness and nausea as the brain tries to reconcile the visual simulation of motion with the inner ear's report of the body's real-world motions.
  • Jef Raskin's THE project (The Humane Environment) is exploring the zoom world model of GUIs, which spatializes them without going 3D.
  • In THE, the screen becomes a window on a 2-D virtual world where data and programs are organized by spatial locality.
  • Objects in the world can be presented at several levels of detail depending on one's height above the reference plane, and the most basic selection operation is to zoom in and land on them.
  • 3D human machine interface may include (1) an image acquisition assembly, (2) an initializing module, (3) an image segmentation module, (4) a segmented data processing module, (5) a scoring module, (6) a projection module, (7) a fitting module, (8) a scoring and error detection module, (9) a recovery module, (10) a three dimensional correlation module, (11) a three dimensional skeleton prediction module, and (12) an output module.
  • 3D HMI: 3D human machine interface
  • 3D HMI may include (1) an image acquisition assembly, (2) an initializing module, (3) an image segmentation module, (4) a segmented data processing module, (5) a scoring module, (6) a projection module, (7) a fitting module, (8) a scoring and error detection module, (9) a recovery module, (10) a three dimensional correlation module, (11) a three dimensional skeleton prediction module, and (12) an output module.
  • the image acquisition assembly may be adapted to acquire a set of images, wherein substantially each image is associated with a different point in time.
  • the images may be of a single user or multiple users.
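As an illustration of this acquisition step, here is a minimal sketch (not from the patent) of a time-stamped frame grabber standing in for the image acquisition assembly; OpenCV and a default webcam are assumptions of mine, not requirements of the source.

```python
# Illustrative sketch only: grab a set of frames, each tagged with the
# point in time at which it was acquired.
import time
import cv2

def acquire_frames(n_frames=30, device=0):
    """Return a list of (timestamp, image) pairs, one per point in time."""
    cap = cv2.VideoCapture(device)
    frames = []
    try:
        while len(frames) < n_frames:
            ok, img = cap.read()
            if not ok:
                break  # camera unavailable or stream ended
            frames.append((time.time(), img))
    finally:
        cap.release()
    return frames
```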
  • the initialization module may be adapted to detect and define the user's (1) colors, (2) organs' parameters, (3) surroundings, and other parameters which are associated with the user.
  • the user may be any person and/or animal and/or moving object which enters the frame.
  • the image segmentation module may be adapted to extract segmented data from the image.
  • the segmented data may also comprise: Color
  • the segmented data processing module may be adapted to process the segmented data.
  • the segmented data may be processed in the following way:
  • Color - using known color parameters to detect elements and/or light changes; for example, use skin color to detect the palms and face.
  • Edge detection - detect the edges of the image (see the sketch below).
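A minimal sketch of these two processing steps, assuming OpenCV; the HSV skin-color bounds are illustrative guesses, since the patent names the cue (skin color, edges) but gives no thresholds.

```python
import cv2
import numpy as np

SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)     # assumed HSV lower bound
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)  # assumed HSV upper bound

def segment_frame(img_bgr):
    """Return a skin-color mask (palm/face candidates) and an edge map."""
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    skin_mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)   # color-based segmentation
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                 # edges of the image
    return skin_mask, edges
```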
  • the segmented data processing module may be adapted to detect deviation in the distance of an organ from the image acquisition assembly in accordance with the deviation of the organ's relative size.
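This size-to-distance relation follows from the pinhole model: apparent size scales inversely with distance. A small sketch, with the reference values assumed to come from initialization:

```python
def estimate_distance(ref_distance_m, ref_size_px, size_px):
    """ref_* come from initialization; size_px is the current measurement."""
    if size_px <= 0:
        raise ValueError("organ not visible")
    return ref_distance_m * (ref_size_px / size_px)

# Example: a palm that measured 80 px at 1.0 m now measures 40 px,
# so its estimated distance from the camera is about 2.0 m.
assert abs(estimate_distance(1.0, 80, 40) - 2.0) < 1e-9
```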
  • the scoring module may be adapted to (1) examine the processed segmented data, (2) estimate the quality of the processed segmented data, and according to the quality (3) decide which portions of the segmented data are reliable enough to be used by the HMI system.
  • the three dimensional skeleton prediction module may be adapted to predict the position of the three dimensional skeleton which will have the best match or correlation with the processed image.
  • the three dimensional prediction module may use constraints which derive from the type of skeleton used; for example, if the skeleton is of a human figure, the head of the skeleton cannot rotate 360 degrees.
  • the three dimensional prediction module may also use a set of dynamic and motion processes to predict the position of the three dimensional skeleton (see the sketch below).
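A sketch of how such a constrained prediction might look: a constant-velocity guess from motion history, clamped to per-joint limits. The joint set and limit values are assumptions for illustration, not values from the patent.

```python
import numpy as np

JOINT_LIMITS = {"head_yaw": (-1.4, 1.4), "elbow": (0.0, 2.6)}  # radians, assumed

def predict_angles(prev, curr):
    """prev/curr map joint name -> angle at the two most recent frames."""
    pred = {}
    for joint, angle in curr.items():
        velocity = angle - prev[joint]      # simple constant-velocity model
        lo, hi = JOINT_LIMITS[joint]
        pred[joint] = float(np.clip(angle + velocity, lo, hi))
    return pred

# A head already near its yaw limit cannot be predicted past it:
print(predict_angles({"head_yaw": 1.0, "elbow": 0.5},
                     {"head_yaw": 1.3, "elbow": 0.6}))
```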
  • the projection module may be adapted to project the skeleton onto the image.
  • the projection may be applied in the two-dimensional plane.
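For the projection step, a pinhole-camera sketch; the intrinsic parameters below are assumed, as the patent does not specify a camera model.

```python
import numpy as np

FX = FY = 500.0          # assumed focal lengths (pixels)
CX, CY = 320.0, 240.0    # assumed principal point for a VGA image

def project_skeleton(joints_xyz):
    """joints_xyz: (N, 3) joints in camera coordinates, z > 0 (meters).
    Returns (N, 2) pixel coordinates on the two-dimensional image plane."""
    joints_xyz = np.asarray(joints_xyz, dtype=float)
    u = FX * joints_xyz[:, 0] / joints_xyz[:, 2] + CX
    v = FY * joints_xyz[:, 1] / joints_xyz[:, 2] + CY
    return np.stack([u, v], axis=1)
```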
  • the fitting module may be adapted to fit segmented data to the projected skeleton.
  • the fitting module may be adapted to associate portions of the segmented data with portions of the projected skeleton.
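One plausible reading of this association step is nearest-neighbor assignment of segmented pixels to projected joints; a sketch:

```python
import numpy as np

def associate(segmented_px, projected_joints):
    """segmented_px: (M, 2) pixel coordinates; projected_joints: (N, 2).
    Returns an (M,) array giving the index of the nearest joint per pixel."""
    segmented_px = np.asarray(segmented_px, dtype=float)
    projected_joints = np.asarray(projected_joints, dtype=float)
    # (M, N) matrix of pixel-to-joint distances, then nearest joint per pixel
    d = np.linalg.norm(segmented_px[:, None, :] - projected_joints[None, :, :],
                       axis=2)
    return d.argmin(axis=1)
```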
  • the scoring and error detection module may be adapted (1) to examine the processed skeleton after it has been associated with segmented data, (2) to evaluate the fitting quality of said skeleton, and (3) to determine whether an error has occurred during the skeleton prediction process or the association of segmented data (one possible score is sketched below).
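A sketch of one possible fitting score: the fraction of associated pixels lying near their joint. The radius and error threshold are assumed values, as the patent describes scoring only abstractly.

```python
import numpy as np

def fit_score(segmented_px, projected_joints, labels, radius=15.0):
    """Fraction of segmented pixels within `radius` px of their joint."""
    segmented_px = np.asarray(segmented_px, dtype=float)
    projected_joints = np.asarray(projected_joints, dtype=float)
    d = np.linalg.norm(segmented_px - projected_joints[labels], axis=1)
    return float((d < radius).mean())

def has_error(score, min_score=0.6):
    # Below this assumed threshold, control passes to the recovery module.
    return score < min_score
```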
  • the recovery module may be adapted to recover from a detected error.
  • the recovery may be a process of multiple processing layers: re-segmenting the image, using the 3D skeleton motion history to re-predict the correct position, and re-projecting and re-fitting the 3D skeleton.
  • the recovery module may also decide to skip a frame if the image information is corrupt.
  • the three dimensional correlation module may be adapted to update the position of the three dimensional skeleton in accordance with the position of the fitted skeleton.
  • said updating process associates the 3D skeleton with the fitted skeleton, fits the 3D skeleton to the fitted skeleton, and updates the 3D skeleton to the correct position.
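One way such an update could be realized is a blend of the predicted skeleton toward the fitted one; the blend factor is an assumed smoothing choice, not taken from the source.

```python
import numpy as np

def update_skeleton(predicted_xyz, fitted_xyz, alpha=0.7):
    """Move each 3D joint a fraction alpha of the way to its fitted pose."""
    predicted_xyz = np.asarray(predicted_xyz, dtype=float)
    fitted_xyz = np.asarray(fitted_xyz, dtype=float)
    return (1.0 - alpha) * predicted_xyz + alpha * fitted_xyz
```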
  • In FIG. 1 there is shown a block diagram depicting a system in accordance with some embodiments of the present invention.
  • In FIG. 2 there is shown a flow-chart depicting the steps of an HMI system in accordance with some embodiments of the present invention.
  • In FIG. 3 there is shown a block diagram depicting a system in accordance with some embodiments of the present invention.
  • In FIG. 4 there is shown a flow-chart depicting the steps of an HMI system in accordance with some embodiments of the present invention.
  • In FIG. 5 there is shown a block diagram depicting a system in accordance with some embodiments of the present invention.
  • In FIG. 6 there is shown a flow-chart depicting the steps of an HMI system in accordance with some embodiments of the present invention.
  • Embodiments of the present invention may include apparatuses for performing the operations herein. This apparatus may be specially constructed for the desired purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read-only memories (EEPROMs), magnetic or optical cards, or any other type of media suitable for storing electronic instructions, and capable of being coupled to a computer system bus.
  • 3D human machine interface may include (1) an image acquisition assembly, (2) an initializing module, (3) an image segmentation module, (4) a segmented data processing module, (5) a scoring module, (6) a projection module, (7) a fitting module, (8) a scoring and error detection module, (9) a recovery module, (10) a three dimensional correlation module, (11) a three dimensional skeleton prediction module, and (12) an output module.
  • 3D HMI: 3D human machine interface
  • 3D HMI may include (1) an image acquisition assembly, (2) an initializing module, (3) an image segmentation module, (4) a segmented data processing module, (5) a scoring module, (6) a projection module, (7) a fitting module, (8) a scoring and error detection module, (9) a recovery module, (10) a three dimensional correlation module, (11) a three dimensional skeleton prediction module, and (12) an output module.
  • the image acquisition assembly may be adapted to acquire a set of images, wherein substantially each image is associated with a different point in time.
  • the images may be of a single user or multiple users.
  • the initialization module may be adapted to detect and define the user's (1) colors, (2) organs' parameters, (3) surroundings, and other parameters which are associated with the user, and to decide on the best way to perform image segmentation in the next steps (thresholds, a score for every image segmentation, etc.).
  • the image segmentation module may be adapted to extract segmented data from the image.
  • the segmented data may also comprise:
  • the segmented data processing module may be adapted to process the segmented data.
  • the segmented data may be processed in the following way:
  • Color - using known color parameters to detect elements and/or light changes; for example, use skin color to detect the palms and face.
  • Edge detection - detect the edges of the image.
  • the scoring module may be adapted to (1) examine the processed segmented data, (2) estimate the quality of the processed segmented data, and according to the quality (3) decide which portions of the segmented data are reliable enough to be used by the HMI system.
  • the three dimensional skeleton prediction module may be adapted to predict the position of the three dimensional skeleton which will have the best match or correlation with the processed image.
  • the three dimensional prediction module may use constraints which derive from the type of skeleton used; for example, if the skeleton is of a human figure, the head of the skeleton cannot rotate 360 degrees.
  • the three dimensional prediction module may use a set of dynamic and motion processes to predict the position of the three dimensional skeleton.
  • the projection module may be adapted to project the skeleton onto the image.
  • the projection may be applied in the two-dimensional plane.
  • the fitting module may be adapted to fit segmented data to the projected skeleton.
  • the fitting module may be adapted to associate portions of the segmented data with portions of the projected skeleton.
  • the scoring and error detection module may be adapted (1) to examine the processed skeleton after it has been associated with segmented data, (2) to evaluate the fitting quality of said skeleton, and (3) to determine whether an error has occurred during the skeleton prediction process or the association of segmented data.
  • the recovery module may be adapted to recover from a detected error.
  • the recovery may be a process of multiple processing layers: re-segmenting the image, using the 3D skeleton motion history to re-predict the correct position, and re-projecting and re-fitting the 3D skeleton. The recovery module may also decide to skip a frame if the image information is corrupt.
  • the three dimensional correlation module may be adapted to update the position of the three dimensional skeleton in accordance with the position of the fitted skeleton.
  • said updating process associates the 3D skeleton with the fitted skeleton, fits the 3D skeleton to the fitted skeleton, and updates the 3D skeleton to the correct position.
  • In FIG. 1 there is shown an exemplary HMI system in accordance with some embodiments of the present invention, which system may be best described in conjunction with Figure 2, in which there is shown a flow chart depicting the steps of such an HMI system.
  • Figure 1 shows a 3D human machine interface ("3D HMI"), which 3D HMI may include (1) an image acquisition assembly 1000, (2) an initializing module 1100, (3) an image segmentation module 1200, (4) a segmented data processing module 1300, (5) a scoring module 1400, (6) a projection module 1450, (7) a fitting module 1500, (8) a scoring and error detection module 1550, (9) a recovery module 1600, (10) a three dimensional correlation module 1700, (11) a three dimensional skeleton prediction module 1800, and (12) an output module 1900.
  • 3D HMI: 3D human machine interface
  • the image acquisition assembly may be adapted to acquire a set of images, as seen in step 2000, wherein substantially each image is associated with a different point in time.
  • the images may be of a single user or multiple users.
  • the image acquisition assembly may comprise a digital camera, a web camera, a film camera, a video camera, a digital video camera, or an analogue camera.
  • the system may enter an initialization phase, step 2100, which is performed by the initialization module 1100.
  • which initialization module may be adapted to detect and define the user's (1) colors, (2) organs' parameters, (3) surroundings, and other parameters which are associated with the user.
  • the system may be adapted to extract segmentation data, as shown in step 2200, which segmented data may comprise:
  • the image segmentation module 1200 may be adapted to extract the segmented data from the image.
  • the system may be adapted to process the segmented data, as shown in step 2300.
  • the segmented data may be processed in the following way:
  • Color - using known color parameters to detect elements and/or light changes; for example, use skin color to detect the palms and face.
  • Edge detection - detect the edges in the image.
  • the segmented data processing module 1300 may be adapted to process the segmented data.
  • the system may be adapted to evaluate the quality of the segmented data, as shown in step 2400; the evaluation is performed by (1) examining the processed segmented data, (2) estimating the quality of the processed segmented data, and, according to the estimated quality, (3) deciding which portions of the segmented data are reliable enough to be used by the HMI system.
  • the scoring module 1400 may be adapted to evaluate the quality of the segmented information.
  • the system may be adapted to predict the position of the three dimensional skeleton, as shown in step 2800, which position will have the best match or correlation with the processed image.
  • the prediction may be more accurate with the use of constraints which derive from the type of skeleton used; for example, if the skeleton is of a human figure, the head of the skeleton cannot rotate 360 degrees without a motion of the shoulders.
  • the prediction sequence may also use a set of dynamic and motion processes, and so on.
  • the three dimensional skeleton prediction module 1800 may be adapted to predict the position of the three dimensional skeleton.
  • the system may be further adapted to project the skeleton onto the image, as shown in step 2450.
  • the projection may be applied in the two-dimensional plane.
  • the projection module 1450 may be adapted to project the skeleton onto the image.
  • the system may be further adapted to fit the segmented data with the projected skeleton, as shown in step 2500.
  • the fitting process may comprise the association of portions of the segmented data with portions of the projected skeleton.
  • fitting the segmented data may comprise associating portions of the extracted segmented data with current skeleton parameters, which current skeleton parameters may support the associated portions of extracted segmented data.
  • the outcome of this process is a "fitted skeleton".
  • the fitting module 1500 may be adapted to associate the segmented data with the projected skeleton.
  • the system may be further adapted to give a score to the fitted skeleton and detect errors, as shown in step 2550.
  • giving a score and detecting errors may comprise (1) examining the fitted skeleton, (2) evaluating the fitting quality of said skeleton and (3) determining whether an error has occurred during the skeleton prediction process or the association of segmented data.
  • the scoring and error detection module 1550 may be adapted to give a score and detect errors.
  • the system may enter a recovery phase, as shown in step 2600.
  • the recovery process may be a process of multiple processing layers.
  • the recovery phase may comprise re-segmenting the image, re-predicting the 3D skeleton position, re-projecting and re-fitting the skeleton using extended effort.
  • the recovery module may also decide to skip a frame or more if the image information is corrupt.
  • the system may be adapted to detect that the object it is tracking is not in the frame.
  • the system may be adapted to skip one or more frames until the object is back in the frame.
  • the recovery phase may direct the system back to the initialization step.
  • the recovery module 1600 may be adapted to perform the recovery process.
  • the system may be adapted to update the position of the three dimensional skeleton in accordance with the position of the fitted skeleton, as shown in step 2700.
  • the updating process may comprise (1) projecting the 3D skeleton on the fitted skeleton, (2) associating the 3D skeleton with the fitted skeleton, and (3) updating the position of the 3D skeleton.
  • the three dimensional correlation module 1700 may be adapted to update the position of the three dimensional skeleton.
  • the three-dimensional correlation module 1700 and the skeleton prediction module 1800 may use some or all of the algorithms and processes which were disclosed in PCT application serial number PCT/IL2005/000813, filed on 31 July 2005 under the same assignee as the present application.
  • In FIG. 3 there is shown an exemplary HMI system in accordance with some embodiments of the present invention, which system may be best described in conjunction with Figure 4, in which there is shown a flow chart depicting the steps of such an HMI system.
  • Figure 3 shows a 3D human machine interface ("3D HMI"), which 3D HMI may include (1) a Zlens image acquisition assembly 3000, (2) an initializing module 3100, (3) an image segmentation module 3200, (4) a segmented data processing module 3300, (5) a fitting module 3500, (6) a scoring module 3550, (7) a three dimensional correlation module 3700, and (8) an output module 3900.
  • the Zlens acquisition assembly may be adapted to acquire a set of images, as seen in step 4000, wherein substantially each image is associated with a different point in time.
  • the images may be of a single user or multiple users.
  • the Zlens acquisition assembly may be mounted on another image acquisition assembly, i.e. element 1000 of Fig.1.
  • the Zlens acquisition assembly (3000) may be best described in conjunction with PCT/IL2006/001254, filed on October 31, 2006 under the same assignee as the present application, and with US Patent Application 60/731,274, filed on October 31, 2005 under the same assignee as the present application.
  • the system is further adapted to enter an initialization phase, as shown in step 4100, which is performed by the initialization module 3100.
  • which initialization module may be adapted to detect and define the user's (1) colors, (2) organs' parameters, (3) surroundings, and other parameters which are associated with the user.
  • the system may be adapted to extract segmentation data, as shown in step 4200, which segmented data may comprise:
  • the image segmentation module 3200 may be adapted to extract the segmented data from the image.
  • the system may be adapted to process the segmented data, as shown in step 4300.
  • Color - using known color parameters to detect elements and/or light changes; for example, use skin color to detect the palms and face.
  • Edge detection - detect the contours of every organ.
  • the segmented data processing module 3300 may be adapted to process the segmented data.
  • the system may be further adapted to fit portions of the extracted segmented data with the acquired image, as shown in step 4500.
  • the fitting process may comprise associating portions of the extracted segmented data with dedicated areas of the acquired image.
  • the dedicated areas may be stored in the system or may be determined during the initialization phase.
  • the dedicated areas may be specific organs of the user (hands, head, feet) or any other element which may be acquired in step 4000.
  • the fitting process may comprise testing whether the extracted segmented data defines parameters which are relevant to the dedicated areas.
  • the fitting module 3500 may be adapted to associate portions of the segmented data with the acquired image.
  • the outcome of this process is a "fitted image".
  • the system may be further adapted to evaluate the quality of the fitted segmented data, as shown in step 4550.
  • evaluating the quality of the fitted segmented data may comprise (1) examining the processed segmented data, (2) estimating the quality of the processed segmented data, (3) deciding, according to the estimated quality, which portions of the segmented data are reliable enough to be used by the HMI system, (4) examining the fitted image, (5) evaluating the fitting quality of said image and (6) determining whether an error has occurred during the association of segmented data.
  • the scoring module 3550 may be adapted to evaluate the quality of the fitted segmented data.
  • the system may comprise an error detection mechanism and a recovery mechanism as was described hereinabove.
  • the system may be adapted to update the position of a three dimensional body in accordance with the fitted image and the extrapolation of a depth map using the Zlens image acquisition assembly, as shown in step 4700.
  • the updating process may comprise associating the extracted depth map with the extracted segmented data, and updating the position of the three-dimensional body of the output model (a back-projection sketch follows below).
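A sketch of one way to combine a depth map with segmented data: back-project each segmented pixel to a 3D point and move the body position to the points' centroid. The intrinsics are assumed, as in the earlier projection sketch.

```python
import numpy as np

FX = FY = 500.0          # assumed focal lengths (pixels)
CX, CY = 320.0, 240.0    # assumed principal point

def update_body_position(depth_map, segmented_px):
    """depth_map: (H, W) array in meters; segmented_px: (M, 2) int (u, v)."""
    segmented_px = np.asarray(segmented_px, dtype=int)
    u, v = segmented_px[:, 0], segmented_px[:, 1]
    z = depth_map[v, u]                    # depth sampled at each pixel
    x = (u - CX) * z / FX                  # pinhole back-projection
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=1)   # (M, 3) cloud for the body
    return points.mean(axis=0)             # updated 3D body position
```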
  • the three dimensional correlation module, 3700 may be adapted to update the position of the three dimensional body.
  • the functionality of the three-dimensional correlation module 3700 and the Zlens image acquisition assembly 3000, and particularly the extrapolation of depth from an image acquired using the Zlens apparatus, may best be described in conjunction with PCT/IL2006/001254, filed on October 31, 2006 under the same assignee as the present application, and with US Patent Application 60/731,274.
  • In FIG. 5 there is shown an exemplary HMI system in accordance with some embodiments of the present invention, which system may be best described in conjunction with Figure 6, in which there is shown a flow chart depicting the steps of such an HMI system.
  • Figure 5 shows a 3D human machine interface ("3D HMI"), which 3D HMI may include (1) a Zlens acquisition assembly 5000, (2) an initializing module 5100, (3) an image segmentation module 5200, (4) a segmented data processing module 5300, (5) a scoring module 5400, (6) a projection module 5450, (7) a fitting module 5500, (8) a scoring and error detection module 5550, (9) a recovery module 5600, (10) a three dimensional correlation module 5700, (11) a three dimensional skeleton prediction module 5800, (12) an output module 5900, and (13) an optional depth extraction module 5050.
  • 3D HMI: 3D human machine interface
  • the Zlens acquisition assembly may be adapted to acquire a set of images, as seen in step 6000, wherein substantially each image is associated with a different point in time.
  • the images may be of a single user or multiple users.
  • the Zlens acquisition assembly may be mounted on another image acquisition assembly, i.e. element 1000 of Fig.1.
  • the Zlens acquisition assembly (5000) may be best described in conjunction with PCT/IL2006/001254, filed on October 31, 2006 under the same assignee as the present application, and with US Patent Application 60/731,274, filed on October 31, 2005 under the same assignee as the present application.
  • the system may enter an initialization phase, step 6100, which is performed by the initialization module 5100.
  • which initialization module may be adapted to detect and define the user's (1) colors, (2) organs' parameters, (3) surroundings, and other parameters which are associated with the user.
  • the system may be adapted to extract segmentation data, as shown in step 6200, which segmented data may comprise:
  • the image segmentation module 5200 may be adapted to extract the segmented data from the image.
  • the system may be adapted to process the segmented data, as shown in step 6300.
  • the segmented data may be processed in the following way:
  • Color - using known color parameters to detect elements and/or light changes; for example, use skin color to detect the palms and face.
  • Edge detection - detect the contours of every organ.
  • the segmented data processing module 5300 may be adapted to process the segmented data.
  • the system may be adapted to evaluate the quality of the segmented data, as shown in step 6400; the evaluation is performed by (1) examining the processed segmented data, (2) estimating the quality of the processed segmented data, and, according to the estimated quality, (3) deciding which portions of the segmented data are reliable enough to be used by the HMI system.
  • the scoring module 5400 may be adapted to evaluate the quality of the segmented information.
  • the system may be adapted to predict the position of the three dimensional skeleton, as shown in step 6800, which position will have the best match or correlation with the processed image.
  • the prediction may be more accurate with the use of constraints which derive from the type of skeleton used; for example, if the skeleton is of a human figure, the head of the skeleton cannot rotate 360 degrees without a motion of the shoulders.
  • the prediction sequence may also use a set of dynamic and motion processes.
  • the three dimensional skeleton prediction module 5800 may be adapted to predict the position of the three dimensional skeleton.
  • the system may be further adapted to extract depth using the Zlens acquisition assembly, as shown in step 6050; the extraction of depth using a Zlens acquisition assembly is described in (1) PCT/IL2006/001254, filed on October 31, 2006 under the same assignee as the present application, and (2) US Patent Application 60/731,274, filed on October 31, 2005 under the same assignee as the present application.
  • the depth extraction module 5050 may be adapted to extract depth from the acquired image.
  • the system may be further adapted to project the skeleton onto the image, as shown in step 6450.
  • the projection may be applied in the two-dimensional plane.
  • the projection of the skeleton may be applied in three dimensions if module 5050 is used.
  • the projection may be onto a three-dimensional image and/or a three-dimensional cloud of points (see the sketch below).
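When a cloud of points is available, the fit can be evaluated directly in 3D; a sketch of a nearest-point fitting error, as one plausible reading of this step:

```python
import numpy as np

def cloud_fit_error(joints_xyz, cloud_xyz):
    """Mean distance from each 3D joint to its nearest point in the cloud."""
    joints_xyz = np.asarray(joints_xyz, dtype=float)
    cloud_xyz = np.asarray(cloud_xyz, dtype=float)
    # (J, P) joint-to-point distances; keep the nearest point per joint
    d = np.linalg.norm(joints_xyz[:, None, :] - cloud_xyz[None, :, :], axis=2)
    return float(d.min(axis=1).mean())
```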
  • the projection module 5450 may be adapted to project the skeleton onto the image.
  • the system may be further adapted to fit the segmented data with the projected skeleton, as shown in step 6500.
  • the fitting process may comprise the association of portions of the segmented data with portions of the projected skeleton.
  • fitting the segmented data may comprise associating portions of the extracted segmented data with current skeleton parameters, which current skeleton parameters may support the associated portions of extracted segmented data.
  • the outcome of this process is a "fitted skeleton".
  • the fitting module 5500 may be adapted to associate the segmented data with the projected skeleton.
  • the system may be further adapted to give a score to the fitted skeleton and detect errors, as shown in step 6550.
  • giving a score and detecting errors may comprise (1) examining the fitted skeleton, (2) evaluating the fitting quality of said skeleton and (3) determining whether an error has occurred during the skeleton prediction process or the association of segmented data.
  • the scoring and error detection module 5550 may be adapted to give a score and detect errors.
  • the system may enter a recovery phase, as shown in step 6600.
  • the recovery process may be a process of multiple processing layers.
  • the recovery phase may comprise re-segmenting the image, re-predicting the 3D skeleton position, re-projecting and re-fitting the skeleton using extended effort.
  • the recovery module may also decide to skip a frame or more if the image information is corrupt.
  • the system may be adapted to detect that the object it is tracking is not in the frame.
  • the system may be adapted to skip one or more frames until the object is back in the frame.
  • the recovery phase may direct the system back to the initialization step.
  • the recovery module 5600 may be adapted to perform the recovery process.
  • the system may be adapted to update the position of the three dimensional skeleton in accordance with the position of the fitted skeleton, as shown in step 6700.
  • the updating process may comprise (1) projecting the 3D skeleton on the fitted skeleton, (2) associating the three dimensional skeleton with the fitted skeleton, (3) extracting depth using the Zlens assembly, (4) associating the three-dimensional skeleton with depth parameters and (5) updating the position of the 3D skeleton.
  • the three dimensional correlation module 5700 may be adapted to update the position of the three dimensional skeleton.
  • the three-dimensional correlation module 5700 and the skeleton prediction module 5800 may use some or all of the algorithms and processes which were disclosed in PCT application serial number PCT/IL2005/000813, filed on 31 July 2005 under the same assignee as the present application.
  • the functionality of (1) the three-dimensional correlation module 5700, (2) the Zlens image acquisition assembly 5000, (3) the depth extraction module 5050, and particularly (4) the extrapolation of depth from an image acquired using the Zlens apparatus may best be described in conjunction with PCT/IL2006/001254, filed on October 31, 2006 under the same assignee as the present application, and with US Patent Application 60/731,274, filed on October 31, 2005 under the same assignee as the present application.
  • the systems described hereinabove may be adapted to receive depth images and/or three-dimensional images from an exterior source. According to yet further embodiments of the present invention, if depth images and/or three-dimensional images are received, the system is adapted to extract their parameters in the relevant modules.
  • the processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method. The desired structure for a variety of these systems will appear from the description below. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein. One of ordinary skill in the art should understand that the described invention may be used for all kinds of wireless or wire-line systems.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/IL2007/000475 2004-07-30 2007-04-15 An apparatus system and method for human-machine-interface WO2008126069A2 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US12/517,210 US8432390B2 (en) 2004-07-30 2007-04-15 Apparatus system and method for human-machine interface
JP2010502633A JP5147933B2 (ja) 2007-04-15 2007-04-15 Apparatus, system and method for human-machine interface
KR1020097023103A KR101379074B1 (ko) 2007-04-15 2007-04-15 Apparatus, system and method for human-machine interface
CA2684020A CA2684020C (en) 2007-04-15 2007-04-15 An apparatus system and method for human-machine-interface
EP07736215A EP2147393A4 (en) 2007-04-15 2007-04-15 DEVICE, SYSTEM AND METHOD FOR A HUMAN MACHINE INTERFACE
PCT/IL2007/000475 WO2008126069A2 (en) 2007-04-15 2007-04-15 An apparatus system and method for human-machine-interface
IL201514A IL201514A (he) 2007-04-15 2009-10-14 An apparatus, system and method for human-machine interface
US13/355,643 US8928654B2 (en) 2004-07-30 2012-01-23 Methods, systems, devices and associated processing logic for generating stereoscopic images and video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IL2007/000475 WO2008126069A2 (en) 2007-04-15 2007-04-15 An apparatus system and method for human-machine-interface

Related Child Applications (4)

Application Number Title Priority Date Filing Date
PCT/IL2005/000813 Continuation-In-Part WO2006011153A2 (en) 2004-07-30 2005-07-31 A system and method for 3d space-dimension based image processing
US11/572,958 Continuation-In-Part US8114172B2 (en) 2004-07-30 2005-07-31 System and method for 3D space-dimension based image processing
US57295807A Continuation-In-Part 2004-07-30 2007-01-30
US12/517,210 A-371-Of-International US8432390B2 (en) 2004-07-30 2007-04-15 Apparatus system and method for human-machine interface

Publications (2)

Publication Number Publication Date
WO2008126069A2 true WO2008126069A2 (en) 2008-10-23
WO2008126069A3 WO2008126069A3 (en) 2009-04-23

Family

ID=39864447

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2007/000475 WO2008126069A2 (en) 2004-07-30 2007-04-15 An apparatus system and method for human-machine-interface

Country Status (6)

Country Link
EP (1) EP2147393A4 (he)
JP (1) JP5147933B2 (he)
KR (1) KR101379074B1 (he)
CA (1) CA2684020C (he)
IL (1) IL201514A (he)
WO (1) WO2008126069A2 (he)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012095756A2 (en) * 2011-01-03 2012-07-19 Extreme Reality Ltd. Method circuit and system for human to machine interfacing by hand gestures
US8615108B1 (en) 2013-01-30 2013-12-24 Imimtek, Inc. Systems and methods for initializing motion tracking of human hands
US8655021B2 (en) 2012-06-25 2014-02-18 Imimtek, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US8830312B2 (en) 2012-06-25 2014-09-09 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching within bounded regions
US8878779B2 (en) 2009-09-21 2014-11-04 Extreme Reality Ltd. Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
US8878896B2 (en) 2005-10-31 2014-11-04 Extreme Reality Ltd. Apparatus method and system for imaging
US8928654B2 (en) 2004-07-30 2015-01-06 Extreme Reality Ltd. Methods, systems, devices and associated processing logic for generating stereoscopic images and video
US9046962B2 (en) 2005-10-31 2015-06-02 Extreme Reality Ltd. Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US9177220B2 (en) 2004-07-30 2015-11-03 Extreme Reality Ltd. System and method for 3D space-dimension based image processing
US9218126B2 (en) 2009-09-21 2015-12-22 Extreme Reality Ltd. Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9310891B2 (en) 2012-09-04 2016-04-12 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US9504920B2 (en) 2011-04-25 2016-11-29 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9600078B2 (en) 2012-02-03 2017-03-21 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
CN108144292A (zh) * 2018-01-30 2018-06-12 河南三阳光电有限公司 Naked-eye 3D interactive game production device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3481430B2 (ja) * 1997-09-11 2003-12-22 富士通株式会社 Moving object tracking device
JP3800905B2 (ja) * 1999-07-27 2006-07-26 松下電工株式会社 Image feature tracking processing method, image feature tracking processing apparatus, and three-dimensional data creation method
JP2001236505A (ja) * 2000-02-22 2001-08-31 Atsushi Kuroda Coordinate estimation method, coordinate estimation apparatus and coordinate estimation system
JP2002259474A (ja) * 2001-03-05 2002-09-13 Oojisu Soken:Kk Human body model generation method, human body model generation apparatus, computer program and recording medium
US6833843B2 (en) * 2001-12-03 2004-12-21 Tempest Microsystems Panoramic imaging and display system with canonical magnifier
JP2003256850A (ja) * 2001-12-27 2003-09-12 Sanyo Electric Co Ltd Motion recognition apparatus, image processing apparatus and program
US9177387B2 (en) * 2003-02-11 2015-11-03 Sony Computer Entertainment Inc. Method and apparatus for real time motion capture
JPWO2004094943A1 (ja) * 2003-04-22 2006-07-13 博 有澤 Motion capture method, motion capture apparatus, and motion capture marker
JP4481663B2 (ja) * 2004-01-15 2010-06-16 キヤノン株式会社 Motion recognition apparatus, motion recognition method, device control apparatus and computer program
US7308112B2 (en) * 2004-05-14 2007-12-11 Honda Motor Co., Ltd. Sign based human-machine interaction
KR101323966B1 (ko) * 2004-07-30 2013-10-31 익스트림 리얼리티 엘티디. System and method for 3D space-dimension based image processing
WO2006099597A2 (en) * 2005-03-17 2006-09-21 Honda Motor Co., Ltd. Pose estimation based on critical point analysis

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of EP2147393A4 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8872899B2 (en) 2004-07-30 2014-10-28 Extreme Reality Ltd. Method circuit and system for human to machine interfacing by hand gestures
US8928654B2 (en) 2004-07-30 2015-01-06 Extreme Reality Ltd. Methods, systems, devices and associated processing logic for generating stereoscopic images and video
US9177220B2 (en) 2004-07-30 2015-11-03 Extreme Reality Ltd. System and method for 3D space-dimension based image processing
US9131220B2 (en) 2005-10-31 2015-09-08 Extreme Reality Ltd. Apparatus method and system for imaging
US8878896B2 (en) 2005-10-31 2014-11-04 Extreme Reality Ltd. Apparatus method and system for imaging
US9046962B2 (en) 2005-10-31 2015-06-02 Extreme Reality Ltd. Methods, systems, apparatuses, circuits and associated computer executable code for detecting motion, position and/or orientation of objects within a defined spatial region
US9218126B2 (en) 2009-09-21 2015-12-22 Extreme Reality Ltd. Methods circuits apparatus and systems for human machine interfacing with an electronic appliance
US8878779B2 (en) 2009-09-21 2014-11-04 Extreme Reality Ltd. Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen
WO2012095756A3 (en) * 2011-01-03 2013-07-18 Extreme Reality Ltd. Method circuit and system for human to machine interfacing by hand gestures
JP2014501011A (ja) * 2011-01-03 2014-01-16 エクストリーム リアリティー エルティーディー. Method, circuit and system for human-to-machine interfacing by hand gestures
WO2012095756A2 (en) * 2011-01-03 2012-07-19 Extreme Reality Ltd. Method circuit and system for human to machine interfacing by hand gestures
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
US9504920B2 (en) 2011-04-25 2016-11-29 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
US9600078B2 (en) 2012-02-03 2017-03-21 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
US8934675B2 (en) 2012-06-25 2015-01-13 Aquifi, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US8830312B2 (en) 2012-06-25 2014-09-09 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching within bounded regions
US9098739B2 (en) 2012-06-25 2015-08-04 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching
US8655021B2 (en) 2012-06-25 2014-02-18 Imimtek, Inc. Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints
US9310891B2 (en) 2012-09-04 2016-04-12 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
US8615108B1 (en) 2013-01-30 2013-12-24 Imimtek, Inc. Systems and methods for initializing motion tracking of human hands
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
CN108144292A (zh) * 2018-01-30 2018-06-12 河南三阳光电有限公司 Naked-eye 3D interactive game production device

Also Published As

Publication number Publication date
JP5147933B2 (ja) 2013-02-20
WO2008126069A3 (en) 2009-04-23
JP2010524113A (ja) 2010-07-15
CA2684020C (en) 2016-08-09
IL201514A0 (en) 2010-05-31
EP2147393A4 (en) 2012-12-05
KR20100016240A (ko) 2010-02-12
EP2147393A2 (en) 2010-01-27
KR101379074B1 (ko) 2014-03-28
CA2684020A1 (en) 2008-10-23
IL201514A (he) 2015-02-26

Similar Documents

Publication Publication Date Title
CA2684020C (en) An apparatus system and method for human-machine-interface
US8432390B2 (en) Apparatus system and method for human-machine interface
US20110163948A1 (en) Method system and software for providing image sensor based human machine interfacing
US8548258B2 (en) Method system and associated modules and software components for providing image sensor based human machine interfacing
CN109902767B (zh) Model training method, image processing method and apparatus, device and medium
CN109491586B (zh) Virtual object control method and apparatus, electronic device, and storage medium
US8681100B2 (en) Apparatus system and method for human-machine-interface
CN116954367A (zh) Virtual reality interaction method, system and device
CN114167997B (zh) Model display method, apparatus, device and storage medium
McNamara et al. Investigating low-cost virtual reality technologies in the context of an immersive maintenance training application
JP5620449B2 (ja) Apparatus, system and method for human-machine interface
CN112755510A (zh) Mobile cloud game control method, system and computer-readable storage medium
CN108499102B (zh) Information interface display method and apparatus, storage medium, and electronic device
CN111821688A (zh) Virtual reality game picture processing method and related device
US11869145B2 (en) Input device model projecting method, apparatus and system
KR102500237B1 (ko) AR/VR deboning training method, apparatus and system using a model
US20230148112A1 (en) Sports Neural Network Codec
KR20220148543A (ko) Method and apparatus for providing augmented reality content
CN111821689A (zh) Virtual reality game system based on cloud computing technology
Jatain et al. A Real-Time Camera-based Motion Sensing Game Tool for Cervical Rehabilitation
CN106484114B (zh) Interaction control method and device based on virtual reality
CN116721455A (zh) Face pose estimation method, device and medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07736215

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 12517210

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2010502633

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 201514

Country of ref document: IL

Ref document number: 2684020

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 20097023103

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2007736215

Country of ref document: EP