WO2008132724A4 - A method and apparatus for three dimensional interaction with autosteroscopic displays - Google Patents
- Publication number
- WO2008132724A4 (PCT application PCT/IL2008/000530)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewer
- virtual
- data
- display
- location
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method and apparatus for an interactive human-computer interface using a self-contained, single-housing autostereoscopic (AS) display configured to render 3D virtual objects into fixed viewing zones. The system contains an eye location tracking system for continuously determining both the viewer's perceived three-dimensional space in relation to the zones and a 3D mapping of the rendered virtual objects in that perceived space in accordance with the position of the viewer's eyes. Additionally, one or more 3D cameras determine the location and configuration of the viewer's anatomy in real time in relation to said display. An interactive application defines the interactive rules and the content displayed to the viewer. Furthermore, an interaction processing engine receives information from the eye location tracking system, the anatomy location and configuration system, and the interactive application to determine interaction data of the viewer's anatomy with the virtual objects rendered by the AS display.
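The abstract does not disclose a specific algorithm for the eye-dependent 3D mapping. As a purely illustrative sketch, the perceived location of a stereo-rendered object can be recovered from the tracked eye positions by intersecting the two viewing rays (each eye through its corresponding on-screen image point) with a textbook closest-point-between-rays computation; all function and variable names here are our assumptions, not the patent's.

```python
import numpy as np

def perceived_point(eye_l, eye_r, screen_l, screen_r):
    """Return the point at which the viewer perceives a stereo-rendered
    object: the midpoint of closest approach of the two viewing rays,
    each cast from an eye through its on-screen image point.
    All inputs are 3-vectors in a common display-space frame (mm)."""
    eye_l = np.asarray(eye_l, float)
    eye_r = np.asarray(eye_r, float)
    d1 = np.asarray(screen_l, float) - eye_l   # left-eye ray direction
    d2 = np.asarray(screen_r, float) - eye_r   # right-eye ray direction
    # Standard closest-points solution for two (generally skew) rays.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w0 = eye_l - eye_r
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((eye_l + s * d1) + (eye_r + t * d2)) / 2.0
```

For eyes 65 mm apart at 600 mm from the screen plane (z = 0) and crossed on-screen disparity, the intersection lands in front of the display, in the viewer's perceived space.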
Claims
1. A system for providing an interactive human computer interface to a viewer, said system comprising: a) a data storage configured to store three-dimensional virtual environment representation data including at least one three-dimensional virtual object within said virtual environment; b) an autostereoscopic (AS) display configured to display simultaneous perspectives of virtual objects of said 3D virtual environment representation data to spatially separated viewing zones located in front of said AS display; c) an anatomy tracking system including at least one 3D camera, said anatomy tracking system operative to determine:
I) a viewer's eyes location within said viewing zones, said eyes location determining: i. a defined region substantially in front of said AS display in which said virtual objects are perceived and
ii. 3D location data of said virtual objects perceived by said viewer in said defined region;
II) 3D mapping data of an anatomical part on said viewer's body located in said defined region; d) a 3D registration engine configured to generate a single, unified 3D volume comprising said 3D mapping data of an anatomical part located in said defined region and said 3D location data of said virtual objects perceived by said viewer in said defined region; e) an anatomical part-virtual object relation computation engine operative to determine relative locations in said unified 3D volume between said virtual object 3D location data and said anatomical part 3D mapping data in accordance with output of said registration engine; f) a rule enforcement engine operative to modify said three-dimensional virtual environment representation data in accordance with output of said anatomical part-virtual object relation computation engine and said virtual environment representation data.
2. A system according to claim 1, wherein said virtual environment data contains interactive application rules.
3. A system according to claim 1, wherein said anatomy tracking system determines said viewer's eye position and anatomical part position continuously over time.
4. A system according to claim 1, wherein said rule enforcement engine is further configured to provide output data in real time to a scene presentation engine for display of subsequent content.
5. A system according to claim 1, configured to receive input from a plurality of viewers having eye positions in a plurality of adjacent pairs of said viewing zones, said viewers simultaneously interacting with said autostereoscopic display.
6. A system according to claim 1, wherein said 3D camera comprises an NIR projector and narrowband filter.
7. A system according to claim 1, further comprising a viewer classification module configured to determine viewer identity and characteristics based on a viewer historical profile.
8. A system according to claim 7, wherein said viewer profile is obtained through an on-line database.
9. A system according to claim 1, wherein said displayed three-dimensional virtual environment representation data is a virtual touch screen.
10. A system according to claim 1, wherein said displayed three-dimensional virtual environment representation data is a virtual keyboard.
11. A system according to claim 1, wherein said displayed three-dimensional virtual environment representation data is a virtual image of the user.
12. A system according to claim 1, wherein said displayed virtual environment representation data as a result of rule enforcement engine output is sent over a network in a multi-user environment to allow for multi-user interface.
13. A system according to claim 1, wherein said displayed virtual environment representation data is continuously changing 3D positional data.
14. A system according to claim 1, further comprising a miniaturized element perceived by said viewer in a viewer perceived space that is determined in accordance with said viewer's eye position, said miniaturized element configured for interaction with said displayed 3D virtual objects.
15. A system for providing an interactive human computer interface to a viewer, said system comprising: a) a data storage configured to store three-dimensional virtual environment representation data including at least one three-dimensional virtual object within said virtual environment; b) an autostereoscopic (AS) display configured to display simultaneous perspectives of virtual objects of said 3D virtual environment representation data to spatially separated viewing zones located in front of said AS display, including the display of a virtual viewer configured for interaction with said virtual objects; c) an anatomy tracking system including at least one 3D camera, said anatomy tracking system operative to determine:
I) a viewer's eyes location within said viewing zones, said eyes location determining: i) a defined region substantially in front of said AS display in which said virtual objects and said virtual viewer are perceived, and ii) 3D location data of said virtual objects perceived by said viewer in said defined region;
II) 3D mapping data of an anatomical part on said viewer's body outside of said defined region, said virtual viewer 3D location data reflecting said anatomical part mapping; d) a virtual viewer-virtual object relative-location computation engine operative to determine a relative-location between said virtual object 3D location data and said virtual viewer 3D location data; e) a rule enforcement engine operative to modify said three-dimensional virtual environment representation data in accordance with output of said virtual viewer-virtual object relative-location computation engine and said virtual environment representation data.
16. A system according to claim 15, wherein said virtual viewer is perceived by said viewer when said viewer's interacting anatomy part is outside of said viewer perceived space, and wherein said virtual viewer disappears when said viewer's interacting anatomy part is within said viewer perceived space.
17. A method for providing an interactive human computer interface to a viewer, said method comprising: a) storing three-dimensional virtual environment representation data including at least one three-dimensional virtual object within said virtual environment; b) displaying on an autostereoscopic (AS) display simultaneous perspectives of virtual objects in said 3D virtual environment representation data to spatially separated viewing zones located in front of said AS display; c) tracking the anatomy of a viewer to determine:
I) a viewer's eyes location within said viewing zones, said eyes location determining: i. a defined region substantially in front of said AS display in which said virtual objects are perceived and ii. 3D location data of said virtual objects perceived by said viewer in said defined region;
II) 3D mapping data of an anatomical part on said viewer's body located in said defined region;
d) generating a registered, unified 3D volume comprising said 3D mapping data of said viewer's anatomical part located in said defined region and said 3D location data of said virtual objects perceived by said viewer in said defined region, e) determining relative locations in said unified 3D volume between said virtual object 3D location data and said anatomical part 3D mapping data in accordance with said registration, f) modifying based on interactive rules said three-dimensional environment representation data in accordance with said determined anatomical part-virtual object relative locations and said virtual environment representation data.
18. A system for interactive human computer interface, said system comprising: a self-contained autostereoscopic (AS) display configured to render 3D virtual objects into neighboring viewing zones associated with said display, an eye location tracking system, comprising at least one 3D video camera, for continuously determining:
1) a viewer perceived three dimensional space in relation to said display, and
2) a 3D mapping of said rendered virtual objects in said perceived space in accordance with viewer eyes position in relation to said fixed viewing zones; an anatomy location and configuration system, comprising at least one 3D video camera, for continuously determining a 3D mapping of viewer anatomy in relation to said display, and an interactive application that defines interactive rules and displayed content to said user, and an interaction processing engine configured to receive information from
1) said eye location tracking system
2) said anatomy location and configuration system, and
3) said interactive application, thereby to determine interaction data of said viewer anatomy with said rendered virtual objects from said AS display.
19. A system or method for three dimensional interaction with auto stereoscopic displays substantially as described or illustrated hereinabove or in any of the drawings.
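Claims 1(d)-(f) and 17(d)-(f) describe registering the tracked anatomical part and the perceived virtual objects into one unified 3D volume, computing their relative locations, and enforcing rules on the result. The claims leave the implementation open; the following is a minimal sketch of one plausible realization, assuming a rigid camera-to-display extrinsic calibration (`R`, `t`) and a simple distance threshold as the interaction rule -- all names and thresholds are our illustrative choices.

```python
import numpy as np

def register_to_display(points_cam, R, t):
    """Rigidly transform 3D-camera-space points (mm) into the display
    frame, producing the single unified 3D volume of claim 1(d).
    R (3x3 rotation) and t (translation) are assumed to come from an
    offline extrinsic calibration of the 3D camera to the AS display."""
    return np.asarray(points_cam, float) @ np.asarray(R, float).T \
        + np.asarray(t, float)

def interaction_events(fingertip_cam, virtual_objects, R, t, touch_mm=15.0):
    """Relation-computation step (claim 1(e)): place the tracked
    fingertip and the perceived virtual objects in one display-space
    volume and report every object within the touch threshold.
    virtual_objects maps an object name to its perceived 3D centre."""
    tip = register_to_display(fingertip_cam, R, t)
    hits = []
    for name, centre in virtual_objects.items():
        if np.linalg.norm(tip - np.asarray(centre, float)) <= touch_mm:
            hits.append(name)
    return hits  # a rule enforcement engine would react to these hits
```

A rule enforcement engine in the sense of claim 1(f) would consume the returned hit list and modify the stored virtual environment representation data accordingly (for example, depressing a virtual button the fingertip has reached).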
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US92400307P | 2007-04-26 | 2007-04-26 | |
US60/924,003 | 2007-04-26 | ||
US93542607P | 2007-08-13 | 2007-08-13 | |
US60/935,426 | 2007-08-13 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2008132724A1 WO2008132724A1 (en) | 2008-11-06 |
WO2008132724A4 true WO2008132724A4 (en) | 2008-12-31 |
Family
ID=39587846
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2008/000530 WO2008132724A1 (en) | 2007-04-26 | 2008-04-17 | A method and apparatus for three dimensional interaction with autosteroscopic displays |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2008132724A1 (en) |
Families Citing this family (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2470072B (en) * | 2009-05-08 | 2014-01-01 | Sony Comp Entertainment Europe | Entertainment device,system and method |
GB0910545D0 (en) | 2009-06-18 | 2009-07-29 | Therefore Ltd | Picturesafe |
US8970478B2 (en) | 2009-10-14 | 2015-03-03 | Nokia Corporation | Autostereoscopic rendering and display apparatus |
KR101627214B1 (en) | 2009-11-12 | 2016-06-03 | 엘지전자 주식회사 | Image Display Device and Operating Method for the Same |
KR101647722B1 (en) | 2009-11-13 | 2016-08-23 | 엘지전자 주식회사 | Image Display Device and Operating Method for the Same |
KR101631451B1 (en) * | 2009-11-16 | 2016-06-20 | 엘지전자 주식회사 | Image Display Device and Operating Method for the Same |
EP2372512A1 (en) * | 2010-03-30 | 2011-10-05 | Harman Becker Automotive Systems GmbH | Vehicle user interface unit for a vehicle electronic device |
FR2959576A1 (en) * | 2010-05-03 | 2011-11-04 | Thomson Licensing | METHOD FOR DISPLAYING A SETTING MENU AND CORRESPONDING DEVICE |
JP5573379B2 (en) * | 2010-06-07 | 2014-08-20 | ソニー株式会社 | Information display device and display image control method |
US8508347B2 (en) | 2010-06-24 | 2013-08-13 | Nokia Corporation | Apparatus and method for proximity based input |
US9135426B2 (en) | 2010-12-16 | 2015-09-15 | Blackberry Limited | Password entry using moving images |
US8635676B2 (en) | 2010-12-16 | 2014-01-21 | Blackberry Limited | Visual or touchscreen password entry |
US8931083B2 (en) | 2010-12-16 | 2015-01-06 | Blackberry Limited | Multi-layer multi-point or randomized passwords |
US8745694B2 (en) | 2010-12-16 | 2014-06-03 | Research In Motion Limited | Adjusting the position of an endpoint reference for increasing security during device log-on |
US8631487B2 (en) | 2010-12-16 | 2014-01-14 | Research In Motion Limited | Simple algebraic and multi-layer passwords |
US8650624B2 (en) | 2010-12-16 | 2014-02-11 | Blackberry Limited | Obscuring visual login |
US8769641B2 (en) | 2010-12-16 | 2014-07-01 | Blackberry Limited | Multi-layer multi-point or pathway-based passwords |
US8863271B2 (en) | 2010-12-16 | 2014-10-14 | Blackberry Limited | Password entry using 3D image with spatial alignment |
US8661530B2 (en) | 2010-12-16 | 2014-02-25 | Blackberry Limited | Multi-layer orientation-changing password |
US8650635B2 (en) | 2010-12-16 | 2014-02-11 | Blackberry Limited | Pressure sensitive multi-layer passwords |
US9258123B2 (en) | 2010-12-16 | 2016-02-09 | Blackberry Limited | Multi-layered color-sensitive passwords |
EP2521007A1 (en) * | 2011-05-03 | 2012-11-07 | Technische Universität Dresden | Method for object 3D position identification based on multiple image analysis applying to gaze tracking |
US8769668B2 (en) | 2011-05-09 | 2014-07-01 | Blackberry Limited | Touchscreen password entry |
CN102810028A (en) * | 2011-06-01 | 2012-12-05 | 时代光电科技股份有限公司 | Touch device for virtual images floating in air |
US9733789B2 (en) | 2011-08-04 | 2017-08-15 | Eyesight Mobile Technologies Ltd. | Interfacing with a device via virtual 3D objects |
US9930128B2 (en) | 2011-09-30 | 2018-03-27 | Nokia Technologies Oy | Method and apparatus for accessing a virtual object |
US9223948B2 (en) | 2011-11-01 | 2015-12-29 | Blackberry Limited | Combined passcode and activity launch modifier |
KR101322910B1 (en) | 2011-12-23 | 2013-10-29 | 한국과학기술연구원 | Apparatus for 3-dimensional displaying using dyanmic viewing zone enlargement for multiple observers and method thereof |
GB2498184A (en) * | 2012-01-03 | 2013-07-10 | Liang Kong | Interactive autostereoscopic three-dimensional display |
US9807362B2 (en) * | 2012-03-30 | 2017-10-31 | Intel Corporation | Intelligent depth control |
US8933912B2 (en) * | 2012-04-02 | 2015-01-13 | Microsoft Corporation | Touch sensitive user interface with three dimensional input sensor |
DE102012209917A1 (en) * | 2012-06-13 | 2013-12-19 | Technische Universität Dresden | Method for transforming real-world visual information in virtual three-dimensional environments, involves time-synchronized recording of raw data for test person in real world system with detection of head movements of test person |
EP3090321A4 (en) * | 2014-01-03 | 2017-07-05 | Harman International Industries, Incorporated | Gesture interactive wearable spatial audio system |
GB2533777A (en) * | 2014-12-24 | 2016-07-06 | Univ Of Hertfordshire Higher Education Corp | Coherent touchless interaction with steroscopic 3D images |
WO2016182502A1 (en) * | 2015-05-14 | 2016-11-17 | Medha Dharmatilleke | Multi purpose mobile device case/cover integrated with a camera system & non electrical 3d/multiple video & still frame viewer for 3d and/or 2d high quality videography, photography and selfie recording |
US20180077430A1 (en) * | 2016-09-09 | 2018-03-15 | Barrie Hansen | Cloned Video Streaming |
CN111949111B (en) * | 2019-05-14 | 2022-04-26 | Oppo广东移动通信有限公司 | Interaction control method and device, electronic equipment and storage medium |
NL2030326B1 (en) * | 2021-12-29 | 2023-07-04 | Dimenco Holding B V | Autostereoscopic display device having a remote body tracking system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6985168B2 (en) * | 1994-11-14 | 2006-01-10 | Reveo, Inc. | Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments |
EP0785457A3 (en) * | 1996-01-17 | 1998-10-14 | Nippon Telegraph And Telephone Corporation | Optical device and three-dimensional display device |
US6985290B2 (en) * | 1999-12-08 | 2006-01-10 | Neurok Llc | Visualization of three dimensional images and multi aspect imaging |
US7774075B2 (en) * | 2002-11-06 | 2010-08-10 | Lin Julius J Y | Audio-visual three-dimensional input/output |
US7787009B2 (en) * | 2004-05-10 | 2010-08-31 | University Of Southern California | Three dimensional interaction with autostereoscopic displays |
-
2008
- 2008-04-17 WO PCT/IL2008/000530 patent/WO2008132724A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2008132724A1 (en) | 2008-11-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2008132724A4 (en) | A method and apparatus for three dimensional interaction with autosteroscopic displays | |
KR100809479B1 (en) | Face mounted display apparatus and method for mixed reality environment | |
Fehn et al. | Interactive 3-DTV-concepts and key technologies | |
US9967555B2 (en) | Simulation device | |
US9230500B2 (en) | Expanded 3D stereoscopic display system | |
US20050264858A1 (en) | Multi-plane horizontal perspective display | |
JP2014504074A (en) | Method, system, apparatus and associated processing logic for generating stereoscopic 3D images and video | |
KR20070047736A (en) | Horizontal perspective display | |
US20130044103A1 (en) | Apparatus and method for displaying stereoscopic image | |
US11659158B1 (en) | Frustum change in projection stereo rendering | |
CN106168855B (en) | Portable MR glasses, mobile phone and MR glasses system | |
CN111226187A (en) | System and method for interacting with a user through a mirror | |
US6252982B1 (en) | Image processing system for handling depth information | |
US11961194B2 (en) | Non-uniform stereo rendering | |
CN107948631A (en) | It is a kind of based on cluster and the bore hole 3D systems that render | |
JP2018157331A (en) | Program, recording medium, image generating apparatus, image generation method | |
KR20120060548A (en) | System for 3D based marker | |
JP6166985B2 (en) | Image generating apparatus and image generating program | |
KR20120093693A (en) | Stereoscopic 3d display device and method of driving the same | |
US10567744B1 (en) | Camera-based display method and system for simulators | |
CN111699460A (en) | Multi-view virtual reality user interface | |
US20060152580A1 (en) | Auto-stereoscopic volumetric imaging system and method | |
Wei et al. | Color anaglyphs for panorama visualizations | |
CN113382225B (en) | Binocular holographic display method and device based on holographic sand table | |
JP2019197368A (en) | Stereoscopic motion image depth compression device and stereoscopic motion image depth compression program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08738232 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase in: |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS EPO FORM 1205A DATED 20.01.2010. |
122 | Ep: pct application non-entry in european phase |
Ref document number: 08738232 Country of ref document: EP Kind code of ref document: A1 |