EP3535643A1 - Spatial augmented reality method, device and system - Google Patents
Info
- Publication number
- EP3535643A1 (application EP16794972.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- spatial
- object surface
- detecting
- user interface
- input device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
Definitions
- the present invention relates to a method, a device and a system for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface.
- known spatial augmented reality methods, devices and systems use three-dimensional objects from the environment of a user as output modalities, whereas input is carried out via displays, touch displays, or mobile and/or other suitable devices.
- this object is achieved by a method for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface, the method comprising: inputting into an electronic control unit model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system of the object; detecting, by means of a tracking system, the spatial position and orientation of the physical object relative to a coordinate system of the spatial user interface, wherein the tracking system comprises a plurality of cameras and the electronic control unit connected to the cameras; calculating second spatial coordinates of the part of the object surface relative to the coordinate system of the spatial user interface; detecting, by means of the tracking system, spatial coordinates of a hand held input device manageable by the user relative to the coordinate system of the spatial user interface; and detecting a selection of the part of the object surface by detecting that a spatial distance between the input device and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device.
- the predetermined value of the spatial distance between the input device and the part of the object surface based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device may be set such that a selection of a part of the object surface only occurs when the tip of the hand held input device touches or almost touches the object surface.
- the spatial position and orientation of the physical object relative to the coordinate system of the spatial user interface could be predetermined by means of a holding device holding the three-dimensional physical object and thereby defining the exact spatial position and orientation of the physical object.
- the three-dimensional objects may be any objects with at least two surfaces arranged at an angle to each other, such as the model of a car, a house, or the like, so that the different surfaces of these objects can be selected by the user via the hand held input device.
- the method may further comprise illuminating the object by means of at least one illumination device based on the result of the step of detecting a selection.
- the illumination device may illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface such that any selected part of the surface of the three-dimensional object may be highlighted by the illumination device.
- the illumination device may also change the illumination of the selected part of the object surface upon detecting its selection.
- the method may further comprise the steps of selecting an attribute to be associated with a selected part of the object surface, and associating the attribute to a selected part of the object surface when detecting the selection of the part of the object surface by the input device.
- two or more hand held input devices may be provided, whereby at least one specific function, such as a color, a pattern or the like, may be associated with each input device.
- the method may further comprise processing a data set containing a plurality of parts of the object surface and a plurality of attributes respectively associated to each of the parts of the object surface.
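To make such a data set concrete, a minimal sketch in Python follows. The part names and attribute fields are purely illustrative assumptions, since the description does not specify a storage format:

```python
# Hypothetical data set: each part of the object surface (identified here
# by an illustrative part name) maps to the attributes the user has
# associated with it via the input device.
car_configuration = {
    "hood":  {"color": "red",   "pattern": "plain"},
    "roof":  {"color": "black", "pattern": "carbon"},
    "doors": {"color": "red",   "pattern": "plain"},
}

def associate(dataset, part, attribute, value):
    """Associate an attribute with a selected part of the object surface."""
    dataset.setdefault(part, {})[attribute] = value

# Example: the user selects the hood with a pen carrying the "stripes" function.
associate(car_configuration, "hood", "pattern", "stripes")
```

Processing the whole data set then amounts to iterating over the parts and rendering each attribute, e.g. as a projected color per surface.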
- in this way, a digital reproduction of the three-dimensional physical object can be created, and the model data of this reproduction may be overlaid with the physical objects, for example models of cars, houses or the like, which can thus be digitally configured as desired by the user with different colors, patterns or the like on the object surfaces.
- the object of the present invention is also achieved by a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface, the device comprising a tracking system with a plurality of cameras, an electronic control unit connected to the cameras, and a hand held input device manageable by the user.
- the electronic control unit stores model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system of the object, wherein the tracking system is adapted to detect the spatial position and orientation of the physical object relative to a coordinate system of the spatial user interface, wherein the electronic control unit is adapted to calculate second spatial coordinates of the part of the object surface relative to the coordinate system of the spatial user interface, and wherein the tracking system is further adapted to detect spatial coordinates of the hand held input device relative to the coordinate system of the spatial user interface.
- the object carrier is a suitable medium for carrying the objects, which additionally eases the use of the spatial augmented reality device described herein.
- the device may further comprise a highlighting device, such as an illumination device adapted to illuminate the object based on the result of the detection of a selection.
- the illumination device may be a commercially available projector which can be calibrated such that a precisely overlaid projection of the created graphic and the physical object can be obtained.
- the highlighting device may be an augmented reality/virtual reality device, such as augmented reality/virtual reality glasses, which is adapted to highlight and/or illustrate the object based on the result of the detection of a selection.
- the highlighting device may be adapted to highlight or illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface.
- the highlighting device may also be adapted to highlight or change the illumination of the selected part of the object surface upon detecting its selection. In this way, any selected part of the surface of the three-dimensional object may be highlighted by the highlighting device, and a great variety of different colors, patterns and the like can be projected onto the surfaces of the three-dimensional objects. A digital reproduction of the three-dimensional physical object can thereby be created, and the model data of this reproduction may be overlaid with the physical objects, for example models of cars, houses or the like, which can thus be digitally configured as desired by the user with different colors, patterns or the like on the object surfaces.
- the cameras of the tracking system may operate in the visible light range or in the infrared light range, and the tracking system may detect the spatial position and orientation of the object by means of standard methods of three dimensional object recognition, such as triangulation, edge detection, grayscale/gradient matching or other appearance-based or feature-based methods.
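As an illustration of one of the standard methods named above, the following Python sketch shows linear (DLT) triangulation of a single tracked point seen by two calibrated cameras. The projection matrices and pixel observations are assumed inputs, and this is only one of several reconstruction techniques the description allows:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point from two camera views.

    P1, P2: 3x4 camera projection matrices (assumed already calibrated).
    uv1, uv2: pixel coordinates of the point in each camera image.
    Returns the point position in the common 3D coordinate system.
    """
    # Each observation (u, v) contributes two linear constraints on the
    # homogeneous point X: u * (P[2] @ X) = P[0] @ X, and likewise for v.
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest
    # singular value, dehomogenized.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With more than two cameras, additional rows are stacked into `A` in the same way, which improves robustness against occlusion of individual markers.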
- the hand held input device and/or the at least one three-dimensional physical object may be provided with infrared markers and the plurality of cameras may be infrared cameras, wherein preferably the device may further comprise an infrared illumination device.
- the tracking system may detect the positions and orientations of these markers and may thereby determine the exact position of the selected part of the object surface from the positions and orientations of the infrared markers relative to the coordinate system of the spatial augmented user interface.
- certain basic parameters with respect to the arrangement of the infrared markers on these objects have to be considered.
- the above-mentioned cameras also have to be arranged in a certain manner, preferably at the corners in an upper portion of the object carrier, since observing the same scene within the spatial augmented user interface from various different views, in consideration of the camera resolutions and aperture angles, is essential for an accurate detection of the infrared markers and therewith of the objects themselves.
- the hand held input device and the at least one three-dimensional physical object may be provided with any other means suitable to be detected by the tracking system.
- the device may further comprise at least one board as an additional display of an interaction menu.
- the illumination device may project additional menu selections onto such board, preferably a white board.
- white surfaces provided at the table board of the object carrier for receiving the one or more physical objects may serve as a projection surface for the additional interaction menu.
- additional menu selections may also be projected or illustrated onto/at, for example, the tabletop of the object carrier or on any other surface within the three-dimensional spatial user interface suitable for an illustration of an additional menu interface.
- the object of the present invention is achieved by a spatial augmented reality system comprising the device of the second aspect of the present invention, and a physical object with at least two surfaces arranged at an angle to each other.
- the physical object may be a 3D-printed model.
- a 3D-printed model can be produced quite fast and with relatively low expenditure of materials and costs.
- the system may operate the method of the first aspect of the present invention.
- figure 1 is a perspective view of a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention and a spatial augmented reality system according to the invention
- figure 2 is an enlarged view of a middle section of the device for
- figure 3 is a flowchart of a method for detecting an interaction
- Figure 1 is a perspective view of a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention and a spatial augmented reality system according to the invention and shows the arrangement and structure of a spatial augmented reality device 10 and a spatial augmented reality system, respectively.
- An object carrier 12 may be provided which comprises a table board 14 for receiving one or more three-dimensional physical objects 24, 26, which may be placed on the table board 14, and a second table board 16 on which an electronic control unit (ECU) 30 may be placed.
- Physical object 24 may be a model of a car 24 and physical object 26 may be a model of a house 26 in the embodiment described herein.
- Physical objects 24, 26 could also be any other physical objects with at least two surfaces arranged at an angle to each other.
- the object carrier 12 comprises a plurality of frames 18 whose bottom ends may be designed as stands 17 serving as the connection of the object carrier 12 to the ground at its bottom portion.
- the frames 18 may form a connection between the table board 14 for receiving amongst other things the one or more three-dimensional physical object/objects 24, 26 and the second table board 16 for receiving e.g. the electronic control unit 30.
- a plurality of cameras 32a, 32b, 32c and 32d which, respectively, may be provided at the upper corners of the object carrier 12 and preferably an illumination device 34, which may be provided at an upper middle area of the object carrier 12, may be held by the object carrier 12 via crossbars 19.
- the object carrier 12 constitutes a three-dimensional spatial user interface 100.
- figure 1 shows a hand held input device 28 manageable by a user as well as a board 40 serving as an additional display 40 of an interaction menu.
- Figure 2 is an enlarged view of a middle section of the device 10 of figure 1 and additionally shows the coordinate systems x, y, z; x1, y1, z1; x2, y2, z2 of the spatial user interface 100 (coordinate system x, y, z), of physical object 24 (coordinate system x1, y1, z1), which may be a model of a car in the described embodiment, and of physical object 26 (coordinate system x2, y2, z2), which may be a model of a house in the described embodiment.
- Figure 3 is a flowchart of a method for detecting an interaction between a user and a three-dimensional physical object 24, 26 in a three-dimensional spatial user interface 100.
- model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to the coordinate system(s) x1, y1, z1; x2, y2, z2 of the objects 24, 26 may be inputted into the electronic control unit 30 and thereby stored in a storage 36 of the electronic control unit (ECU) 30.
- the model object data of the three-dimensional physical objects 24, 26 may be stored as CAD data in the storage 36 of the ECU 30.
- the electronic control unit and the plurality of cameras 32a, 32b, 32c and 32d together may constitute a tracking system which can detect the spatial position and orientation of the one or more physical object/objects 24, 26 relative to the coordinate system x, y, z of the spatial user interface 100.
- Tracking and/or detection of the one or more physical object/objects 24, 26 may be achieved by means of infrared markers, in particular passive infrared markers, arranged at the object/objects 24, 26.
- second spatial coordinates of the part of the object surface relative to the coordinate system x, y, z of the spatial user interface 100 may be calculated via coordinate transformation.
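A minimal sketch of such a coordinate transformation, assuming the tracking system reports the object pose as a rotation matrix R and translation vector t (the function name and pose representation are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def to_interface_coordinates(first_coords, R, t):
    """Transform 'first' spatial coordinates (points of the object-surface
    model in the object frame, e.g. x1, y1, z1) into 'second' spatial
    coordinates in the interface frame x, y, z, using the detected pose
    (rotation R, translation t) of the physical object."""
    first_coords = np.asarray(first_coords, dtype=float)
    # Row vectors: p_interface = R @ p_object + t for each point.
    return first_coords @ R.T + t
```

The same transformation applied per tracked object (24 and 26) brings all model surfaces into the single coordinate system x, y, z in which the pen position is also expressed.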
- Spatial coordinates of the hand held input device 28 manageable by the user relative to the coordinate system x, y, z of the spatial user interface 100 may be detected by means of the tracking system.
- tracking and/or detection of the hand held input device 28 may be achieved by means of infrared markers, in particular passive infrared markers, arranged at the input device 28.
- generally, one infrared marker arranged at the tip of the input device 28 would be sufficient, but a better and more precise detection can be carried out using at least two infrared markers, in particular passive infrared markers, arranged at different locations of the input device 28, wherein preferably at least one infrared marker is arranged at the tip of the input device.
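Under a two-marker arrangement, the tip position can be refined by extrapolating along the shaft axis. The sketch below assumes the rear marker lies on the shaft axis behind the tip marker; the offset value is an illustrative calibration constant, not a value from the patent:

```python
import numpy as np

def pen_tip_position(tip_marker, rear_marker, tip_offset=0.0):
    """Estimate the physical pen-tip position from two tracked markers.

    With an infrared marker directly at the tip, tip_offset is 0. If the
    tip marker sits slightly behind the physical tip, extrapolate along
    the shaft axis by tip_offset metres (an assumed calibration value).
    """
    tip_marker = np.asarray(tip_marker, dtype=float)
    rear_marker = np.asarray(rear_marker, dtype=float)
    axis = tip_marker - rear_marker
    axis /= np.linalg.norm(axis)  # unit vector pointing from rear to tip
    return tip_marker + tip_offset * axis
```

The second marker also disambiguates the pen's orientation, which a single tip marker alone cannot provide.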
- the ECU 30 may then calculate the spatial distance between the input device 28 and a part of the object surface of the physical objects 24, 26 to be selected. Based on this calculated distance, the ECU 30 may determine whether the distance is smaller than a predetermined threshold value, and thus whether a part of the object surface of the physical objects 24, 26 is selected, wherein the predetermined threshold value may be e.g. 1 cm, preferably 1 mm, and most preferably substantially equal to 0.
- a selection of the part of the object surface can be detected by detecting that a spatial distance between the input device 28 and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device 28.
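This selection test reduces to a nearest-distance check. A sketch in Python, assuming the surface part is represented by sampled points already transformed into the interface frame (the 1 mm default threshold is one of the values named in the description; the function name is illustrative):

```python
import numpy as np

def is_part_selected(surface_points, pen_tip, threshold=0.001):
    """Return True when the distance between the pen tip and the nearest
    sampled point of the surface part is below the threshold (metres)."""
    surface_points = np.asarray(surface_points, dtype=float)
    pen_tip = np.asarray(pen_tip, dtype=float)
    # Distance from the tip to every sampled surface point; the minimum
    # approximates the point-to-surface distance for a dense sampling.
    nearest = np.linalg.norm(surface_points - pen_tip, axis=1).min()
    return bool(nearest < threshold)
```

With a threshold of substantially 0, selection in practice requires the tip to touch the surface; larger thresholds allow hovering just above it.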
- the one or more object/objects 24, 26 may then be illuminated by means of at least one illumination device 34 arranged at an upper middle area of the object carrier 12 and held by the object carrier 12 via crossbars 19 based on the result of the step of detecting a selection.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2016/076714 WO2018082782A1 (en) | 2016-11-04 | 2016-11-04 | Spatial augmented reality method, device and system |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3535643A1 true EP3535643A1 (en) | 2019-09-11 |
Family
ID=57288379
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16794972.6A Withdrawn EP3535643A1 (en) | 2016-11-04 | 2016-11-04 | Spatial augmented reality method, device and system |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3535643A1 (en) |
WO (1) | WO2018082782A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11253181B2 (en) * | 2018-08-03 | 2022-02-22 | From Zero, LLC | Method for objectively tracking and analyzing the social and emotional activity of a patient |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10306193B2 (en) * | 2015-04-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Trigger zones for objects in projected surface model |
- 2016
- 2016-11-04: WO application PCT/EP2016/076714, published as WO2018082782A1 (active; Search and Examination)
- 2016-11-04: EP application EP16794972.6A, published as EP3535643A1 (not active; Withdrawn)
Also Published As
Publication number | Publication date |
---|---|
WO2018082782A1 (en) | 2018-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10217288B2 (en) | Method for representing points of interest in a view of a real environment on a mobile device and mobile device therefor | |
US9430698B2 (en) | Information input apparatus, information input method, and computer program | |
TWI450132B (en) | A portrait recognition device, an operation judgment method, and a computer program | |
US10762386B2 (en) | Method of determining a similarity transformation between first and second coordinates of 3D features | |
US9256288B2 (en) | Apparatus and method for selecting item using movement of object | |
US20090226113A1 (en) | Environment Map Generating Apparatus, Environment Map Generating Method, and Environment Map Generating Program | |
US10853966B2 (en) | Virtual space moving apparatus and method | |
KR20020086931A (en) | Single camera system for gesture-based input and target indication | |
US20210255328A1 (en) | Methods and systems of a handheld spatially aware mixed-reality projection platform | |
JP2011065202A (en) | Autonomous mobile device | |
KR100971667B1 (en) | Apparatus and method for providing realistic contents through augmented book | |
US20210287330A1 (en) | Information processing system, method of information processing, and program | |
WO2018082782A1 (en) | Spatial augmented reality method, device and system | |
EP3088991B1 (en) | Wearable device and method for enabling user interaction | |
Piérard et al. | I-see-3d! an interactive and immersive system that dynamically adapts 2d projections to the location of a user's eyes | |
WO2020084192A1 (en) | Method, arrangement, and computer program product for three-dimensional visualization of augmented reality and virtual reality environments | |
US20210343040A1 (en) | Object tracking | |
WO2023194612A1 (en) | Calibration device and method for an electronic display screen for touchless gesture control |
Legal Events
Date | Code | Title | Description
---|---|---|---
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE
20190408 | 17P | Request for examination filed | Effective date: 20190408
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| AX | Request for extension of the european patent | Extension state: BA ME
| DAV | Request for validation of the european patent (deleted) |
| DAX | Request for extension of the european patent (deleted) |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS
20210219 | 17Q | First examination report despatched | Effective date: 20210219
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN
20210702 | 18D | Application deemed to be withdrawn | Effective date: 20210702