EP3535643A1 - Spatial augmented reality method, device and system - Google Patents

Spatial augmented reality method, device and system

Info

Publication number
EP3535643A1
EP3535643A1 · Application EP16794972.6A
Authority
EP
European Patent Office
Prior art keywords
spatial
object surface
detecting
user interface
input device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP16794972.6A
Other languages
German (de)
French (fr)
Inventor
Hendrik WALZEL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fortiss GmbH
Original Assignee
Fortiss GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fortiss GmbH
Publication of EP3535643A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Definitions

  • the present invention relates to a method, a device and a system for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface.
  • spatial augmented reality methods, devices and systems are known, wherein three-dimensional objects from the environment of a user serve as output modalities, whereas an input is carried out via displays, touch-displays or mobile and/or other suitable devices.
  • this object is achieved by a method for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface, the method comprising inputting into an electronic control unit model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system of the object, detecting, by means of a tracking system, the spatial position and orientation of the physical object relative to a coordinate system of the spatial user interface, wherein the tracking system comprises a plurality of cameras and the electronic control unit which is connected to the cameras, calculating second spatial coordinates of the part of the object surface relative to the coordinate system of the spatial user interface, detecting, by means of the tracking system, spatial coordinates of a hand held input device manageable by the user relative to the coordinate system of the spatial user interface, and detecting a selection of the part of the object surface by detecting that a spatial distance between the input device and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device.
  • the predetermined value of the spatial distance between the input device and the part of the object surface based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device may be set such that a selection of a part of the object surface only occurs when the tip of the hand held input device touches or almost touches the object surface.
  • the spatial position and orientation of the physical object relative to the coordinate system of the spatial user interface could be predetermined by means of a holding device holding the three-dimensional physical object and thereby defining the exact spatial position and orientation of the physical object.
  • the three-dimensional objects may be any objects with at least two surfaces arranged at an angle to each other such as the model of a car, a house, or the like such that the different surfaces of these objects can be selected by the user via the hand held input device.
  • the method may further comprise illuminating the object by means of at least one illumination device based on the result of the step of detecting a selection.
  • the illumination device may illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface such that any selected part of the surface of the three-dimensional object may be highlighted by the illumination device.
  • the illumination device may also change the illumination of the selected part of the object surface upon detecting its selection.
  • the method may further comprise the steps of selecting an attribute to be associated with a selected part of the object surface, and associating the attribute to a selected part of the object surface when detecting the selection of the part of the object surface by the input device.
  • two or more hand held input devices may be provided, whereby at least one specific function, such as color, pattern or the like, may be associated to each input device, respectively.
  • the method may further comprise processing a data set containing a plurality of parts of the object surface and a plurality of attributes respectively associated to each of the parts of the object surface.
  • a digital reproduction of the three-dimensional physical object can be created and the object model data of this reproduction of the physical objects, for example models of cars, houses or the like, may be overlaid with the physical objects and can thus be digitally configured as desired by the user with different colors, patterns or the like of the object surfaces.
  • the object of the present invention is achieved by a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface, the device comprising an object carrier adapted to carry the physical object, a tracking system comprising a plurality of cameras and an electronic control unit which is connected to the cameras, and a hand held input device manageable by the user.
  • the electronic control unit stores model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system of the object, wherein the tracking system is adapted to detect the spatial position and orientation of the physical object relative to a coordinate system of the spatial user interface, wherein the electronic control unit is adapted to calculate second spatial coordinates of the part of the object surface relative to the coordinate system of the spatial user interface, and wherein the tracking system is further adapted to detect spatial coordinates of the hand held input device manageable by the user relative to the coordinate system of the spatial user interface, and to detect a selection of the part of the object surface.
  • the predetermined value of the spatial distance between the input device and the part of the object surface based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device may be set such that a selection of a part of the object surface only occurs when the tip of the hand held input device touches or almost touches the object surface.
  • the object carrier is a suitable medium for carrying the objects which additionally eases the use of the spatial augmented reality device of the present invention described herein.
  • the device may further comprise a highlighting device, such as an illumination device adapted to illuminate the object based on the result of the detection of a selection.
  • the illumination device may be a commercially available projector which can be calibrated such that a precisely overlaid projection of the created graphic and the physical object can be obtained.
  • the highlighting device may be an augmented reality/virtual reality device, such as augmented reality/virtual reality glasses, which is adapted to highlight and/or illustrate the object based on the result of the detection of a selection.
  • the highlighting device may be adapted to highlight or illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface.
  • the highlighting device may also be adapted to highlight or change the illumination of the selected part of the object surface upon detecting its selection. In this way, any selected part of the surface of the three-dimensional object may be highlighted by the highlighting device and a great variety of different colors, patterns and the like can be projected onto the surfaces of the three-dimensional objects and therewith, a digital reproduction of the three-dimensional physical object can be created and the object model data of this reproduction of the physical objects, for example models of cars, houses or the like, may be overlaid with the physical objects and can thus be digitally configured as desired by the user with different colors, patterns or the like of the object surfaces.
  • the cameras of the tracking system may operate in the visible light range or in the infrared light range, and the tracking system may detect the spatial position and orientation of the object by means of standard methods of three-dimensional object recognition, such as triangulation, edge detection, grayscale/gradient matching or other appearance-based or feature-based methods.
  • the hand held input device and/or the at least one three-dimensional physical object may be provided with infrared markers and the plurality of cameras may be infrared cameras, wherein preferably the device may further comprise an infrared illumination device.
  • the tracking system may detect the positions and orientations of these markers and may thereby determine the exact position of the selected part of the object surface through these positions and orientations of the infrared markers relative to the coordinate system of the spatial augmented user interface.
  • certain basic parameters with respect to the arrangement of the infrared markers on these objects have to be considered.
  • the above-mentioned cameras also have to be arranged in a certain manner, preferably at the corners in an upper portion of the object carrier, since an observation of the same scene within the spatial augmented user interface from various different views in consideration of the camera resolutions and aperture angles is essential for an accurate detection of the infrared markers and therewith of the objects themselves.
  • the hand held input device and the at least one three-dimensional physical object may be provided with any other means suitable to be detected by the tracking system.
  • the device may further comprise at least one board as an additional display of an interaction menu.
  • the illumination device may project additional menu selections onto such board, preferably a white board.
  • white surfaces provided at a table board for receiving the one or more physical object/objects of the object carrier may serve as projection surface for the additional interaction menu.
  • additional menu selections may also be projected or illustrated onto/at, for example, the tabletop of the object carrier or on any other surface within the three-dimensional spatial user interface suitable for an illustration of an additional menu interface.
  • the object of the present invention is achieved by a spatial augmented reality system comprising the device of the second aspect of the present invention, and a physical object with at least two surfaces arranged at an angle to each other.
  • the physical object may be a 3D-printed model.
  • a 3D-printed model can be produced quite fast and with relatively low expenditure of materials and costs.
  • the system may operate the method of the first aspect of the present invention.
  • figure 1 is a perspective view of a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention and a spatial augmented reality system according to the invention
  • figure 2 is an enlarged view of a middle section of the device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention
  • figure 3 is a flowchart of a method for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface according to the invention
  • Figure 1 is a perspective view of a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention and a spatial augmented reality system according to the invention and shows the arrangement and structure of a spatial augmented reality device 10 and a spatial augmented reality system, respectively.
  • An object carrier 12 which may comprise a table board 14 for receiving one or more three-dimensional physical object/objects 24, 26 which may be placed on the table board 14 and an electronic control unit (ECU) 30 which may be placed on a second table board 16 may be provided.
  • Physical object 24 may be a model of a car 24 and physical object 26 may be a model of a house 26 in the embodiment described herein.
  • Physical objects 24, 26 could also be any other physical objects with at least two surfaces arranged at an angle to each other.
  • the object carrier 12 comprises a plurality of frames 18 whose bottom ends 17 may be designed as stands 17 of the object carrier 12 serving as a connection to the ground at its bottom portion.
  • the frames 18 may form a connection between the table board 14 for receiving amongst other things the one or more three-dimensional physical object/objects 24, 26 and the second table board 16 for receiving e.g. the electronic control unit 30.
  • a plurality of cameras 32a, 32b, 32c and 32d which, respectively, may be provided at the upper corners of the object carrier 12 and preferably an illumination device 34, which may be provided at an upper middle area of the object carrier 12, may be held by the object carrier 12 via crossbars 19.
  • the object carrier 12 constitutes a three-dimensional spatial user interface 100.
  • figure 1 shows a hand held input device 28 manageable by a user as well as a board 40 serving as an additional display 40 of an interaction menu.
  • Figure 2 is an enlarged view of a middle section of the device 10 of figure 1 and additionally shows coordinate systems x, y, z; x1, y1, z1; x2, y2, z2 of the spatial user interface 100 (coordinate system x, y, z), of physical object 24 (coordinate system x1, y1, z1) which may be a model of a car in the described embodiment, and of physical object 26 (coordinate system x2, y2, z2) which may be a model of a house in the described embodiment.
  • Figure 3 is a flowchart of a method for detecting an interaction between a user and a three-dimensional physical object 24, 26 in a three-dimensional spatial user interface 100.
  • model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to the coordinate system/systems x1, y1, z1; x2, y2, z2 of the object/objects 24, 26 may be inputted into the electronic control unit 30 and thereby may be stored in a storage 36 of the electronic control unit (ECU) 30.
  • the model object data of the three-dimensional physical objects 24, 26 may be stored as CAD data in the storage 36 of the ECU 30.
  • the electronic control unit and the plurality of cameras 32a, 32b, 32c and 32d together may constitute a tracking system which can detect the spatial position and orientation of the one or more physical object/objects 24, 26 relative to the coordinate system x, y, z of the spatial user interface 100.
  • Tracking and/or detection of the one or more physical object/objects 24, 26 may be achieved by means of infrared markers, in particular passive infrared markers, arranged at the object/objects 24, 26.
  • second spatial coordinates of the part of the object surface relative to the coordinate system x, y, z of the spatial user interface 100 may be calculated via coordinate transformation.
  • Spatial coordinates of the hand held input device 28 manageable by the user relative to the coordinate system x, y, z of the spatial user interface 100 may be detected by means of the tracking system.
  • tracking and/or detection of the hand held input device 28 may be achieved by means of infrared markers, in particular passive infrared markers, arranged at the input device 28.
  • thereby, generally one infrared marker arranged at the tip of the input device 28 would be sufficient, but a better and more precise detection can be carried out using at least two infrared markers arranged at different locations of the input device 28, wherein preferably at least one infrared marker has to be arranged at the tip of the input device.
  • the ECU 30 may then calculate the spatial distance between the input device 28 and a part of the object surface of the physical object/objects 24, 26 to be selected. Based on the calculation of the distance between the input device 28 and the part of the object surface of the physical object/objects 24, 26 to be selected the ECU 30 may determine if the calculated distance is smaller than a predetermined threshold value or not and thus if a part of the object surface of the physical object/objects 24, 26 is selected or not, wherein the predetermined threshold value may be e.g. 1 cm, preferably 1 mm and most preferably substantially equal to 0.
  • a selection of the part of the object surface can be detected by detecting that a spatial distance between the input device 28 and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device 28.
  • the one or more object/objects 24, 26 may then be illuminated by means of at least one illumination device 34 arranged at an upper middle area of the object carrier 12 and held by the object carrier 12 via crossbars 19 based on the result of the step of detecting a selection.
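The calculation of the second spatial coordinates described in the steps above amounts to a rigid transformation with the object pose reported by the tracking system. A minimal sketch, assuming the pose is available as a rotation matrix and translation vector (all names are hypothetical):

```python
import numpy as np

def second_coordinates(first_coords, R, t):
    """Map first spatial coordinates (model points relative to the
    object's coordinate system) to second spatial coordinates
    (relative to the spatial user interface's coordinate system),
    using the detected pose: rotation matrix R and translation t."""
    return (R @ np.asarray(first_coords).T).T + t

# Example: object translated by 0.3 m along x, no rotation.
R = np.eye(3)
t = np.array([0.3, 0.0, 0.0])
model_points = np.array([[0.0, 0.0, 0.0],
                         [0.0, 0.2, 0.1]])   # in object coordinates
ui_points = second_coordinates(model_points, R, t)
assert np.allclose(ui_points, [[0.3, 0.0, 0.0], [0.3, 0.2, 0.1]])
```

The same transformation would be applied per object, since each physical object 24, 26 carries its own coordinate system.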

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Method for detecting an interaction between a user and a three-dimensional physical object (24, 26) in a three-dimensional spatial user interface (100), the method comprising inputting into an electronic control unit (30) model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system (x1, y1, z1; x2, y2, z2) of the object (24, 26), detecting, by means of a tracking system, the spatial position and orientation of the physical object (24, 26) relative to a coordinate system (x, y, z) of the spatial user interface (100), wherein the tracking system comprises a plurality of cameras (32a, 32b, 32c, 32d) and the electronic control unit (30) which is connected to the cameras (32a, 32b, 32c, 32d), calculating second spatial coordinates of the part of the object surface relative to the coordinate system (x, y, z) of the spatial user interface (100), detecting, by means of the tracking system, spatial coordinates of a hand held input device (28) manageable by the user relative to the coordinate system (x, y, z) of the spatial user interface (100), and detecting a selection of the part of the object surface by detecting that a spatial distance between the input device (28) and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device (28).

Description

Spatial augmented reality method, device and system
Description
The present invention relates to a method, a device and a system for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface.
During the last decades, interaction of a user with a digital system was characterized by the use of keyboards and mice as input modalities and two-dimensional displays as output modalities. A further development was to provide a more intuitive and productive interaction between a user and a digital system by the use of touch-sensitive displays. Currently, development in this field tends toward three-dimensional interaction concepts.
In this context, spatial augmented reality methods, devices and systems are known, wherein three-dimensional objects from the environment of a user serve as output modalities, whereas an input is carried out via displays, touch-displays or mobile and/or other suitable devices.
Thereby, inputting data regarding a physical object into the user interface could occur even more intuitively, since a device that is not related to the physical object still has to be used as input modality.
Under the above-mentioned circumstances it was an object of the present invention to provide a spatial augmented reality method, device and system providing a simple and in particular highly intuitive use of an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface.
According to a first aspect of the present invention, this object is achieved by a method for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface, the method comprising inputting into an electronic control unit model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system of the object, detecting, by means of a tracking system, the spatial position and orientation of the physical object relative to a coordinate system of the spatial user interface, wherein the tracking system comprises a plurality of cameras and the electronic control unit which is connected to the cameras, calculating second spatial coordinates of the part of the object surface relative to the coordinate system of the spatial user interface, detecting, by means of the tracking system, spatial coordinates of a hand held input device manageable by the user relative to the coordinate system of the spatial user interface, and detecting a selection of the part of the object surface by detecting that a spatial distance between the input device and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device.
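The selection-detection step of this method can be sketched as follows. This is a minimal illustration under the assumption that the relevant part of the object surface is represented by sampled points already expressed in the interface's coordinate system; all names are hypothetical:

```python
import numpy as np

def detect_selection(surface_points, pen_tip, threshold=0.001):
    """Return the index of the selected surface point, or None.

    surface_points: (N, 3) second spatial coordinates of candidate
        surface parts in the spatial user interface's frame.
    pen_tip: (3,) tracked coordinates of the hand held input device.
    threshold: predetermined value, here 1 mm, so a selection only
        occurs when the tip touches or almost touches the surface.
    """
    d = np.linalg.norm(surface_points - pen_tip, axis=1)
    i = int(np.argmin(d))
    return i if d[i] < threshold else None

surface = np.array([[0.0, 0.0, 0.0],
                    [0.1, 0.0, 0.0],
                    [0.2, 0.0, 0.0]])
# Tip 0.5 mm from the second point: selected.
assert detect_selection(surface, np.array([0.1, 0.0005, 0.0])) == 1
# Tip far from every point: no selection.
assert detect_selection(surface, np.array([0.5, 0.5, 0.5])) is None
```

A denser surface sampling, or a point-to-mesh distance, would refine the same threshold test without changing its structure.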
Thereby, it is made possible for a user to intuitively select an object surface by directly pointing at it within the three-dimensional spatial user interface and not via displays, touch-displays, or mobile and/or other suitable devices. Thus, the interaction between the user and the three-dimensional physical object can be carried out in a three-dimensional context in a highly intuitive manner for the user operating the interaction.
The predetermined value of the spatial distance between the input device and the part of the object surface based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device may be set such that a selection of a part of the object surface only occurs when the tip of the hand held input device touches or almost touches the object surface. As an alternative to the above-mentioned tracking system, the spatial position and orientation of the physical object relative to the coordinate system of the spatial user interface could be predetermined by means of a holding device holding the three-dimensional physical object and thereby defining the exact spatial position and orientation of the physical object.
The three-dimensional objects may be any objects with at least two surfaces arranged at an angle to each other such as the model of a car, a house, or the like such that the different surfaces of these objects can be selected by the user via the hand held input device.
Preferably, the method may further comprise illuminating the object by means of at least one illumination device based on the result of the step of detecting a selection. Furthermore, the illumination device may illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface such that any selected part of the surface of the three-dimensional object may be highlighted by the illumination device.
Preferably, the illumination device may also change the illumination of the selected part of the object surface upon detecting its selection. The method may further comprise the steps of selecting an attribute to be associated with a selected part of the object surface, and associating the attribute to a selected part of the object surface when detecting the selection of the part of the object surface by the input device. In this way, a great variety of different colors, patterns and the like can be projected onto the surfaces of the three-dimensional objects. To achieve an even greater variety of different functions for illustrating on the physical object in a simple and effective manner, preferably, two or more hand held input devices may be provided, whereby at least one specific function, such as color, pattern or the like, may be associated to each input device, respectively. Additionally, the method may further comprise processing a data set containing a plurality of parts of the object surface and a plurality of attributes respectively associated to each of the parts of the object surface. Therewith, a digital reproduction of the three-dimensional physical object can be created and the object model data of this reproduction of the physical objects, for example models of cars, houses or the like, may be overlaid with the physical objects and can thus be digitally configured as desired by the user with different colors, patterns or the like of the object surfaces.
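The attribute association just described can be sketched with plain data structures: each input device carries one function (such as a color), and a data set records the attribute of each surface part. All identifiers are hypothetical:

```python
# Each hand held input device is associated with one specific function.
devices = {"pen_red": {"color": "red"},
           "pen_blue": {"color": "blue"}}

# Data set: part of the object surface -> associated attributes.
surface_attributes = {}

def on_selection(part_id, device_id):
    """When the tracking system detects that an input device selected
    a part of the object surface, associate that device's attribute
    with the part (overwriting any earlier attribute)."""
    surface_attributes[part_id] = dict(devices[device_id])

on_selection("car_roof", "pen_red")
on_selection("car_door", "pen_blue")
on_selection("car_roof", "pen_blue")   # reconfigure the roof

assert surface_attributes == {"car_roof": {"color": "blue"},
                              "car_door": {"color": "blue"}}
```

The resulting data set is exactly what a digital reproduction of the configured object would be rendered from.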
According to a second aspect, the object of the present invention is achieved by a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface, the device comprising an object carrier adapted to carry the physical object, a tracking system comprising a plurality of cameras and an electronic control unit which is connected to the cameras, and a hand held input device manageable by the user, wherein the electronic control unit stores model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system of the object, wherein the tracking system is adapted to detect the spatial position and orientation of the physical object relative to a coordinate system of the spatial user interface, wherein the electronic control unit is adapted to calculate second spatial coordinates of the part of the object surface relative to the coordinate system of the spatial user interface, and wherein the tracking system is further adapted to detect spatial coordinates of the hand held input device manageable by the user relative to the coordinate system of the spatial user interface, and to detect a selection of the part of the object surface by detecting that a spatial distance between the input device and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device.
The predetermined value of the spatial distance between the input device and the part of the object surface based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device may be set such that a selection of a part of the object surface only occurs when the tip of the hand held input device touches or almost touches the object surface. Thus, it is made possible for a user to select an object surface highly intuitively by directly pointing at it within the three-dimensional spatial user interface and not via displays, touch-displays or mobile and/or other suitable devices. The object carrier is a suitable medium for carrying the objects which additionally eases the use of the spatial augmented reality device of the present invention described herein.
Preferably, the device may further comprise a highlighting device, such as an illumination device adapted to illuminate the object based on the result of the detection of a selection. The illumination device may be a commercially available projector which can be calibrated such that a precisely overlaid projection of the created graphic and the physical object can be obtained. As an alternative, the highlighting device may be an augmented reality/virtual reality device, such as augmented reality/virtual reality glasses, which is adapted to highlight and/or illustrate the object based on the result of the detection of a selection.
The highlighting device may be adapted to highlight or illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface. The highlighting device may also be adapted to highlight or change the illumination of the selected part of the object surface upon detecting its selection. In this way, any selected part of the surface of the three-dimensional object may be highlighted by the highlighting device and a great variety of different colors, patterns and the like can be projected onto the surfaces of the three-dimensional objects and therewith, a digital reproduction of the three-dimensional physical object can be created and the object model data of this reproduction of the physical objects, for example models of cars, houses or the like, may be overlaid with the physical objects and can thus be digitally configured as desired by the user with different colors, patterns or the like of the object surfaces.
The cameras of the tracking system may operate in the visible light range or in the infrared light range, and the tracking system may detect the spatial position and orientation of the object by means of standard methods of three-dimensional object recognition, such as triangulation, edge detection, grayscale/gradient matching or other appearance-based or feature-based methods. In a preferred embodiment of the present invention, the hand held input device and/or the at least one three-dimensional physical object may be provided with infrared markers and the plurality of cameras may be infrared cameras, wherein preferably the device may further comprise an infrared illumination device. The tracking system may detect the positions and orientations of these markers and may thereby determine the exact position of the selected part of the object surface through these positions and orientations of the infrared markers relative to the coordinate system of the spatial augmented user interface. To ensure an explicit detection of the respective objects, the three-dimensional physical objects and the hand held input device, respectively, certain basic parameters with respect to the arrangement of the infrared markers on these objects have to be considered. The above-mentioned cameras also have to be arranged in a certain manner, preferably at the corners in an upper portion of the object carrier, since an observation of the same scene within the spatial augmented user interface from various different views in consideration of the camera resolutions and aperture angles is essential for an accurate detection of the infrared markers and therewith of the objects themselves. Alternatively, the hand held input device and the at least one three-dimensional physical object may be provided with any other means suitable to be detected by the tracking system. Furthermore, the device may further comprise at least one board as an additional display of an interaction menu.
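The triangulation mentioned above can be sketched with the standard direct linear transform: each calibrated camera contributes two linear constraints on a marker's 3-D position, and the homogeneous system is solved by SVD. This is a simplified illustration with hypothetical toy projection matrices, not the patent's specific calibration:

```python
import numpy as np

def triangulate(P_list, uv_list):
    """Triangulate one marker from its pixel coordinates (u, v) in
    several calibrated cameras with 3x4 projection matrices P, by
    solving the homogeneous DLT system A X = 0."""
    A = []
    for P, (u, v) in zip(P_list, uv_list):
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    X = Vt[-1]                   # null-space vector (homogeneous)
    return X[:3] / X[3]          # dehomogenize

def project(P, X):
    """Project a 3-D point with projection matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: identity intrinsics, second camera shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, 0.1, 2.0])            # marker position
uvs = [project(P1, X_true), project(P2, X_true)]
assert np.allclose(triangulate([P1, P2], uvs), X_true)
```

With noisy detections, more than two cameras (as in the four-camera arrangement at the corners of the object carrier) overdetermine the system, and the SVD yields the least-squares marker position.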
The illumination device may project additional menu selections onto such a board, preferably a white board. White surfaces provided at a table board of the object carrier for receiving the one or more physical object/objects may also serve as a projection surface for the additional interaction menu. Additionally or alternatively, such additional menu selections may also be projected onto, for example, the tabletop of the object carrier or onto any other surface within the three-dimensional spatial user interface suitable for the illustration of an additional menu interface.
According to a third aspect, the object of the present invention is achieved by a spatial augmented reality system comprising the device of the second aspect of the present invention, and a physical object with at least two surfaces arranged at an angle to each other. Additionally, the physical object may be a 3D-printed model. Nowadays, a 3D-printed model can be produced quite fast and with relatively low expenditure of materials and costs.
Furthermore, almost every shape and form can be obtained.
By adjusting the visual output of the virtual data of the three-dimensional physical objects, realistic materials can be simulated at the object surfaces, even though the actual material of the physical objects differs.
The system may operate the method of the first aspect of the present invention.
A preferred embodiment of the present invention will now be described by way of example with reference to the accompanying drawings, in which figure 1 is a perspective view of a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention and a spatial augmented reality system according to the invention, figure 2 is an enlarged view of a middle section of the device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention, and figure 3 is a flowchart of a method for detecting an interaction between a user and a three-dimensional physical object in a three-dimensional spatial user interface.
Figure 1 is a perspective view of a device for detecting an interaction between a user and a three-dimensional physical object with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface according to the invention and a spatial augmented reality system according to the invention and shows the arrangement and structure of a spatial augmented reality device 10 and a spatial augmented reality system, respectively. An object carrier 12 may be provided which may comprise a table board 14 for receiving one or more three-dimensional physical object/objects 24, 26, which may be placed on the table board 14, and an electronic control unit (ECU) 30, which may be placed on a second table board 16. Physical object 24 may be a model of a car 24 and physical object 26 may be a model of a house 26 in the embodiment described herein. Physical objects 24, 26 could also be any other physical objects with at least two surfaces arranged at an angle to each other.
Furthermore, the object carrier 12 comprises a plurality of frames 18 whose bottom ends 17 may be designed as stands 17 of the object carrier 12 serving as a connection to the ground at its bottom portion. In addition, the frames 18 may form a connection between the table board 14 for receiving amongst other things the one or more three-dimensional physical object/objects 24, 26 and the second table board 16 for receiving e.g. the electronic control unit 30. At the upper portion of the object carrier 12, a plurality of cameras 32a, 32b, 32c and 32d which, respectively, may be provided at the upper corners of the object carrier 12 and preferably an illumination device 34, which may be provided at an upper middle area of the object carrier 12, may be held by the object carrier 12 via crossbars 19.
Likewise, the object carrier 12 constitutes a three-dimensional spatial user interface 100.
Furthermore, figure 1 shows a hand held input device 28 manageable by a user as well as a board 40 serving as an additional display 40 of an interaction menu.
Figure 2 is an enlarged view of a middle section of the device 10 of figure 1 and additionally shows coordinate systems x, y, z; x1, y1, z1; x2, y2, z2 of the spatial user interface 100 (coordinate system x, y, z), of physical object 24 (coordinate system x1, y1, z1), which may be a model of a car in the described embodiment, and of physical object 26 (coordinate system x2, y2, z2), which may be a model of a house in the described embodiment.
Figure 3 is a flowchart of a method for detecting an interaction between a user and a three-dimensional physical object 24, 26 in a three-dimensional spatial user interface 100.
It can be seen that model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to the coordinate system/systems x1, y1, z1; x2, y2, z2 of the object/objects 24, 26 may be inputted into the electronic control unit 30 and thereby may be stored in a storage 36 of the electronic control unit (ECU) 30. In particular, the model object data of the three-dimensional physical objects 24, 26 may be stored as CAD data in the storage 36 of the ECU 30.
The electronic control unit and the plurality of cameras 32a, 32b, 32c and 32d together may constitute a tracking system which can detect the spatial position and orientation of the one or more physical object/objects 24, 26 relative to the coordinate system x, y, z of the spatial user interface 100. Tracking and/or detection of the one or more physical object/objects 24, 26 may be achieved by means of infrared markers, in particular passive infrared markers, arranged at the object/objects 24, 26. Thereby, it is sufficient to provide at least two infrared markers at two different locations of each object 24, 26, respectively, such that the position and orientation of the three-dimensional physical object/objects 24, 26 can be detected by overlaying, within the ECU 30, the positions of the at least two infrared markers with the stored object model data, in particular CAD data.
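As an illustration only, one common way to recover an object pose from tracked marker positions is a least-squares rigid fit between the marker coordinates known from the model data and their tracked world positions. The Kabsch algorithm shown below, and the use of three or more non-collinear markers, are assumptions for this sketch; the text itself states that two markers overlaid with the CAD data may suffice and leaves the fitting method open:

```python
import numpy as np

def fit_pose(model_pts, world_pts):
    """Least-squares rigid transform (R, t) mapping marker coordinates
    given in the object model onto their tracked world positions
    (Kabsch algorithm). Both inputs are (N, 3) arrays, N >= 3,
    with the points non-collinear."""
    cm = model_pts.mean(axis=0)
    cw = world_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - cm).T @ (world_pts - cw)
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper (reflecting) solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = cw - R @ cm
    return R, t
```

The returned rotation and translation then describe the object coordinate system (e.g. x1, y1, z1) relative to the interface coordinate system x, y, z.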
Then, second spatial coordinates of the part of the object surface relative to the coordinate system x, y, z of the spatial user interface 100 may be calculated via coordinate transformation.
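The coordinate transformation named above can be sketched as follows; the function name and the representation of the pose as a rotation matrix `R` and translation vector `t` are assumptions for this illustration:

```python
import numpy as np

def to_interface_coords(first_coords, R, t):
    """Transform surface points from the object coordinate system
    (e.g. x1, y1, z1) into the interface coordinate system (x, y, z),
    given the tracked object pose (rotation R, translation t).

    first_coords : (N, 3) array of first spatial coordinates.
    Returns the second spatial coordinates as an (N, 3) array.
    """
    # p_world = R @ p_object + t, vectorized over all points.
    return np.asarray(first_coords) @ R.T + t
```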
Spatial coordinates of the hand held input device 28 manageable by the user relative to the coordinate system x, y, z of the spatial user interface 100 may be detected by means of the tracking system. In turn, tracking and/or detection of the hand held input device 28 may be achieved by means of infrared markers, in particular passive infrared markers, arranged at the input device 28. Generally, one infrared marker arranged at the tip of the input device 28 would be sufficient, but a better and more precise detection can be carried out using at least two infrared markers arranged at different locations of the input device 28, wherein preferably at least one infrared marker has to be arranged at the tip of the input device. With the detected spatial coordinates of both the one or more physical object/objects and the hand held input device, the ECU 30 may then calculate the spatial distance between the input device 28 and a part of the object surface of the physical object/objects 24, 26 to be selected. Based on this calculated distance, the ECU 30 may determine whether the distance is smaller than a predetermined threshold value and thus whether a part of the object surface of the physical object/objects 24, 26 is selected, wherein the predetermined threshold value may be e.g. 1 cm, preferably 1 mm and most preferably substantially equal to 0.
Thus, a selection of the part of the object surface can be detected by detecting that a spatial distance between the input device 28 and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device 28.
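The selection test described above amounts to a simple distance threshold. In the sketch below, the representation of the surface part as an array of sampled surface points and the default threshold of 1 cm (one of the values named in the text) are assumptions for illustration:

```python
import numpy as np

def is_selected(tip_xyz, surface_pts_xyz, threshold=0.01):
    """Return True when the tracked tip of the input device lies within
    `threshold` (in metres, default 1 cm) of any sample point of the
    surface part, i.e. when that part counts as selected.

    tip_xyz : (3,) coordinates of the input-device tip in the
              interface coordinate system x, y, z.
    surface_pts_xyz : (N, 3) second spatial coordinates of the part.
    """
    distances = np.linalg.norm(surface_pts_xyz - tip_xyz, axis=1)
    return bool(distances.min() < threshold)
```

Lowering the threshold towards 0 makes the interaction approach a literal touch of the physical surface with the pen tip.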
Based on the result of the step of detecting a selection, the one or more object/objects 24, 26 may then be illuminated by means of at least one illumination device 34 arranged at an upper middle area of the object carrier 12 and held by the object carrier 12 via crossbars 19.

Claims
1. Method for detecting an interaction between a user and a three-dimensional physical object (24, 26) in a three-dimensional spatial user interface (100), the method comprising:
• inputting into an electronic control unit (30) model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system (x1, y1, z1; x2, y2, z2) of the object (24, 26),
• detecting, by means of a tracking system, the spatial position and orientation of the physical object (24, 26) relative to a coordinate system (x, y, z) of the spatial user interface (100), wherein the tracking system comprises a plurality of cameras (32a, 32b, 32c, 32d) and the electronic control unit (30) which is connected to the cameras (32a, 32b, 32c, 32d),
• calculating second spatial coordinates of the part of the object surface relative to the coordinate system (x, y, z) of the spatial user interface (100),
• detecting, by means of the tracking system, spatial coordinates of a hand held input device (28) manageable by the user relative to the coordinate system (x, y, z) of the spatial user interface (100), and
• detecting a selection of the part of the object surface by detecting that a spatial distance between the input device (28) and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device (28).
2. Method according to claim 1, characterized in that the method further comprises illuminating the object (24, 26) by means of at least one illumination device (34) based on the result of the step of detecting a selection.
3. Method according to claim 1 or 2, characterized in that the illumination device (34) illuminates the selected part of the object surface such as to be visually distinct from other parts of the object surface.
4. Method according to claim 3, characterized in that the illumination device (34) changes the illumination of the selected part of the object surface upon detecting its selection.
5. Method according to any of the preceding claims, characterized in that the method further comprises the steps of:
• selecting an attribute to be associated with a selected part of the object surface, and
• associating the attribute to a selected part of the object surface when detecting the selection of the part of the object surface by the input device (28).
6. Method according to any of the preceding claims, characterized in that the method further comprises processing a data set containing a plurality of parts of the object surface and a plurality of attributes respectively associated to each of the parts of the object surface.
7. Device (10) for detecting an interaction between a user and a three-dimensional physical object (24, 26) with at least two surfaces arranged at an angle to each other in a three-dimensional spatial user interface (100), the device (10) comprising:
• an object carrier (12), adapted to carry the physical object (24, 26), • a tracking system comprising a plurality of cameras (32a, 32b, 32c, 32d) and an electronic control unit (30) which is connected to the cameras (32a, 32b, 32c, 32d), and
• a hand held input device (28) manageable by the user,
wherein the electronic control unit (30) stores model object data at least comprising first spatial coordinates of a model of at least a part of an object surface relative to a coordinate system (x1, y1, z1; x2, y2, z2) of the object (24, 26),
wherein the tracking system is adapted to detect the spatial position and orientation of the physical object (24, 26) relative to a coordinate system (x, y, z) of the spatial user interface (100),
wherein the electronic control unit (30) is adapted to calculate second spatial coordinates of the part of the object surface relative to the coordinate system (x, y, z) of the spatial user interface (100), and wherein the tracking system is further adapted to detect spatial coordinates of the hand held input device (28) manageable by the user relative to the coordinate system (x, y, z) of the spatial user interface (100), and to detect a selection of the part of the object surface by detecting that a spatial distance between the input device (28) and the part of the object surface is smaller than a predetermined value based on the second spatial coordinates of the part of the object surface and the spatial coordinates of the input device (28).
8. Device (10) according to claim 7, further comprising an illumination device (34) adapted to illuminate the object (24, 26) based on the result of the detection of a selection.
9. Device (10) according to claim 7 or 8, characterized in that the illumination device (34) is adapted to illuminate the selected part of the object surface such as to be visually distinct from other parts of the object surface.
10. Device (10) according to one of claims 7 to 9, characterized in that the illumination device (34) is adapted to change the illumination of the selected part of the object surface upon detecting its selection.
11. Device (10) according to one of claims 7 to 10, characterized in that the hand held input device (28) and/or the at least one three-dimensional object (24, 26) are provided with infrared markers and wherein the plurality of cameras (32a, 32b, 32c, 32d) are infrared cameras (32a, 32b, 32c, 32d), wherein preferably the device (10) further comprises an infrared illumination device.
12. Device (10) according to one of claims 7 to 11, further comprising at least one board (40) as an additional display (40) of an interaction menu.
13. Spatial augmented reality system comprising:
• the device (10) according to one of claims 7 to 12, and
• a physical object (24, 26) with at least two surfaces arranged at an angle to each other.
14. System according to claim 13, characterized in that the physical object (24, 26) is a 3D-printed model (24, 26).
15. System according to claim 13 or 14, characterized in that the system operates the method according to one of claims 1 to 6.
EP16794972.6A 2016-11-04 2016-11-04 Spatial augmented reality method, device and system Withdrawn EP3535643A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/076714 WO2018082782A1 (en) 2016-11-04 2016-11-04 Spatial augmented reality method, device and system

Publications (1)

Publication Number Publication Date
EP3535643A1 true EP3535643A1 (en) 2019-09-11

Family

ID=57288379

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16794972.6A Withdrawn EP3535643A1 (en) 2016-11-04 2016-11-04 Spatial augmented reality method, device and system

Country Status (2)

Country Link
EP (1) EP3535643A1 (en)
WO (1) WO2018082782A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11253181B2 (en) * 2018-08-03 2022-02-22 From Zero, LLC Method for objectively tracking and analyzing the social and emotional activity of a patient

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10306193B2 (en) * 2015-04-27 2019-05-28 Microsoft Technology Licensing, Llc Trigger zones for objects in projected surface model

Also Published As

Publication number Publication date
WO2018082782A1 (en) 2018-05-11


Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190408

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210219

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210702