EP3195593A1 - Device and method for orchestrating display surfaces, projection devices and 2D and 3D spatialized interaction devices for the creation of interactive environments - Google Patents
Device and method for orchestrating display surfaces, projection devices and 2D and 3D spatialized interaction devices for the creation of interactive environments
- Publication number
- EP3195593A1 (application EP15775768.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- display
- spatialized
- interaction
- user
- devices
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1423—Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/08—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
- G09B9/30—Simulation of view from aircraft
- G09B9/301—Simulation of view from aircraft by computer-processed or -generated image
- G09B9/302—Simulation of view from aircraft by computer-processed or -generated image the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/08—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
- G09B9/30—Simulation of view from aircraft
- G09B9/32—Simulation of view from aircraft by projected image
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
Definitions
- The present invention relates to the field of information presentation devices and interaction devices. It relates more particularly to digital image projection and display devices handling multiple images, taking into account the interactions of one or more users with this mainly visual environment, which can however be extended to the sound domain and to any spatialized information device.
- The hardware needed to create a truly immersive virtual reality experience, i.e. one in which the user is not a simple viewer but can interact with the virtual environment as he would with the real environment, is quite prohibitive, which restricts its use to research and to some business areas where security constraints outweigh budget constraints.
- the results obtained will not necessarily be representative of real-world usage.
- The virtual world does not necessarily reproduce the real environment in all its details (sound and light conditions, vibrations, etc.), and it is poorly suited to collaboration because virtual avatars do not accurately transcribe the relative positions of users or their non-verbal communication (gestures, attitudes, facial expressions, etc.).
- Another state-of-the-art solution is based on augmented reality, the principle of which is to mix the real world and the virtual world.
- The user perceives the real world through a pair of semi-transparent glasses that superimpose, in real time, a 3D (or 2D) virtual model onto his perception of the real world.
- Augmented reality has an advantage in terms of algorithms and rendering fidelity: using the real world puts the user in a familiar environment and reduces modeling effort by reusing existing elements of the user's environment, which makes it possible to reduce the complexity of the graphical scene manipulated, in terms of the number of polygons.
- A major disadvantage of this approach is that each user must wear a pair of augmented reality glasses, which can be tiring for long evaluations and requires equipment adapted to the eyesight (corrective glasses, contact lenses, etc.) of all the users participating in the evaluation, which can be quite expensive.
- Another major drawback is that each user has his or her own subjective view of the augmented real world, which does not facilitate the creation of a shared context in co-localized collaboration situations: even if the virtual world superimposed on reality is shared, different users do not see exactly the same thing; in particular, the virtual world presented to one user can hide the hands of a second user, preventing the first user from becoming fully aware of the actions of his collaborator.
- A disadvantage of this approach is that computer vision is very sensitive to occlusion: a user can hide the actions of another user by placing his arm or hand between them and the camera, making those actions invisible to the camera.
- Computer vision is also sensitive to ambient light, which constrains the conditions of use of the environment thus achieved; in particular, the maximum brightness tolerated by computer vision is generally lower than the illumination of the users' workplace.
- The use of display devices, such as screens, in the field of view of the camera can disrupt computer vision: the light and heat emitted by these display devices can be perceived by the camera and lead to false positives.
- Computer vision hardly handles dynamic changes in the environment, because this technique relies on comparing the current image to a reference state.
- The present invention aims to remedy these drawbacks by providing a method and a device for creating, rapidly and inexpensively, a real interactive environment that can dynamically adapt to the devices present. More particularly, the present invention is directed to a device for orchestrating display surfaces, projection devices and 2D and 3D spatialized interaction devices.
- The device according to the invention is particularly suitable for producing simulators, for example a cockpit simulator, but is not limited to this field of use.
Presentation of the invention
- The invention firstly aims at a device for managing display and interaction on a plurality of supports, comprising:
- the second aim of the invention is a display device comprising a device as described above and:
- at least one spatialized interaction device adapted to detect gestural instructions from a user.
- In another aspect, the invention aims at a method for managing display and interaction on a plurality of areas selected on display media, said display surfaces receiving images projected by at least one image projection system.
- the method comprises a step:
- the method comprises modeling each display surface, using at least one of the following substeps:
- Sub-step 100C includes automated calibration using a computer vision system coupled to a projection system displaying different sequences of visual patterns to detect and calibrate the different projection planes.
- Sub-step 100C comprises modeling a virtual projection plane according to the orientation of the image projection system and also to its focal length, this virtual projection plane being normal to the projection axis of the image projection system and located at a distance that depends on the focal length of the projector.
- The method comprises a step: 200 / integration of the 3D spatialized interaction devices into the global geometric environment model, through the determination of the geometric transformations necessary to interpret the information that these 3D spatialized interaction devices provide, in the same three-dimensional coordinate system as that used for the modeling of the display surfaces.
- step 200 comprises sub-steps:
- The method comprises a step: 300 / integration of at least one 2D spatialized interaction device into the global geometric environment model, by determining the transformation of the 2D coordinates supplied by the 2D spatialized interaction device into 3D coordinates that can be taken into account in the global geometric environment model.
- the method comprises a step:
- this step 400 having the following sub-steps:
- The invention thus constitutes a device and a method for orchestrating display surfaces, projection devices and 2D and 3D spatialized interaction devices for the creation of multimodal interactive environments.
- The invention aims at a device and a method for unifying, in the same three-dimensional frame, a plurality of display surfaces, video projection devices and input devices, including at least one 2D or 3D spatialized interaction device and/or a touch surface - all of which may be static or mobile - so as to match any point, line or shape of the physical space produced by an input device with one or more points and/or one or more lines and/or one or more shapes on the display surfaces and the video projection devices.
- The invention relates to a system (hardware and method) adapted to recreate a partially or completely simulated environment on a set of arbitrary surfaces surrounding the user, providing a display equivalent to the one the user would have in a real environment, with identical tactile and/or interaction functions.
- The software implementing the method of the invention comprises modules intended to:
- this screen directly displaying the data to be displayed
- FIG. 1: the different elements involved in an implementation of the invention;
- FIG. 2: a flowchart of the main steps of the method.
Detailed description of an embodiment of the invention
- a device according to the invention is used in the context of the generation of a cockpit simulator. It will be referred to hereafter as the simulation management device.
- The simulation management device uses for its implementation a plurality of display surfaces 10 that are not necessarily flat, parallel, connected or coplanar.
- The invention can naturally be implemented on a single surface, but is fully exploited only when generating images on several surfaces.
- The display surfaces 10 considered here are in particular of a passive type: typically cardboard surfaces, panels, etc.
- the display surfaces 10 consist of a set of cardboard boxes of various sizes, arranged substantially facing a user 15 of said simulation management device.
- The simulation management device comprises, first of all, at least one image projection system 11 for projecting onto these display surfaces 10, for example of the video projector type.
- image projection systems 11 may, more generally, consist of any device capable of generating dynamic visual information.
- The simulation management device comprises, secondly, a computer 12, for example of the microcomputer type, adapted to send data to be displayed and display commands to the image projection systems 11.
- This computer 12 is connected to a database 13.
- Some display surfaces 10' may consist of screens (LCD type or other).
- The computer 12 sends the images to be displayed directly to these screens 10'.
- the simulation management device comprises thirdly at least one spatialized interaction device 16 between the user 15 and the computer 12.
- A non-contact spatialized interaction device may, for example, be based on pattern and/or movement recognition (Leap Motion, Kinect - registered trademarks -, eye-tracking eye movement detection, etc.).
- Such systems are known to those skilled in the art, and the details of their constitution are beyond the scope of the present invention; they are therefore not detailed further here, nor are the computer 12 and the image projection systems 11. These systems make it possible to interpret visual or manual movements of the user as commands for modifying the display.
- The non-contact spatialized interaction devices 16 make it possible to detect movements of the hands of the user 15 towards certain areas of the display surfaces 10, for example representing images of aircraft system control panel areas.
- The simulation management device can then change the display according to the movements of the hand of the user 15, representing what would happen if the user acted on the actual control panel of the aircraft system.
- The simulation management device can determine, thanks to the detectors 16, the position or the attitude of the user 15 facing the display surfaces 10, and consequently adapt the display as a function of this attitude. This attitude may characterize either an area the user observes or a change of command for the vehicle being simulated.
- The simulation management device also comprises at least one contact interaction device such as a sensor, a button, a touch surface, etc.
- the simulation management device may also include interaction devices between the user 15 and the computer 12, such as voice recognition, presence detector or other active device involved in interactive environments and generating discrete or continuous events.
- The simulation management device also comprises a digital communication network 14 connecting the above elements, in particular the spatialized interaction devices 16, the image projection systems 11 and the computer 12.
- the choice of the network 14 is naturally adapted to the volume of digital data (for example images) to be transited on this network.
- The simulation management device finally comprises means for managing the image projection systems 11, implemented in the form of one or more software modules run by the computer 12.
- a geometric modeling of the user's visual environment is performed.
- the simulation management device comprises for this purpose in the first place a module for combining display surfaces 10 of heterogeneous natures within the same digital environment model.
- each display surface 10 is modeled in the same frame of this three-dimensional space (the digital environment model).
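As an aside, a minimal sketch (not taken from the patent; the class names and dimensions below are illustrative assumptions) of how such a global geometric environment model might be represented:

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class DisplaySurface:
    """A planar display surface expressed in the shared 3D frame."""
    name: str
    corners: np.ndarray  # (N, 3) ordered vertices in the global frame

    def normal(self) -> np.ndarray:
        """Unit normal of the (assumed planar) polygon."""
        v1 = self.corners[1] - self.corners[0]
        v2 = self.corners[2] - self.corners[0]
        n = np.cross(v1, v2)
        return n / np.linalg.norm(n)


@dataclass
class EnvironmentModel:
    """Global geometric environment model: every surface in one frame."""
    surfaces: list = field(default_factory=list)

    def add_surface(self, surface: DisplaySurface) -> None:
        self.surfaces.append(surface)


# Example: a 60 cm x 40 cm cardboard panel 1.5 m in front of the user.
model = EnvironmentModel()
model.add_surface(DisplaySurface(
    "panel_1",
    np.array([[-0.3, -0.2, 1.5], [0.3, -0.2, 1.5],
              [0.3, 0.2, 1.5], [-0.3, 0.2, 1.5]]),
))
```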
- the modeling of each display surface 10 can be carried out using three techniques that can be combined with each other:
- 100A The first modeling technique uses direct geometric measurement in space using tools such as measuring tapes, graduated rulers, and so on.
- 100B The second modeling technique uses a geometric measurement using three-dimensional modeling systems, for example based on accelerometer, optical or laser processing techniques, etc. Such techniques are known to those skilled in the art.
- 100C The third modeling technique uses visual calibration.
- For one skilled in the art, this calibration can be automated using a computer vision system coupled to a projection system displaying different sequences of visual patterns (checkerboard, parallel bands, etc.) to detect and calibrate the different projection planes.
- This visual registration can also be manual, performed using graphical tools to move virtual landmarks across the display surfaces 10 to match physical landmarks.
- This visual registration task may for example require the user to visually align video-projected landmarks with the corners of a display surface 10 forming a physical polygon, regardless of the position of the projection system 11, provided that it illuminates the display surface 10 considered.
- The visual calibration modeling technique is combined with the first or second modeling technique, by direct geometric measurement. Indeed, the visual calibration modeling technique is based only on geometric projection and does not preserve distances. As such, it is simply a convenient way to reposition a projector 11 in the environment, provided that the display surfaces 10 on which it projects have been modeled with one of the first two techniques 100A, 100B defined above.
- The projector 11 can thus be placed approximately, and the visual registration technique makes it possible to "realign" it with the polygons corresponding to the display surfaces 10.
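As an illustration of this realignment, the sketch below estimates the planar homography between the corners of a content image and the projector pixels of four hand-aligned landmarks, then pre-warps the content so that it lands on the physical surface once projected. It uses OpenCV; all coordinate values and resolutions are assumptions, not data from the patent.

```python
import cv2
import numpy as np

# Projector pixel positions of four projected landmarks, as dragged by the
# operator onto the corners of the physical surface (or detected from a
# projected checkerboard / parallel-band pattern).
projector_px = np.array([[112, 86], [830, 102], [815, 610], [98, 590]],
                        dtype=np.float32)

# Corners of the 600x400 px content image that should fill the surface.
content_px = np.array([[0, 0], [600, 0], [600, 400], [0, 400]],
                      dtype=np.float32)

# Homography mapping content pixels to projector pixels.
H, _ = cv2.findHomography(content_px, projector_px)

# Pre-warp the content into the projector's 1024x768 frame buffer so that,
# once projected, it matches the display surface exactly.
content = np.zeros((400, 600, 3), dtype=np.uint8)
warped = cv2.warpPerspective(content, H, (1024, 768))
```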
- At this point the environment is completely modeled, in the form of a global geometric environment model; that is, data are available characterizing the position and dimensions of each display surface 10 in the user's visual environment vis-a-vis the image projection systems 11.
- When this environment uses an image projection system 11 to feed one or more display surfaces 10, it is necessary to create a correspondence between these display surfaces 10 and the image projection system 11 as a display source, so that each display surface 10 is individually addressable by the image projection system 11.
- The image projection system 11 can thus display any composite image comprising a set of images projected onto various display surfaces 10, warping its projection so that each desired image matches the corresponding display surface 10, whose edges or characteristic points have been identified.
- the visual calibration environment modeling technique described above is not the only way to geometrically model the visual environment. This modeling can also be ensured in the following way:
- A virtual projection plane is modeled according to the orientation of the image projection system 11 and also to its focal length. This is the plane that would be matched with a projection wall in a typical use, for example in a meeting room. This plane is normal to the projection axis of the image projection system 11 and located at a distance that depends on the focal length of the projector, corresponding to the sharpness distance of the image.
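A minimal sketch of this construction follows, under illustrative assumptions for the projector pose and sharpness distance:

```python
import numpy as np


def virtual_projection_plane(position, projection_axis, focus_distance):
    """Return (point, normal) of the virtual projection plane: the plane
    normal to the projector's axis, at the distance where the image is
    sharp. Inputs are illustrative, not values from the patent."""
    axis = np.asarray(projection_axis, dtype=float)
    axis /= np.linalg.norm(axis)
    point = np.asarray(position, dtype=float) + focus_distance * axis
    return point, axis  # the plane's normal is the projection axis itself


# Projector at the origin, aimed along +Z, sharp at 2.5 m.
point, normal = virtual_projection_plane([0, 0, 0], [0, 0, 1], 2.5)
```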
- The device under consideration is compatible with moving display surfaces 10, image projection systems 11 and interaction devices, provided that it can dynamically update the global geometric environment model with the aid of at least one of the three environment modeling techniques defined above.
- The 3D spatialized interaction devices 16 are integrated into the global geometric environment model by determining the geometric transformations necessary to interpret the information they provide, in the same three-dimensional reference frame as the one used for modeling the visual environment.
- The simulation management device comprises, secondly, a module for managing the interactions provided by the spatialized interaction devices 16.
- The calculation of the transformation function between the intrinsic coordinate system of the spatialized interaction device and the reference frame of the invention is made by knowing the position of at least two points in these two frames, or of a point and a vector.
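With three or more non-collinear point correspondences, such a transformation can be recovered with the classical Kabsch algorithm; the sketch below shows this standard technique, which is one possible way to do the computation and not necessarily the one used in the patent.

```python
import numpy as np


def rigid_transform(device_pts, global_pts):
    """Kabsch algorithm: least-squares rotation R and translation t such
    that R @ p_device + t ~= p_global, from >= 3 non-collinear pairs."""
    P = np.asarray(device_pts, dtype=float)
    Q = np.asarray(global_pts, dtype=float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)               # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t


# Three points measured both in the tracker's own frame and in the
# global frame of the environment model.
R, t = rigid_transform([[0, 0, 0], [1, 0, 0], [0, 1, 0]],
                       [[2, 1, 0], [2, 1, 1], [2, 2, 0]])
```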
- The modeling can be completed by a visual calibration of these spatialized interaction devices 16.
- The modeling of the interaction device can be obtained through the modeling of this display surface 10 in the global geometric environment model, to allow the computation of the geometric transformations enabling a bijective relation with the information generated by this device.
- the simulation management device performs the necessary calculations to match the information of the spatialized interaction devices 16 with the display surfaces 10 and produces visual effects on them accordingly.
- The correspondence includes, non-exhaustively:
- Spatialized sound sources, for example using Dolby 5.1 (registered trademark) or binaural listening solutions, can also be integrated into the global environment frame, just like the visual devices. This only requires knowing the position of the user's head, which can be obtained by various computer vision systems known to those skilled in the art.
300 / Management of spatialized interactions in two dimensions
- In a step 300, the method implemented in the invention, described here in a non-limiting example, interprets the interactions captured by the two-dimensional spatialized interaction devices (for example an eye-tracking sensor, or tactile devices covering a precise rectangular area (touch or multitouch frame) or an entire plane (radarTouch - registered trademark -, light beam, laser or infrared plane)), which can be used together with a display surface (the two then constituting a touch or multitouch display surface) or without one (in which case they are "in-air gesture" interaction devices), and retranscribes them as modifications of the display on the display surfaces 10.
- Two-dimensional spatialized interaction devices require, compared with the three-dimensional spatialized interaction devices, complementary operations for transforming the 2D coordinates into 3D coordinates that can be taken into account in the global geometric environment model of the invention.
- This integration into the global geometric model involves, firstly, the attachment of each two-dimensional spatialized interaction device to a reference plane surface (virtual or not) modeled in the global environment and, secondly, the use of ray-casting techniques to extend the capabilities of the 2D device to other display surfaces 10 of the environment.
- Calibration of a 2D spatialized interaction device is done using a visual calibration grid with a number of reference points, usually five or nine, even though three non-aligned points are sufficient for one skilled in the art. These reference points can be projected onto the reference surface of the 2D spatialized interaction device in various ways, whether or not using the display capabilities of the invention. In all cases, the calibration of a 2D spatialized interaction device proceeds reference point by reference point, and makes it possible to create a correspondence between the data of the 2D spatialized interaction device and the visual calibration grid.
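As an illustration, this point-by-point correspondence can be summarized as a best-fit affine map from raw device readings to grid coordinates; the least-squares sketch below uses assumed coordinate values and is one possible realization among others.

```python
import numpy as np


def fit_affine_2d(device_pts, grid_pts):
    """Least-squares 2D affine map A (2x3) with grid ~= A @ [x, y, 1].
    Three non-aligned reference points determine it exactly; five or
    nine points give an over-determined, more robust fit."""
    D = np.asarray(device_pts, dtype=float)
    G = np.asarray(grid_pts, dtype=float)
    X = np.hstack([D, np.ones((len(D), 1))])   # (N, 3)
    A, *_ = np.linalg.lstsq(X, G, rcond=None)  # solves X @ A = G
    return A.T                                 # (2, 3)


def apply_affine(A, pt):
    return A @ np.array([pt[0], pt[1], 1.0])


# Raw touch-frame readings vs. their known grid positions (in metres).
A = fit_affine_2d([[102, 95], [880, 101], [110, 600]],
                  [[0.0, 0.0], [0.6, 0.0], [0.0, 0.4]])
print(apply_affine(A, [500, 350]))  # a raw reading mapped to the grid
```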
- The method then uses known ray-casting techniques to detect intersections with points of other display surfaces 10 or other devices, which reduces to the case of a 3D spatialized interaction device.
- The reference surface then no longer needs to be visible: it is only necessary to obtain, from a mathematical point of view, the resulting point(s) on this reference surface in order to represent them in the frame of the global environment.
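A minimal sketch of the underlying ray-plane intersection, with assumed geometry (the reference plane at z = 0 and a target display surface at z = 1.5 m):

```python
import numpy as np


def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a ray with a plane; return the 3D hit point or None."""
    d = np.dot(plane_normal, direction)
    if abs(d) < 1e-9:          # ray parallel to the plane: no hit
        return None
    t = np.dot(plane_normal, plane_point - origin) / d
    return origin + t * direction if t >= 0 else None


# A 2D point on the (possibly invisible) reference surface, lifted to 3D,
# is extended along the reference plane's normal towards another surface.
origin = np.array([0.2, 0.1, 0.0])      # point on the reference plane
direction = np.array([0.0, 0.0, 1.0])   # reference plane normal
hit = ray_plane_intersection(origin, direction,
                             plane_point=np.array([0.0, 0.0, 1.5]),
                             plane_normal=np.array([0.0, 0.0, -1.0]))
print(hit)  # -> [0.2 0.1 1.5]
```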
- In a step 400, the method implemented in the invention orchestrates the images projected on the display surfaces according to the information received from all the spatialized interaction devices.
- Orchestration designates the generation, in real time, of the images projected on the various display surfaces 10 according to the actions of the user 15 as detected by the spatialized interaction devices 16.
- This modification of the images is calculated by the computer 12 and transmitted to the display surfaces 10 by the image projection systems 11.
- In a first step, the information of the spatialized interaction devices 16 is mathematically projected onto the display surfaces 10.
- the projection obtained is used to perform various actions on the display surfaces 10 concerned:
- the fusion of the information of the different spatialized interaction devices 16 and the display surfaces 10 within the global geometric environment model makes it possible to operate a spatialized interaction device 16 on several display surfaces 10 at the same time.
- In a second step, the spatialized information is used to locate physical entities (objects or users; for example, a Leap Motion makes it possible to locate the hand of the user 15 in space) and to project information onto or around these entities.
- This projection is based both on the 3D positioning of the display surfaces 10 and on the virtual reference surfaces of the projectors 11.
- The virtual reference surface of a video projector 11 designates the "rectangular" area corresponding to the projection area of the projector at its "sharpness distance". This "rectangular" surface is normal to the projection axis of the video projector.
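As an illustration of this second step, the sketch below projects a tracked 3D point (for example a hand position) onto the virtual reference surface of a projector and converts it to pixel coordinates; the function name and every numeric parameter are assumptions.

```python
import numpy as np


def world_to_projector_px(point, proj_pos, proj_axis, up, focus_dist,
                          width_m, height_m, res_x, res_y):
    """Project a 3D world point onto the projector's virtual reference
    surface (the rectangle normal to its axis at the sharpness distance)
    and convert the result to pixel coordinates."""
    z = np.asarray(proj_axis, dtype=float)
    z /= np.linalg.norm(z)
    x = np.cross(up, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    v = np.asarray(point, dtype=float) - np.asarray(proj_pos, dtype=float)
    depth = np.dot(v, z)
    if depth <= 0:
        return None                     # behind the projector
    scale = focus_dist / depth          # project onto the reference plane
    u_m = np.dot(v, x) * scale          # metres across the virtual surface
    v_m = np.dot(v, y) * scale
    px = (u_m / width_m + 0.5) * res_x  # rectangle centre = image centre
    py = (0.5 - v_m / height_m) * res_y
    return px, py


# A tracked hand 2 m in front of a projector aimed along +Z, whose virtual
# reference surface is 1.6 m x 1.2 m at a 2.5 m sharpness distance.
print(world_to_projector_px([0.1, 0.05, 2.0], [0, 0, 0], [0, 0, 1],
                            [0, 1, 0], 2.5, 1.6, 1.2, 1024, 768))
```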
- In practice, each spatialized interaction device 16 communicates the actions it captures to the other elements of the simulation management device (spatialized interaction devices 16, display surfaces 10, image projection systems 11), and each display surface 10 detects whether it is affected by these actions and, where appropriate, responds by visually updating itself and communicating with the rest of the device.
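A minimal sketch of this broadcast-and-test loop; the class and method names are assumptions, since the patent does not specify an API.

```python
class Surface:
    """A display surface that decides for itself whether an action,
    expressed here in its own 2D coordinates, concerns it."""

    def __init__(self, name, bounds):
        self.name, self.bounds = name, bounds  # ((x0, y0), (x1, y1))

    def is_affected_by(self, action):
        (x0, y0), (x1, y1) = self.bounds
        x, y = action["position"]
        return x0 <= x <= x1 and y0 <= y <= y1

    def update_display(self, action):
        print(f"{self.name}: redraw at {action['position']}")


class Orchestrator:
    """Broadcasts every captured action; each surface tests and reacts."""

    def __init__(self, surfaces, devices):
        self.surfaces, self.devices = surfaces, devices

    def step(self):
        for poll in self.devices:          # each device is a poll function
            for action in poll():
                for surface in self.surfaces:
                    if surface.is_affected_by(action):
                        surface.update_display(action)


surfaces = [Surface("panel_1", ((0.0, 0.0), (0.6, 0.4)))]
devices = [lambda: [{"position": (0.25, 0.15)}]]  # one fake gesture event
Orchestrator(surfaces, devices).step()
```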
- The simulation management device comprises at least one third-party interaction device, of the voice command type, presence sensor type, etc.
- The simulation management device described here as a non-limiting example finds particular use in the prototyping of interactive environments (cockpits, supervision systems, etc.), by making it possible to recreate and extend all or part of a complex work environment using prototyping or low-cost devices, compared to the devices that will be selected once the environment is industrialized and put into operation.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Aviation & Aerospace Engineering (AREA)
- Human Computer Interaction (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1458702A FR3025917A1 (fr) | 2014-09-16 | 2014-09-16 | Dispositif et procede d'orchestration de surfaces d'affichage, de dispositifs de projection et de dispositifs d'interaction spatialises 2d et 3d pour la creation d'environnements interactifs |
PCT/FR2015/052469 WO2016042256A1 (fr) | 2014-09-16 | 2015-09-15 | Dispositif et procédé d'orchestration de surfaces d'affichage, de dispositifs de projection et de dispositifs d'intéraction spatialisés 2d et 3d pour la création d'environnements intéractifs |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3195593A1 true EP3195593A1 (fr) | 2017-07-26 |
Family
ID=52988107
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15775768.3A Withdrawn EP3195593A1 (fr) | 2014-09-16 | 2015-09-15 | Dispositif et procédé d'orchestration de surfaces d'affichage, de dispositifs de projection et de dispositifs d'intéraction spatialisés 2d et 3d pour la création d'environnements intéractifs |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170257610A1 (fr) |
EP (1) | EP3195593A1 (fr) |
FR (1) | FR3025917A1 (fr) |
WO (1) | WO2016042256A1 (fr) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6780315B2 (ja) * | 2016-06-22 | 2020-11-04 | カシオ計算機株式会社 | 投影装置、投影システム、投影方法及びプログラム |
US10499997B2 (en) | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
WO2019079790A1 (fr) * | 2017-10-21 | 2019-04-25 | Eyecam, Inc | Système d'interface utilisateur graphique adaptatif |
CN107908384A (zh) * | 2017-11-18 | 2018-04-13 | 深圳市星野信息技术有限公司 | 一种实时显示全息人像的方法、装置、系统及存储介质 |
US10607407B2 (en) * | 2018-03-30 | 2020-03-31 | Cae Inc. | Dynamically modifying visual rendering of a visual element comprising a visual contouring associated therewith |
CN113325659A (zh) * | 2021-05-31 | 2021-08-31 | 深圳市极鑫科技有限公司 | 一种基于投影显示的人机交互系统和方法 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7019748B2 (en) * | 2001-08-15 | 2006-03-28 | Mitsubishi Electric Research Laboratories, Inc. | Simulating motion of static objects in scenes |
US7134080B2 (en) * | 2002-08-23 | 2006-11-07 | International Business Machines Corporation | Method and system for a user-following interface |
FR2933218B1 (fr) * | 2008-06-30 | 2011-02-11 | Total Immersion | Procede et dispositif permettant de detecter en temps reel des interactions entre un utilisateur et une scene de realite augmentee |
TWI385842B (zh) * | 2009-07-21 | 2013-02-11 | Nat Univ Tsing Hua | 電池電極製作方法 |
US8730309B2 (en) * | 2010-02-23 | 2014-05-20 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction |
- 2014
  - 2014-09-16 FR FR1458702A patent/FR3025917A1/fr active Pending
- 2015
  - 2015-09-15 US US15/511,238 patent/US20170257610A1/en not_active Abandoned
  - 2015-09-15 EP EP15775768.3A patent/EP3195593A1/fr not_active Withdrawn
  - 2015-09-15 WO PCT/FR2015/052469 patent/WO2016042256A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2016042256A1 (fr) | 2016-03-24 |
US20170257610A1 (en) | 2017-09-07 |
FR3025917A1 (fr) | 2016-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10657716B2 (en) | Collaborative augmented reality system | |
US11043031B2 (en) | Content display property management | |
US10553031B2 (en) | Digital project file presentation | |
KR102222974B1 (ko) | 홀로그램 스냅 그리드 | |
US10055888B2 (en) | Producing and consuming metadata within multi-dimensional data | |
CN107209386B (zh) | 增强现实视野对象跟随器 | |
CN106662925B (zh) | 使用头戴式显示器设备的多用户注视投影 | |
KR102460047B1 (ko) | 유저 안경 특성을 결정하는 눈 추적용 디바이스를 갖는 헤드업 디스플레이 | |
EP3195593A1 (fr) | Dispositif et procédé d'orchestration de surfaces d'affichage, de dispositifs de projection et de dispositifs d'intéraction spatialisés 2d et 3d pour la création d'environnements intéractifs | |
US11340707B2 (en) | Hand gesture-based emojis | |
US11743064B2 (en) | Private collaboration spaces for computing systems | |
WO2016209605A1 (fr) | Ancrage situé à un endroit virtuel | |
US11797720B2 (en) | Tool bridge | |
US11727238B2 (en) | Augmented camera for improved spatial localization and spatial orientation determination | |
KR20230017849A (ko) | 증강 현실 안내 | |
US11449189B1 (en) | Virtual reality-based augmented reality development system | |
US11057612B1 (en) | Generating composite stereoscopic images usually visually-demarked regions of surfaces | |
US11961195B2 (en) | Method and device for sketch-based placement of virtual objects | |
Sánchez Salazar Chavarría et al. | Interactive 3D touch and gesture capable holographic light field display with automatic registration between user and content | |
Guizzi et al. | Augmented Reality and Virtual Reality: From the Industrial Field to Other Areas | |
US11676329B1 (en) | Mobile device holographic calling with front and back camera capture | |
US11763525B1 (en) | Blind object tracking using point clouds | |
US20240112413A1 (en) | Mapping a Real-World Room for A Shared Artificial Reality Environment | |
US20240112412A1 (en) | Mapping a Real-World Room for A Shared Artificial Reality Environment | |
US20240112414A1 (en) | Mapping a Real-World Room for A Shared Artificial Reality Environment |
Legal Events
Code | Title | Description |
---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed | Effective date: 20170418 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent | Extension state: BA ME |
RIN1 | Information on inventor provided before grant (corrected) | Inventor name: PEYRUQUEOU, VINCENT; Inventor name: VALES, STEPHANE; Inventor name: LEMORT, ALEXANDRE |
DAV | Request for validation of the european patent (deleted) | |
DAX | Request for extension of the european patent (deleted) | |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
17Q | First examination report despatched | Effective date: 20200511 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
R17C | First examination report despatched (corrected) | Effective date: 20200728 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn | Effective date: 20220224 |