EP3785098A1 - Système d'ajustement de localisation en environnement virtuel immersif (Location adjustment system in an immersive virtual environment) - Google Patents
- Publication number
- EP3785098A1 EP3785098A1 EP19716920.4A EP19716920A EP3785098A1 EP 3785098 A1 EP3785098 A1 EP 3785098A1 EP 19716920 A EP19716920 A EP 19716920A EP 3785098 A1 EP3785098 A1 EP 3785098A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- selection
- interface
- user
- last
- pointing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates to a system for locating a control element in an immersive virtual environment.
- the present invention also relates to a display system in an immersive virtual environment.
- the present invention more particularly relates to an auto-adaptive system for locating a control element in an immersive virtual environment.
- the current technological advancement of virtual reality makes it possible to project a user within a virtual universe in which various contents are displayed, for example a 3D object to be studied. It is not only possible to view different 3D content but also to interact with it in real time, which opens the way to many applications.
- the immersive digital project review, for example, uses virtual reality as a verification tool and as a support for discussion between the various interlocutors of a project.
- digital models of vehicles can then be used by different interlocutors, including designers, engineers, ergonomists and others, who can interact during immersive meetings centered on the digital model, which then serves as a communication medium and a means of exchanging points of view between the different trades.
- Said digital model also makes it possible to verify that the vehicle model meets architectural, ergonomic, perceived quality or livability criteria, among others.
- a user is able to interact with the virtual environment.
- the CAVE, acronym for Cave Automatic Virtual Environment, is a substantially cube-shaped 3D visualization room. It consists of an immersive room with three joined vertical walls arranged between a ceiling and a floor, equipped with cameras and sensors at the corners of the ceiling and/or floor of the room and with overhead projectors that broadcast images of the environment stored in a memory of a control unit. Said sensors and projectors are connected to said control unit. The user is able to move in said environment and to interact with said virtual environment.
- said user is tracked both at his point of view and at a control element that belongs to the user.
- the control element is generally a hand of said user.
- the location of the user's point of view is facilitated by a head-tracking device carried by the user, for example glasses worn by said user.
- Said glasses are generally provided with a marker element, formed for example by a ball with a metal surface disposed between the two spectacle lenses.
- Oblique vision pyramids are then calculated in real time from the theoretical position of each eye. However, this position is not always exact or stable: it is often interpolated. For example, variations in morphology between people, such as the inter-pupillary distance, the depth of the eyes or the height of the nose, are all parameters that give rise to errors in the perception of the scale, the depth, or even the lateral position of the image.
- the position of the tracking device is rarely very robust and may vary from one session to another, or even during a single session.
- Display devices can also be a source of error.
- the position of said devices can be approximate, depending on the geometry of the room (walls) and the layout of the projectors, and can vary greatly with the quality of the CAVE.
- the content of virtual scenes is rarely as rich in spatial perception cues as that of a real scene. Many perception conflicts are present (accommodation/vergence, pixel size, latency) and induce a weakened perception by the brain. Spatial information is full of uncertainties, and users are not very confident about distances, scales and visual depths. In particular, the mixture between physical objects, here the hand of the user, and virtual objects is sometimes difficult to apprehend.
- the design of virtual cell phone keypads is an example.
- said suggested key can be operated with a contact in a larger space.
- The size of the collision box representing each key thus increases or decreases according to the previously touched keys. More specifically, for example, after pressing a "q", the letter "u", which is visually of the same dimensions as the letter "q", has a larger collision box, so as to be more easily touched.
- the collision box of an object substantially represents the space allocated for the contact of said object.
- GB201009714-A describes this improvement.
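The adaptive-keypad idea above can be sketched as follows. This is a minimal illustration, not the implementation of GB201009714-A: the bigram likelihoods, constants and function names are assumptions.

```python
# Sketch of an adaptive keypad: after each key press, keys likely to follow
# get a larger collision box while their displayed size stays unchanged.

# Hypothetical next-key likelihoods: after "q", "u" is the most likely key.
NEXT_KEY_LIKELIHOOD = {
    "q": {"u": 0.9, "a": 0.05},
}

BASE_BOX = 1.0      # nominal collision-box width (equal to the visual key width)
MAX_SCALE = 1.5     # cap so an enlarged box does not swallow its neighbours

def collision_box_width(key, previous_key):
    """Return the collision-box width for `key` given the previously touched key."""
    if previous_key is None:
        return BASE_BOX
    likelihood = NEXT_KEY_LIKELIHOOD.get(previous_key, {}).get(key, 0.0)
    # Grow the box in proportion to how likely the key is to be pressed next.
    return BASE_BOX * (1.0 + (MAX_SCALE - 1.0) * likelihood)

# After pressing "q", the "u" box grows although "u" is drawn at the same size.
assert collision_box_width("u", "q") > collision_box_width("x", "q")
```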
- Publication FR3028968-B1 proposes a tactile interface comprising a screen capable of detecting the approach and position of a user's finger.
- the interface is configured to display on the screen graphical elements superimposed on selection zones or collision boxes, distinct from each other; said interface makes it possible to center the touch selection zone under the finger of the user when the user tries to touch an associated graphical element.
- Both publications offer interfaces that require the control element, here the user's finger, to be relatively close to the collision box in order to determine the point of impact of the user's finger on the collision box.
- a disadvantage of these publications is that they do not allow contact between the user's finger and the collision box of each object from a sufficiently large distance, as required in an immersive virtual environment.
- the object of the invention is to remedy these problems.
- the present invention more particularly relates to a selection system in an immersive virtual environment, comprising:
- a 3D display system capable of displaying a selection graphic element in a region of a virtual control surface
- Said selection system comprising an interface capable of detecting the approach and the position of the pointing means with respect to said virtual control surface to reach a selection zone associated with the selection graphic element,
- the interface is adapted to modify the selection zone of the graphic element, while keeping the display of said element frozen, when the impact position of the pointing means with the selection element on the control surface differs from an effective position of the selection zone by a deviation below a threshold.
- the selection system according to the invention is effective in a virtual immersive environment which may be a CAVE comprising an immersive room with sensors and projectors capable of diffusing images on the walls of said room.
- Said selection system comprises a 3D display system, formed by the immersive room projectors, which is able to display a graphic selection element that a user will be able to point at and touch, in a spatial region of a virtual control surface.
- the images projected on the walls of the room can be interpreted to display the selection graphical element in a spatial region of a control surface.
- the contact with said control surface makes it possible to validate a command, in particular in contact with a selection area associated with the graphic element, to validate a command linked to said graphic element.
- Said control surface is virtual and is represented, for example, by its spatial coordinates.
- the selection system includes a positioning and head movement detection system for defining the user's point of view and for displaying the selection graphical element at the control surface, and a user-held pointing means for selecting said graphic element.
- Said pointing means may be a finger of the user whose end is surrounded by a tracking cap for the detection and tracking of said finger.
- the selection system comprises an interface adapted to detect the approach and the position of the pointing means with respect to said virtual control surface to reach the selection zone associated with the selection graphic element.
- Said interface is capable of verifying the impact or not of the pointing means on the selection zone associated with the graphic element, on the one hand in a plane parallel to the control surface, and on the other hand in depth, that is to say along an axis orthogonal to the control surface.
- the display of the graphic element can be interpreted differently by each of the user's eyes, the point of view of each eye being substantially different from the central point of view between the two eyes that is taken into account for the displays in the immersive room, which induces relief errors for the user. These relief errors can also evolve throughout the test session in the immersive room.
- the interface of the selection system is able to correct these relief errors by modifying the selection zone that the user wants to reach with his pointing means, while keeping the display of said element frozen, when the impact position of the pointing means with the selection element on the control surface differs from a prior effective position of the selection zone by a deviation below a threshold.
- Said previous effective position of the selection area is defined by the previous selections.
- the threshold being defined beforehand for example to avoid interference between two neighboring graphical elements.
- the interface moves the effective selection position of the selection element.
- the interface moves the effective position of the selection area of the selection element and does not disturb the user's vision.
- the interface is able to modify the effective position of the selection zone of the graphic element according to the positions of the last N pointings on the selection element.
- the interface of the selection system is able to retain N selection manipulations of said graphic element, and in particular several deviations between the impact positions of the pointing means on the control surface and the prior effective position of the selection zone of the associated graphic element, in order to avoid changes that are too fast and may disturb the user.
- the interface evaluates the consistency between the average of the N-1 penultimate pointing positions with the last pointing position.
- the interface evaluates the last pointing position on the selection zone of the selection element against the average of the N-1 penultimate pointing positions in order to check the perception drift of the user. Indeed, if the last pointing is substantially in the same space as the N-1 penultimate pointings, then the interface consolidates its strategy and moves the effective position of the selection zone of the selection element. On the other hand, if the last pointing is not in the same space, then the interface does not give the last pointing position the same weight as the average of the N-1 penultimate pointings, and waits for the next pointing to check whether this last pointing is due to an error or to a correction on the part of the user.
- the selection interface defines the new effective position of the selection zone of the selection element according to the average of the positions of the N-J penultimate pointings and the J last pointings.
- the interface defines a new effective selection position for the selection zone of the selection element when the last J pointings differ from the average of the N-J penultimate pointings, increasing the weight of the last J pointings relative to the average of the N-J penultimate pointings.
- the interface will rapidly move the selection area of said selection element closer to the perception of the user.
- the minimum number of consecutive selections of the same selection element is greater than or equal to 5.
- the number N of consecutive selections of the same selection element taken into account in the strategy of the interface is 5, which gives on the one hand sufficient inertia so as not to continuously modify the effective position of the selection zone of said element, and on the other hand sufficient reactivity for the correction of said effective position.
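The adjustment principle described above, a frozen display whose selection zone is recentred on the average of the last N = 5 pointings, can be sketched as follows; the class name, the threshold value and the 2D coordinates are illustrative assumptions, not values taken from the patent.

```python
from collections import deque

N = 5  # number of consecutive pointings retained, as in the description

class SelectionZone:
    def __init__(self, display_pos):
        self.display_pos = display_pos      # frozen: what the user sees
        self.effective_pos = display_pos    # where the contact is actually tested
        self.pointings = deque(maxlen=N)    # last N impact positions

    def register_pointing(self, impact_pos, threshold=0.5):
        """Record an impact and recenter the zone on the mean of the last N
        pointings, provided the deviation stays below the threshold."""
        self.pointings.append(impact_pos)
        mean = tuple(sum(c) / len(self.pointings) for c in zip(*self.pointings))
        deviation = sum((m - d) ** 2
                        for m, d in zip(mean, self.display_pos)) ** 0.5
        if deviation < threshold:           # avoid reaching a neighbour's zone
            self.effective_pos = mean

zone = SelectionZone((0.0, 0.0))
for p in [(0.1, 0.0), (0.12, 0.02), (0.11, -0.01), (0.1, 0.01), (0.09, 0.0)]:
    zone.register_pointing(p)
# The display never moves, but the collision zone drifts toward the pointings.
assert zone.display_pos == (0.0, 0.0)
assert zone.effective_pos != (0.0, 0.0)
```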
- the prior effective position of the selection zone is the center of the projection of the selection zone on the control surface.
- the interface evaluates the differences between the pointing positions and the center of the projection of the selection zone on the control surface, which makes the calculations easier and improves the reactivity of the interface.
- the pointing means is a tracking element held at the end of a finger of the user.
- the pointing means is a tracking element recognized by the location and tracking system of the immersive room; it can simply be formed by a tip surrounding a finger, generally the index finger of a hand of the user.
- FIG 1 is a schematic view of an immersive virtual environment.
- FIG. 2 is a schematic view of an interface of the selection system according to the invention.
- FIG. 3 is a schematic view of modification of a selection zone of a graphic element.
- a user 10 is immersed in an immersive virtual reality environment 11, here a "CAVE" (Cave Automatic Virtual Environment).
- Said virtual environment is composed of an immersive room 12 formed with three vertical walls 14, a floor 13 and a ceiling 15, with a display system 17 comprising rear projection devices 16 disposed at the upper corners of the vertical walls 14 and able to diffuse images on the walls of the immersive room 12, in particular on the vertical walls 14 and the floor 13, and a location system 18 comprising sensors or cameras 19 adapted to identify and track the movements of one or more identified tracking elements.
- the user 10 is equipped with 3D vision glasses that allow him to perceive, from the images broadcast on the walls of the room by the rear-projection devices 16, three-dimensional virtual objects.
- a tracking means 21 representing the point of view of the user is arranged between the two lenses of the glasses.
- said virtual objects comprise selection elements 31.
- Said tracking means 21 is for example a silver ball identified and tracked by the sensors 19 of the location system 18 of the immersive room 12.
- the point of view defined by this tracking means may not be representative of the point of view of each eye of the user because, for example, the distance between the eyes varies for each user 10. Therefore, each of the user's eyes can see a different image, resulting in a perception error of the selection element, which may be a relief error.
- the perception error of the selection element 31 may not be systematic or directed in a given direction.
- the perception of the selection element can drift throughout the test and test session in the immersive room.
- the user can himself make corrections throughout the test and test session. Both causes may lead to errors in the perception of the selection element 31 in different or even opposite directions.
- the user 10 holds by hand a pointing means 22 which is also detected and tracked by the sensors of the immersive room.
- said pointing means may be a tip worn at the end of a finger, in particular the index finger of a hand of the user, as shown in Figures 1 and 2.
- the invention is described with reference to a gridded tablet or keypad 23 to facilitate the understanding of the invention, but the invention may relate to another object.
- the invention relates to a selection system 30 with buttons 31 forming the keypad 23, each of said buttons being a selection element 31.
- Said keypad 23 and the selection elements 31 which are the buttons are displayed on a virtual control surface 34 and said display of the keypad and selection elements remains fixed.
- the images diffused on the walls of the immersive room are perceived by the user 10 as a three-dimensional object at a fixed spatial position.
- the selection system 30 comprises an interface capable of detecting the position of the pointing means 22 formed by the tip of the user's finger and therefore the approach of said pointing means of the virtual control surface 34.
- the interface defines a selection area 33 for each of the selection items 31 or buttons of the keypad.
- Each of the selection zones 33 is substantially a virtual element without visual appearance, here a cube identified by its spatial coordinates, associated with each displayed selection element 31 and initially coincident with the selection element 31 on the control surface 34.
- the selection zones 33 do not appear to the user and are represented in dotted lines.
- the selection zone, also called collision box, is a logical element representing a virtually solid volume, here around the selection element, which is able to collide with other collision boxes. Most of the time, the collision box follows the shape of the selection element.
- the "virtual" contact of the pointing means 22 held by the user 10 with the selection zone 33 or collision box validates the selection of the selection element 31. The virtual contact is verified by the presence of spatial coordinates of the pointing means 22 in common with those of the selection zone 33.
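The virtual-contact test described above, validation when the tracked tip shares spatial coordinates with the collision box, can be sketched as a point-in-cube check; the cubic box shape and its size are assumptions made for illustration.

```python
# Sketch of the "virtual contact" check: the selection is validated when the
# tracked tip of the pointing means lies inside the collision box, modelled
# here as an axis-aligned cube of side `size` centred on the zone's
# effective position.

def virtual_contact(tip, zone_center, size=0.05):
    """True when the pointing tip lies inside the cubic collision box."""
    half = size / 2.0
    return all(abs(t - c) <= half for t, c in zip(tip, zone_center))

# Tip inside the box -> selection validated; tip outside -> no selection.
assert virtual_contact((0.01, 0.0, 0.0), (0.0, 0.0, 0.0))
assert not virtual_contact((0.1, 0.0, 0.0), (0.0, 0.0, 0.0))
```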
- the interface is able to follow the pointing means in the immersive virtual environment and to detect the approach and the position of the pointing means 22 with respect to said virtual control surface 34 to reach a selection zone 33 associated with the graphic selection element 31, thanks to the sensors 19 of the immersive room.
- the interface is able to move the selection zone 33 or collision box of the selection element 31 when the pointing means 22 misses the virtual target displayed by the display system, said virtual target represented by the selection element 31 remains frozen at the same location.
- the display system 17 is independent of the selection system 30, in particular of said interface of the selection system. However, the deviation between the impact position of the pointing means 22 on the control surface 34 and the effective position of the selection zone must not exceed a distance threshold, in order to avoid interference between the different selection zones 33 or adjacent collision boxes.
- the interface defines an effective position of the selection zone 33 of the selection element. This effective position initially, that is to say at the start of the test session, coincides with the display position of the selection element 31.
- the effective position is at least the last position of the selection zone used to reach the selection element. Said effective position may vary according to the invention in order to take account of repeated pointing errors, for example drifts in the perception of the selection element in the immersive virtual environment. According to the invention, said selection zone 33 has its center disposed at the effective position 36.
- the interface of the selection system is able to modify the selection area of the selection element to take into account the errors of perception of the selection element and in particular when the position of impact of the pointing means 22 held by the user on the control surface 34 differs from the effective position of the selection zone associated with said selection element.
- the center of the selection zone 33 can be displaced, for example by a translation parallel to the control surface 34, from the point A to the point B in view of the positions of the impact points 35 which are offset from the center of the selection zone 33.
- the points A and B are effective positions 36 of the selection zone 33.
- the deviation taken into account depends on, on the one hand, the spatial coordinates of the impact point of the pointing means on the control surface 34 and, on the other hand, the effective position 36 of the selection zone 33 on the control surface 34, shown in Figure 3 by the center A.
- the interface retains at least a number N of the last selection iterations on the same selection element 31.
- the number N of iterations is set at 5, which has the advantage of not overloading the recording memory of the central unit that runs the program of said interface.
- the number N set at 5 makes it possible to have sufficient inertia to allow adjustment of the effective position optimally.
- the interface applies a displacement strategy comprising several phases, in which the interface takes into account the last N pointings on the same selection element 31 with variable influence weights; said phases are:
- a first phase of evaluation of the impact position of the pointing means on the control surface for the last pointing with respect to the preceding pointings on the same selection element, which evaluates the consistency between the average of the N-1 penultimate pointing positions and the last pointing position, that is, the impact position of the pointing means on the control surface.
- the interface checks whether the position of the point of impact on the control surface of the last pointing is substantially in the same region as the selection zone of the selection element. More simply, the interface checks whether the spatial coordinates of the point of impact on the control surface 34 of the last pointing are substantially close to the coordinates of the average position obtained with the last N-1 pointings.
- the interface assigns the different pointings the same influence weight and then validates the drift of the effective position of the selection zone associated with the selection element; as shown in FIG. 3, the interface validates the displacement and is able to move the center of the selection zone to the average position of the last N pointings.
- the coordinates of the point of impact of the pointing means on the control surface for the last pointing may be substantially different from the average of the coordinates of the impact points of the N-1 penultimate pointings on the same selection element.
- the interface does not validate the displacement of the effective position of the selection zone associated with said selection element.
- the interface assigns to the spatial position of the last pointing a low influence weight with respect to the influence weights of the N-1 previous pointings and validates the average position of the last N pointings.
- the last pointing is considered as an error, and the correction associated with this last pointing does not take place.
- the interface is able to move the center of the selection zone to said average position of the last N pointings.
- the interface takes into account this next pointing, which becomes the last of the N pointings.
- if this last pointing is substantially close to the spatial position of the preceding average, then the interface assigns it the same influence weight as the average of the N-1 penultimate pointings on the same selection element.
- a correction confirmation phase: in this case, the previous pointing then becomes the penultimate one, with another next pointing on the same selection element.
- the spatial coordinates of the last and penultimate pointings on the same selection element are substantially similar.
- the interface then validates a correction and assigns the last pointing a significant influence weight, to take into account the correction made by the last and penultimate pointings.
- the interface then calculates the average position with the last two pointings weighted accordingly and is able to move the effective position of the selection zone of the selection element.
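The evaluation and confirmation phases above can be sketched as a weighted average in which a lone outlier pointing receives a low weight and a confirmed correction receives a high one. All constants, names and weight values here are hypothetical; the patent does not give numerical weights.

```python
# Sketch of the weighting strategy: a pointing consistent with the mean of the
# N-1 previous ones keeps full weight; a lone outlier gets a low weight until
# a second, similar pointing confirms a correction by the user.

CONSISTENT = 0.15   # hypothetical distance under which pointings "agree"
LOW_WEIGHT = 0.2    # weight of a lone outlier pointing (probable error)
HIGH_WEIGHT = 2.0   # weight once a correction is confirmed (react faster)

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def effective_position(pointings):
    """Weighted mean of the last N pointings on the same selection element."""
    *previous, last = pointings
    mean_prev = tuple(sum(c) / len(previous) for c in zip(*previous))
    if dist(last, mean_prev) <= CONSISTENT:
        w_last = 1.0                 # consistent: same weight as the others
    elif len(previous) >= 2 and dist(last, previous[-1]) <= CONSISTENT:
        w_last = HIGH_WEIGHT         # last two pointings agree: confirmed correction
    else:
        w_last = LOW_WEIGHT          # lone outlier: wait for the next pointing
    weights = [1.0] * len(previous) + [w_last]
    total = sum(weights)
    return tuple(sum(w * p[i] for w, p in zip(weights, pointings)) / total
                 for i in range(len(last)))

# A lone outlier barely moves the zone; a repeated one moves it markedly.
steady = [(0.1, 0.0)] * 4
assert dist(effective_position(steady + [(0.5, 0.0)]), (0.1, 0.0)) < 0.05
```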
- the interface of the selection system includes a limiting function so that the effective position does not stray too far from the visual position.
- Said limiting function can be discrete, that is to say it stops the drift when a divergence threshold is reached, or degressive, that is to say it applies a gradually decreasing coefficient beyond that threshold.
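The two limiting behaviours can be sketched as below. The function name, the threshold value and the coefficient `k` are illustrative assumptions; the patent text does not specify the exact form of the degressive coefficient.

```python
import math

def limit_drift(offset, threshold=1.0, mode="discrete", k=0.5):
    """Keep the drift of the selection-zone center near the visual position:
    'discrete' stops the drift at the divergence threshold, 'degressive'
    scales down the excess beyond it by a coefficient k."""
    d = math.hypot(offset[0], offset[1])
    if d <= threshold:
        return offset
    if mode == "discrete":
        scale = threshold / d                          # hard stop at threshold
    else:
        scale = (threshold + k * (d - threshold)) / d  # shrink the excess by k
    return (offset[0] * scale, offset[1] * scale)
```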
- the interface runs a "fuzzy" logic algorithm.
- the algorithm is run on a computer for each pointing.
- the algorithm described below is an exemplary embodiment of the invention.
- the algorithm is able to keep in memory the user's successes and errors and to process them, without over-reacting to an occasional erroneous pointing. Finally, the interface is able to catch the error faster if it is repeated.
- the algorithm uses linguistic variables: (A, [-100,100], {Near, Middle, Far}) to take into account the proximity of the last pointing to the center, and (B, [-100,100], {Near, Middle, Far}) for the last moving average, each associated with a relative error in the interval [-100,100] with respect to the center and with the fuzzy subsets {Near, Middle, Far}.
- the membership functions of variables A and B are three Gaussians, the sum of which is equal to 1 everywhere in the range of values [-100,100].
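The normalized Gaussian memberships described above can be sketched as follows. The subset centers, the width `sigma` and the use of the error magnitude are assumptions; the normalization step is what enforces a sum of exactly 1 everywhere in the range, as the text requires.

```python
import math

def memberships(error, centers=(0.0, 50.0, 100.0), sigma=25.0):
    """Degrees of membership in {Near, Middle, Far} for a relative
    error in [-100, 100]; one Gaussian per fuzzy subset, applied to
    the magnitude of the error, then normalized to sum to 1."""
    x = abs(error)
    raw = [math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2)) for c in centers]
    total = sum(raw)
    return [r / total for r in raw]
```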
- the membership functions of the variables X and Y are defined by a decision matrix.
- the algorithm defines a correction function for the displacement as the sum of six so-called partial functions.
- Each partial function is a response to the more or less strong membership in a fuzzy subset;
- Each partial function has a variable gain, which is here the degree of membership in the associated fuzzy subset, and a fixed gain represented by a coefficient Ki optimized by calculation.
- pointing refers to the spatial position of the pointing on the control surface.
- fuzzy operators, which are Zadeh's operators.
- the decision matrix is described below.
- the function fz governs the membership of the pointing made by the user in the fuzzy subsets X0, X1, X2, Y0, Y1 and Y2 as a function of the inputs A0, A1, A2, B0, B1 and B2.
- the six partial functions each apply a linear correction with gain Ki to, respectively, the location of the last pointing and that of the last moving average.
- the Ki gains are the parameters to be optimized, in addition to the widths of the fuzzy subsets.
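The construction of the correction function from six partial functions can be sketched as below. The membership helper, its subset placement and the sample Ki values are assumptions; only the structure follows the text: each partial function multiplies a variable gain (the degree of membership) by a fixed gain Ki, with three terms responding to the last pointing and three to the last moving average.

```python
import math

def degrees(err, centers=(0.0, 50.0, 100.0), sigma=25.0):
    # illustrative normalized Gaussian memberships in {Near, Middle, Far}
    raw = [math.exp(-((abs(err) - c) ** 2) / (2.0 * sigma ** 2)) for c in centers]
    s = sum(raw)
    return [r / s for r in raw]

def correction(last_err, avg_err, K):
    """Sum of six partial functions; K is the 6-tuple of fixed gains Ki
    to be optimized, the degrees of membership are the variable gains."""
    a = degrees(last_err)   # inputs A0, A1, A2 (last pointing)
    b = degrees(avg_err)    # inputs B0, B1, B2 (last moving average)
    f = sum(a[i] * K[i] * last_err for i in range(3))
    f += sum(b[i] * K[3 + i] * avg_err for i in range(3))
    return f
```

With all Ki equal the function reduces to a plain proportional correction; differentiated Ki per subset is what lets the optimization react differently to small and large errors.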
- the interface of the selection system allows a correction of the selection of a selection element in an immersive virtual environment that can take into account the differences in perception of said element at the eye level of each user, and modulate the correction according to the error, or the type of error, made during a pointing.
- Each pointing is validated by its spatial position in the immersive room.
- the algorithm may be different and based on continuous changes of the average position.
- the algorithm can take into account the last J pointings among the N stored pointings, with J less than half of N, to modify the inertia before moving.
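The variant based on the last J of N pointings can be sketched as follows (the function name and values are illustrative assumptions):

```python
def target_from_last_j(pointings, j):
    """Displacement target computed from only the last J stored pointings;
    requiring J < N/2 keeps the window small, which lowers the inertia
    before the selection zone moves."""
    n = len(pointings)
    assert 0 < j < n / 2, "J must be less than half of N"
    last = pointings[-j:]
    return (sum(p[0] for p in last) / j, sum(p[1] for p in last) / j)
```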
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1853599A FR3080696B1 (fr) | 2018-04-25 | 2018-04-25 | Systeme d'ajustement de localisation en environnement virtuel immersif |
PCT/EP2019/059673 WO2019206719A1 (fr) | 2018-04-25 | 2019-04-15 | Systeme d'ajustement de localisation en environnement virtuel immersif |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3785098A1 true EP3785098A1 (fr) | 2021-03-03 |
Family
ID=63294340
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19716920.4A Withdrawn EP3785098A1 (fr) | 2018-04-25 | 2019-04-15 | Systeme d'ajustement de localisation en environnement virtuel immersif |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3785098A1 (fr) |
KR (1) | KR20210002623A (fr) |
FR (1) | FR3080696B1 (fr) |
WO (1) | WO2019206719A1 (fr) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9164623B2 (en) * | 2012-10-05 | 2015-10-20 | Htc Corporation | Portable device and key hit area adjustment method thereof |
FR3023626A3 (fr) * | 2014-07-11 | 2016-01-15 | Renault Sas | Dispositif de navigation en realite virtuelle comportant une interface haptique |
FR3028968B1 (fr) | 2014-11-21 | 2016-11-25 | Renault Sa | Interface graphique et procede de gestion de l'interface graphique lors de la selection tactile d'un element affiche |
US10042438B2 (en) * | 2015-06-30 | 2018-08-07 | Sharp Laboratories Of America, Inc. | Systems and methods for text entry |
JP6483556B2 (ja) * | 2015-07-15 | 2019-03-13 | 株式会社東芝 | 操作認識装置、操作認識方法及びプログラム |
-
2018
- 2018-04-25 FR FR1853599A patent/FR3080696B1/fr active Active
-
2019
- 2019-04-15 EP EP19716920.4A patent/EP3785098A1/fr not_active Withdrawn
- 2019-04-15 KR KR1020207033714A patent/KR20210002623A/ko active Search and Examination
- 2019-04-15 WO PCT/EP2019/059673 patent/WO2019206719A1/fr unknown
Also Published As
Publication number | Publication date |
---|---|
FR3080696B1 (fr) | 2021-02-26 |
FR3080696A1 (fr) | 2019-11-01 |
WO2019206719A1 (fr) | 2019-10-31 |
KR20210002623A (ko) | 2021-01-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10789760B2 (en) | Focus guidance within a three-dimensional interface | |
US10416769B2 (en) | Physical haptic feedback system with spatial warping | |
US10394314B2 (en) | Dynamic adjustment of user interface | |
CN101689244B (zh) | 用于紧凑设备的基于相机的用户输入 | |
US9846486B2 (en) | Systems and methods of direct pointing detection for interaction with a digital device | |
US11683470B2 (en) | Determining inter-pupillary distance | |
US20110273369A1 (en) | Adjustment of imaging property in view-dependent rendering | |
US20110273466A1 (en) | View-dependent rendering system with intuitive mixed reality | |
CN109246463B (zh) | 用于显示弹幕的方法和装置 | |
Koulieris et al. | Gaze prediction using machine learning for dynamic stereo manipulation in games | |
Creem-Regehr et al. | Perceiving distance in virtual reality: theoretical insights from contemporary technologies | |
CN101180891A (zh) | 立体图像显示装置、立体图像显示方法及计算机程序 | |
Raudies et al. | Modeling boundary vector cell firing given optic flow as a cue | |
US20200233487A1 (en) | Method of controlling device and electronic device | |
US20190243527A1 (en) | Display device, program, display method, and control device | |
US20130314406A1 (en) | Method for creating a naked-eye 3d effect | |
Nightingale et al. | Can people detect errors in shadows and reflections? | |
EP3785098A1 (fr) | Systeme d'ajustement de localisation en environnement virtuel immersif | |
Hillaire et al. | Using a visual attention model to improve gaze tracking systems in interactive 3d applications | |
Rosa et al. | Selection techniques for dense and occluded virtual 3d environments, supported by depth feedback: Double, bound and depth bubble cursors | |
US11224801B2 (en) | Enhanced split-screen display via augmented reality | |
US11559744B2 (en) | Targeting of an individual object among a plurality of objects in a multi-player online video game | |
Smith | Unmediated Interaction: Communicating with Computers and Embedded Devices as If They Are Not There | |
WO2021191516A1 (fr) | Procédé et dispositif de contrôle d'accès. | |
Grossman | Interaction with volumetric displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20201021 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
RAP3 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: RENAULT S.A.S |
|
RAP3 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: RENAULT S.A.S |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20221201 |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230608 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20230412 |