WO2022049608A1 - Immersive virtual reality system - Google Patents

Immersive virtual reality system

Info

Publication number
WO2022049608A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
collision
time interval
control object
pointer
Prior art date
Application number
PCT/IT2020/000062
Other languages
French (fr)
Inventor
Sergio CARENA
Fabio FERRACANE
Original Assignee
Italdesign-Giugiaro S.P.A.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Italdesign-Giugiaro S.P.A. filed Critical Italdesign-Giugiaro S.P.A.
Priority to PCT/IT2020/000062 priority Critical patent/WO2022049608A1/en
Publication of WO2022049608A1 publication Critical patent/WO2022049608A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

An immersive virtual reality system and a method for enhancing a virtual reality environment in an immersive virtual reality system are described, wherein a processing system arranged to generate virtual reality scenes in a virtual reality environment is also arranged to (i) generate a configuration control object of the virtual reality environment, which may be represented in a virtual reality scene in a predetermined position, which comprises a plurality of selectively viewable hierarchical logic levels of commands, each represented by at least one object for interacting with the virtual reality environment; (ii) verify periodically the existence of a collision condition between a pointer object displayed in a virtual reality scene and associated with a pointer member moved by a user and the control object according to a predetermined collision verification time interval having a first time duration, wherein the collision condition between the pointer object and the control object comprises the overlapping of the position of the pointer object and the control object in the virtual reality scene; (iii) on occurrence of a collision condition, interrupt the elapsing of said collision verification time interval for a predetermined collision handling time interval having a second time duration; (iv) determine the display of a lower hierarchical logic level of commands or the execution of the selected command of the configuration control object; and (v) after said collision handling time interval has elapsed, reset the elapsing of said collision verification time interval.

Description

Immersive virtual reality system
This invention relates to virtual reality systems, and in particular to immersive virtual reality systems.
More specifically, this invention relates to an immersive virtual reality system according to the preamble of claim 1.
Virtual reality simulates actual reality, and immersive virtual reality is its most engaging development, as it allows users to interact with the simulation of real situations through specially developed interfaces, such as visors, headsets, gloves or other wearable garments, which make it possible for users to see, hear and touch an environment constructed around them, without being able to perceive the real environment where they are located.
Virtual reality is a powerful tool used for product rendering in the automotive sector. In this sector, three-dimensional visualization and virtual prototyping software for designers has recently been developed; among these, AUTODESK's 3D VRED™ visualization software allows designers and engineers to create product presentations, design revisions and virtual prototypes by interacting with HTML5 content in a VRED scene, including the immersive validation of human-machine interfaces.
During a product simulation activity, it is necessary to be able to interact with the objects represented, enabling and disabling actions on the objects, for example to move them in the virtual space, to measure attributes such as dimensions or reciprocal distances, or to associate the designer's notes with them.
The drawback of this product, and of similar products based on visualization without interfaces, is that the interactions with the virtual scene mentioned above require the immersive experience to be interrupted, for example to access a configuration menu accessible only on the desktop, or for the scene to be specially prepared in advance or supported by an additional external user, for example to access the configuration menu through a monitor next to the user of the immersive virtual reality interfaces.
The object of this invention is to provide an immersive virtual reality system that allows a user immersed in the virtual scene to interact directly therewith, through a virtual interface, without having to interrupt the immersive experience or resort to external support users, making the user no longer a passive observer, but rather an active operator of the scene. The object of the invention is therefore to provide the possibility of full interaction with the scenes of a virtual environment, freeing the interaction from the limits of configuration activities carried out prior to immersion in the virtual reality environment.
A further object is to provide a process for enhancing a virtual reality environment in an immersive virtual reality system through a configuration control object of said virtual reality environment that is flexible and usable with any means for interacting with virtual reality, whether it is wearable or otherwise operated by the user.
According to this invention, these objects are achieved by an immersive virtual reality system having the features referred to in claim 1 and a process for enhancing a virtual reality environment in an immersive virtual reality system, having the features referred to in claim 8.
Particular embodiments form the subject matter of the dependent claims, the content of which is to be understood as an integral part of this description.
In short, this invention is based on the principle of generating a configuration control object of a virtual reality environment that may be represented in a virtual reality scene in a predetermined position, which comprises a plurality of selectively viewable, hierarchical command logic levels, wherein the existence of a collision condition of said control object with a pointer object is verified cyclically to determine the display of a logic level of commands or the execution of a selected command. Specifically, when a collision condition is recognized, the cycle for verifying the existence of a collision condition is temporarily stopped for a suspension period adapted to allow the collision event to be handled by executing the related programmed action, i.e. the display of a logic level of commands or the execution of a selected command, in order to prevent the recognition of the collision condition from remaining active, since in the case of a prolonged collision condition this would cause the repeated execution of the same actions. The described procedure therefore confers a "one-touch" effect on the gesture performed by the operator, correctly executing the desired action only once, even if the pointer object collides with the control object for a prolonged time, up to a limit predetermined by the duration of said suspension period.
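By way of a hedged sketch, this cycle can be expressed in plain Python (not the VRED API; `overlaps` and `handle_collision` are illustrative names):

```python
import time

VERIFY_INTERVAL = 0.1    # first time duration: collision verification interval
HANDLING_INTERVAL = 1.0  # second time duration: suspension (collision handling) interval

def one_touch_loop(pointer, control_object):
    """Cyclically verify the collision condition; on a hit, suspend
    verification for the handling interval so that a prolonged collision
    executes the programmed action only once (the "one-touch" effect)."""
    while True:
        time.sleep(VERIFY_INTERVAL)            # let one verification interval elapse
        if pointer.overlaps(control_object):   # collision: positions overlap in the scene
            control_object.handle_collision()  # display a command level or run a command
            time.sleep(HANDLING_INTERVAL)      # suspension: ignore the ongoing collision
            # verification resumes here, i.e. the verification interval is reset
```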
This invention takes concrete form, within the AUTODESK VRED product, in a software script which, by exploiting the VRED API and the VRED and Python programming functions, adds to the original virtual reality environment a configuration control object of said virtual reality environment representable in a virtual reality scene in a predetermined position, which comprises a plurality of selectively viewable, hierarchical command logic levels, each represented by at least one object for interacting with the virtual reality environment.
Further features and advantages of the invention will be explained in greater detail in the following detailed description of an embodiment thereof, given by way of non-limiting example, with reference to the accompanying drawings, wherein:
Fig. 1 is a simplified block diagram of an immersive virtual reality system;
Fig. 2 is an exemplifying representation of a virtual reality scene enhanced with a configuration control object of a virtual reality environment, according to the invention;
Fig. 3 is an exemplifying representation of the control object according to the invention;
Fig. 4 is an exemplifying representation of the control object according to the invention together with a plurality of selectively viewable hierarchical command logic levels;
Fig. 5 is a flow diagram of the activation or deactivation of the control object according to the invention; and
Fig. 6 is a flow diagram of the activation or deactivation of the display of a lower hierarchical logic level of commands or of the execution of a selected command of said control object, according to the invention.
Fig. 1 is a simplified block diagram of an immersive virtual reality system 10. The system comprises processing means 12, such as a processing system, arranged to generate virtual reality scenes in a virtual reality environment, display means 14, such as for example a wearable visor, coupled to the processing means 12 and adapted to present said scenes of virtual reality to a user, and means 16 for interacting with said virtual reality scenes, such as for example a glove or a joystick, wearable or operable by said user and coupled to the processing means 12, which include sensor means 18 adapted to detect the relative spatial position in real space or in a real environment of a pointer member moved by said user. This pointer member, for example, may coincide with the portion of the glove that covers the tip of an index finger of the user.
The processing means 12 include timing means comprising first counters T_Lev_n of a predetermined collision verification time interval having a first time duration, for example 0.1 seconds, and a second counter T_coll of a predetermined collision handling time interval having a second time duration, for example 1 second.
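A hedged sketch of such timing means using only Python's standard library (`threading.Timer`); the `IntervalCounter` wrapper and the placeholder callbacks are illustrative assumptions, not the VRED API:

```python
import threading

class IntervalCounter:
    """Restartable one-shot timer standing in for a T_Lev_n or T_coll counter."""
    def __init__(self, duration, on_elapsed):
        self.duration = duration      # time interval in seconds
        self.on_elapsed = on_elapsed  # callback fired when the interval elapses
        self._timer = None

    def start(self):
        self.stop()  # resetting and restarting the counter
        self._timer = threading.Timer(self.duration, self.on_elapsed)
        self._timer.start()

    def stop(self):
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None

# One verification counter per hierarchical level of commands, plus T_coll.
t_lev = [IntervalCounter(0.1, lambda: None) for _ in range(3)]  # T_Lev_0, T_Lev_1, T_Lev_2
t_coll = IntervalCounter(1.0, lambda: None)                     # collision handling counter
```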
The processing means 12 are arranged to control the display of a pointer object P in a virtual reality scene S presented on the display means 14, on the basis of the relative spatial position of the pointer member in the real environment.
The processing means 12 are arranged to generate a configuration control object 20 of the virtual reality environment, which may be represented in a virtual reality scene in a predetermined position. Fig. 2 shows by way of example a virtual reality scene S enhanced with a control object 20 according to the invention. The virtual reality scene is for example an automotive design environment adapted to display a vehicle model 22, and within the environment the user may interact, for example to modify design parameters or to view parts of the vehicle being designed.
The pointer object P is displayed in the environment; in the current example, virtual representations of both hands and the user's forearms are displayed, and the pointer object P is associated with the tip of at least one index finger of the virtual representation of a user's hand. The control object 20 is displayed in association with an element of the virtual reality scene; in the example it is associated with the virtual representation of a user's hand, belonging to the arm opposite to that bearing the pointer object P.
The control object 20 is shown - in the currently preferred embodiment - as a wearable accessory in a predefined position of the virtual representation of the user's body, having the three-dimensional shape of a polyhedron with six triangular faces, formed by the coupling of two tetrahedra through a base thereof, the side faces of which are indicated with 20a, 20b, 20c in Fig. 3 and the vertices of which opposite the coupling bases are beveled in such a way as to present a flat control area 22, triangular in the described embodiment.
The control object 20 comprises a plurality of selectively viewable hierarchical command logic levels, each represented by at least one object for interacting with the virtual reality environment. Accordingly, the timing means comprise first counters T_Lev_n of a predetermined collision verification interval in a number equal to the number of hierarchical levels of commands, hereinafter indicated with T_Lev_0, T_Lev_1, etc.
The control object 20 is shown in Fig. 3 in an inactive condition (on the left) and in an active condition (on the right), respectively. Due to the smoothing of the vertices, the lateral faces 20a, 20b and 20c are represented in trapezoidal form. In the active condition at least one side face is duplicated symmetrically relative to its major base, and in the embodiment shown all three side faces 20a, 20b, 20c are duplicated symmetrically relative to their major bases, generating the faces 20a', 20b' and 20c', a condition that may be displayed graphically by simulating a “flower” opening of the polyhedron.
The faces 20a', 20b' and 20c' represent the highest hierarchical logic level of commands that is selectively viewable and each face 20a', 20b', 20c' creates an object for interacting with the virtual reality environment. Specifically, each face 20a', 20b' and 20c' has a pictogram indicative of the command that is executed by interacting with this face, respectively - in the example indicated - a “home” pictogram, a “checkmark” pictogram (for example indicative of a command to confirm an action performed) and an “X” pictogram (for example indicative of a command to negate an action performed). The interaction with the faces 20a', 20b' and 20c' will be described hereinafter.
Fig. 4 shows an exemplifying hierarchical structure of commands, comprising a first level of commands L1 including five objects for interacting with the virtual reality environment, each indicated with 24, a second level of commands L2, hierarchically lower than the command 24 selected in the first level, including four objects for interacting with the virtual reality environment, each indicated with 26, and a third level of commands L3, hierarchically lower than the selected command 26 of the second level, including two objects for interacting with the virtual reality environment, each indicated with 28. Commands comprise enabling or disabling actions in the virtual reality environment, or the application or disapplication of attributes to one or more objects in a virtual reality scene.
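By way of illustration, the hierarchy of Fig. 4 can be held in a simple tree; a minimal sketch in which the structure follows the description while the command labels are invented placeholders:

```python
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Command:
    label: str
    children: list = field(default_factory=list)  # hierarchically lower level, if any
    action: Optional[Callable] = None             # leaf commands execute an action

# Level L3: two interaction objects (28) under one command of level L2.
level_l3 = [Command("l3_cmd_a"), Command("l3_cmd_b")]
# Level L2: four interaction objects (26), the first opening level L3.
level_l2 = [Command("l2_cmd_1", children=level_l3),
            Command("l2_cmd_2"), Command("l2_cmd_3"), Command("l2_cmd_4")]
# Level L1: five interaction objects (24), the first opening level L2.
level_l1 = [Command("l1_cmd_1", children=level_l2),
            Command("l1_cmd_2"), Command("l1_cmd_3"),
            Command("l1_cmd_4"), Command("l1_cmd_5")]
```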
When a user interacts with the control object 20, the existence of a collision condition between the pointer object P (the tip of the virtual representation of the user's index finger) and the control object 20 is periodically verified according to a predetermined collision verification time interval having a first time duration, wherein the collision condition between the pointer object and the control object comprises the overlapping of the position of the pointer object and the control object in the virtual reality scene. From the moment when said collision condition is verified, for a predetermined collision handling time interval having a second time duration, the elapsing of the collision verification interval is interrupted, and the display of a lower hierarchical logic level of commands or the execution of the selected command of the control object is determined; the elapsing of the collision verification time interval is then reset at the end of the collision handling time interval.
Preferably, the duration of the collision verification time interval (e.g., 0.1 seconds) is less than the duration of the collision handling time interval (1 second or greater). This allows an almost continuous verification of the occurrence of a collision event and the univocal execution of the related programmed action.
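The collision condition itself reduces to a point-in-volume test. A minimal sketch, assuming the control object's extent is approximated by an axis-aligned bounding box (the actual scene geometry may differ):

```python
def collides(pointer_pos, box_min, box_max):
    """True when the pointer object's position overlaps the control object,
    approximated here by an axis-aligned bounding box in scene coordinates."""
    return all(lo <= p <= hi for p, lo, hi in zip(pointer_pos, box_min, box_max))

# Example: a pointer at the centre of a unit box around the origin collides.
assert collides((0.0, 0.0, 0.0), (-0.5, -0.5, -0.5), (0.5, 0.5, 0.5))
```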
The interaction of a user with the control object 20 for switching from the inactive condition to the active condition and vice versa is described with reference to the flow diagram in Fig. 5.
In step 100 a first counter T_Lev_0 of a predetermined collision verification interval having a first time duration is started. At the end of each collision verification interval, the processing means 12 verify the existence of a collision between the pointer object P and the control object 20, which occurs when the pointer object P overlaps the control area 22, and, in the affirmative, in step 110 the first counter T_Lev_0 is switched off and the second counter T_coll is started.
In step 120 the elapsing of the predetermined collision handling time interval is verified, and, in the affirmative, in step 130 the first counter T_Lev_0 is reset and restarted and at the same time the state of the control object 20 is switched from an inactive condition to an active condition or vice versa.
In step 140 it is verified whether the control object 20 is in an active condition (or state) following the switching in the previous step and, in the affirmative, in step 150 an animation is started to display the activation of the control object 20 (the “flower” opening) and to display the first hierarchical logic level of commands L1, i.e. the corresponding interaction objects 24, and for each of these, in step 160 a respective first common counter T_Lev_1 of a predetermined collision verification time interval having a respective first time duration is started and a respective second counter of a collision handling time interval, T_coll, is switched off. In the event that in step 140 it is verified that the control object is in an inactive condition (state) following the switching in the previous step, in step 170 an animation is started to display the deactivation of the control object 20 (the “flower” closure), and in step 180 the respective first counter T_Lev_1 of the commands of the first hierarchical logic level of commands L1 is deactivated.
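Steps 100 to 180 thus amount to a debounced toggle of the control object's state. A hedged sketch reusing the illustrative `IntervalCounter` instances from above; the `system` attributes and animation helpers are assumptions, not the VRED API:

```python
def on_t_lev_0_elapsed(system):
    # Step 100 has started T_Lev_0; at the end of each verification interval
    # the collision of the pointer object with the control area 22 is checked.
    if not system.pointer_overlaps_control_area():
        t_lev[0].start()                   # no collision: run another verification interval
        return
    t_lev[0].stop()                        # step 110: switch off T_Lev_0 ...
    t_coll.start()                         # ... and start the collision handling counter

def on_t_coll_elapsed(system):
    # Step 120: the collision handling interval has elapsed.
    t_lev[0].start()                       # step 130: reset and restart T_Lev_0 ...
    system.active = not system.active      # ... and switch the control object's state
    if system.active:                      # step 140: active after the switching?
        system.play_flower_opening()       # step 150: display activation and level L1
        t_lev[1].start()                   # step 160: start T_Lev_1 (T_coll is off)
    else:
        system.play_flower_closing()       # step 170: display deactivation
        t_lev[1].stop()                    # step 180: deactivate T_Lev_1
```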
The interaction of a user with the control object 20 for activating or deactivating the display of a lower hierarchical logic level of commands or for the execution of a selected command of the control object is described with reference to the flow chart of Fig. 6, referring by way of example to a command of the first level L1.
In step 200 a first counter T_Lev_1 of a predetermined collision verification time interval having a first time duration is started.
At the end of each collision verification interval, the processing means 12 verify the existence of a collision between the pointer object P and the interaction object 24 of the control object 20 corresponding to the command, which occurs when the pointer object P overlaps the respective interaction object 24, and, in the affirmative, in step 210 the first counter T_Lev_1 is switched off and the second counter of a collision handling time interval, T_coll, is started.
In step 220 the elapsing of the predetermined collision handling time interval is verified, and, in the affirmative, in step 230 the first counter T_Lev_1 is reset and restarted.
In step 240 it is verified whether the interaction object 24 of the control object 20 determines the activation or deactivation of the display of a lower hierarchical logic level of commands (level L2) or the execution of a selected command of the control object, i.e. the execution of an action.
In the first case, in step 250 the status of the attribute of at least one object of a virtual reality scene to which the command refers is switched from a condition of application to a condition of disapplication of the attribute, or vice versa. It is then verified in step 260 whether the attribute of the object has lower levels, and, in the affirmative, the status of the attribute is verified in step 270, whereby if there is a condition of application of the attribute following the switching in the previous step, in step 280 an animation is started for displaying the next hierarchical logic level of controls (level L2), i.e. the corresponding interaction objects, while if there is a condition of disapplication of the attribute following the switching in the previous step, in step 290 an animation is started for removing the display of the hierarchical logic level of controls L2, i.e. the corresponding interaction objects, underlying the attribute. If in step 260 it is verified that the attribute of the object does not have lower levels, in step 300 the condition of applying or disapplying the attribute is saved.
If in step 240 it is verified that the interaction object 24 of the control object 20 determines the execution of a command, i.e. the execution of an action, in step 310 the state of the action is switched from an enabled condition to a disabled condition of the action or vice versa.
In step 320 it is verified whether the action to which the interaction object 24 refers is enabled or not, following the switching in the previous step, and, in the affirmative, in step 330 the interactions between a user and a virtual scene dictated by the selected action are configured. If in step 320 it is verified that the action to which the interaction object 24 refers is disabled, in step 340 the interactions between a user and a virtual scene dictated by the selected action are stopped.
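Steps 240 to 340 reduce to a two-way dispatch. A hedged sketch reusing the illustrative `Command` tree from above; the `scene` helpers are assumptions, not the VRED API:

```python
def handle_selected_command(cmd, scene):
    """Step 240: does the interaction object toggle an attribute (and possibly
    the display of a lower command level) or does it execute an action?"""
    if cmd.action is None:
        # Attribute branch: step 250 toggles application/disapplication.
        cmd.applied = not getattr(cmd, "applied", False)
        if cmd.children:                    # step 260: lower levels exist
            if cmd.applied:                 # steps 270/280: display level L2
                scene.show_level(cmd.children)
            else:                           # step 290: remove the L2 display
                scene.hide_level(cmd.children)
        else:                               # step 300: save the attribute state
            scene.save_attribute_state(cmd)
    else:
        # Action branch: step 310 toggles the enabled state of the action.
        cmd.enabled = not getattr(cmd, "enabled", False)
        if cmd.enabled:                     # steps 320/330: configure interactions
            scene.configure_interactions(cmd)
        else:                               # step 340: stop the interactions
            scene.stop_interactions(cmd)
```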
The interaction of a user with the control object 20 to activate or deactivate the display of the lower hierarchical logic level of commands L3, or to execute a selected command of the control object at level L2, takes place in the same way as described with reference to the flow diagram in Fig. 6 for a command of the first level L1.
Naturally, without prejudice to the principle of the invention, the embodiments and the details of execution may be widely varied with respect to that which has been described and illustrated purely by way of non-limiting example, without thereby departing from the scope of protection of the invention defined by the appended claims.

Claims

1. Immersive virtual reality system, comprising: processing means arranged to generate virtual reality scenes in a virtual reality environment; display means, coupled to said processing means and adapted to present said virtual reality scenes to a user; and means for interacting with said virtual reality scenes, wearable or operable by said user and coupled to said processing means, which include sensor means adapted to detect the relative spatial position in a real environment of a pointer member moved by said user, the processing means being arranged to control the display of a pointer object in a virtual reality scene on the basis of the relative spatial position of said pointer member in the real environment, characterized in that the processing means are arranged to:
- generate a configuration control object of said virtual reality environment that may be represented in a virtual reality scene in a predetermined position, which comprises a plurality of selectively viewable hierarchical logic levels of commands, each represented by at least one object for interacting with the virtual reality environment;
- verify periodically the existence of a collision condition between said pointer object and said control object according to a predetermined collision verification time interval having a first time duration, a collision condition between a pointer object and a control object comprising the overlapping of the position of the pointer object and the control object in the virtual reality scene;
- on occurrence of a collision condition, interrupt the elapsing of said collision verification time interval for a predetermined collision handling time interval having a second time duration;
- determine the display of a lower hierarchical logic level of commands or execute the selected command of said configuration control object; and
- once said collision handling time interval has elapsed, reset the elapsing of said collision verification time interval.
2. System according to claim 1, wherein said control object is represented in association with an element of said virtual reality scene.
3. System according to claim 2, wherein said element of the virtual reality scene is a virtual representation of a user’s hand.
4. System according to claim 2, wherein said control object is a virtual representation of a wearable accessory in a predefined position of the virtual representation of the user’s body.
5. System according to any one of the preceding claims, wherein the first duration of said predetermined collision verification time interval is less than the second duration of said predetermined collision handling time interval.
6. System according to any one of the preceding claims, wherein said commands comprise the enabling or disabling of actions in the virtual reality environment.
7. System according to any one of the preceding claims, wherein said commands comprise the application or disapplication of attributes to at least one object of a virtual reality scene.
8. Method for enhancing a virtual reality environment in an immersive virtual reality system comprising: processing means arranged to generate virtual reality scenes in said virtual reality environment; display means, coupled to said processing means and adapted to present said virtual reality scenes to a user; and means for interacting with said virtual reality scenes, wearable or operable by said user and coupled to said processing means, which include sensor means adapted to detect the relative spatial position in a real environment of a pointer member moved by said user, the processing means being arranged to control the display of a pointer object in a virtual reality scene on the basis of the relative spatial position of said pointer member in the real environment, the method being characterized by the steps of:
- generating a configuration control object of said virtual reality environment that may be represented in a virtual reality scene in a predetermined position, which comprises a plurality of selectively viewable hierarchical logic levels of commands, each represented by at least one object for interacting with the virtual reality environment;
- verifying periodically the existence of a collision condition between said pointer object and said control object according to a predetermined collision verification time interval having a first time duration, a collision condition between a pointer object and a control object comprising the overlapping of the position of the pointer object and the control object in the virtual reality scene;
- on the occurrence of a collision condition, interrupting the elapsing of said collision verification time interval for a predetermined collision handling time interval having a second time duration;
- determining the display of a lower hierarchical logic level of commands or the execution of the selected command of said configuration control object;
- once said collision handling time interval has elapsed, resetting the elapsing of said collision verification time interval.
9. Method according to claim 8, wherein the first duration of said predetermined collision verification time interval is less than the second duration of said predetermined collision handling time interval.
10. Method according to claim 8 or 9, wherein said commands comprise enabling or disabling actions in the virtual reality environment.
11. Method according to any one of claims 8 to 10, wherein said commands comprise the application or disapplication of attributes to the object of a virtual reality scene.
12. Method according to any one of claims 8 to 11, comprising the representation of said control object in association with an element of said virtual reality scene, preferably a virtual representation of a user’s hand.
13. Method according to any one of claims 8 to 11, comprising the representation of said control object as a virtual representation of a wearable accessory in a predefined position of the virtual representation of the user’s body.
PCT/IT2020/000062 2020-09-01 2020-09-01 Immersive virtual reality system WO2022049608A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IT2020/000062 WO2022049608A1 (en) 2020-09-01 2020-09-01 Immersive virtual reality system

Publications (1)

Publication Number Publication Date
WO2022049608A1 true WO2022049608A1 (en) 2022-03-10

Family

ID=72964772

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IT2020/000062 WO2022049608A1 (en) 2020-09-01 2020-09-01 Immersive virtual reality system

Country Status (1)

Country Link
WO (1) WO2022049608A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190332182A1 (en) * 2017-04-25 2019-10-31 Tencent Technology (Shenzhen) Company Limited Gesture display method and apparatus for virtual reality scene
WO2018237172A1 (en) * 2017-06-21 2018-12-27 Quantum Interface, Llc Systems, apparatuses, interfaces, and methods for virtual control constructs, eye movement object controllers, and virtual training
US20190362562A1 (en) * 2018-05-25 2019-11-28 Leap Motion, Inc. Throwable Interface for Augmented Reality and Virtual Reality Environments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20793829; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20793829; Country of ref document: EP; Kind code of ref document: A1)