EP3227879A1 - Simulation system, corresponding devices, methods and programs - Google Patents

Simulation system, corresponding devices, methods and programs

Info

Publication number
EP3227879A1
Authority
EP
European Patent Office
Prior art keywords
virtual environment
description
environment
module
procedure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP15804471.9A
Other languages
German (de)
English (en)
French (fr)
Inventor
Pierre JANNIN
Bernard GIBAUD
Valérie GOURANTON
Benoît CAILLAUD
Bruno Arnaldi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institut National de la Sante et de la Recherche Medicale INSERM
Institut National de Recherche en Informatique et en Automatique INRIA
Institut National des Sciences Appliquees INSA
Original Assignee
Institut National de la Sante et de la Recherche Medicale INSERM
Institut National de Recherche en Informatique et en Automatique INRIA
Institut National des Sciences Appliquees INSA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institut National de la Sante et de la Recherche Medicale INSERM, Institut National de Recherche en Informatique et en Automatique INRIA, Institut National des Sciences Appliquees INSA filed Critical Institut National de la Sante et de la Recherche Medicale INSERM
Publication of EP3227879A1 (fr)
Legal status: Ceased

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/20 Design optimisation, verification or simulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B 23/28 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 Simulators for teaching or training purposes

Definitions

  • the invention relates to the field of simulation.
  • the invention relates more particularly to the field of simulation in virtual reality environments.
  • More particularly, the invention lies in the field of realistic simulation in virtual reality environments that are representative of real environments, in particular for professional training.
  • Virtual reality is defined as an interactive computer simulation of immersive visual and/or sound and/or haptic environments. The environments in question can be real or imaginary.
  • The simulation is called interactive because, unlike a mere spectator, the user interacts with the virtual environment.
  • The simulation is immersive because it plunges the user into an environment different from the real environment in which they are located.
  • The simulation is visual and/or sound and/or haptic because the user is generally provided with visual and sound reproduction equipment (such as a virtual reality headset reproducing sounds and images) and with equipment for reproducing physical sensations, especially touch, through gloves, suits or active systems suitable for this use.
  • The purpose of virtual reality is to allow a user sensorimotor and cognitive activity in an artificial, digitally created world, which can be a simulation of certain aspects of the real world.
  • The Oculus Rift™ virtual reality headset now allows the general public to experience virtual reality in a simple and inexpensive way.
  • This type of headset is currently used to develop consumer applications. These consumer applications are mainly games, including first-person games (known as FPS, for "First Person Shooter").
  • FPS first-person shooter
  • The advantage of these headsets is that they render images in three dimensions and thus allow the user to be immersed in an imaginary sound and visual environment. Using such a device for something other than this type of application is nevertheless possible.
  • The creation of a virtual environment representative of reality is a more complex task than the modeling of an "imaginary" virtual environment: in the case of a real environment, the user relies on precise sensory references.
  • The creation of a virtual environment representative of reality requires more elaborate modeling processes which take into account, on the one hand, the real physical objects (which must be modeled so that they conform to reality) and, on the other hand, the physical laws defining the interactions between these objects (so that the interaction of these objects in the virtual world is identical to their interaction in the real world).
  • the game engine manages the possible actions on the virtual environment.
  • Such an engine has a number of constraints and limitations, generally intended to keep the video game fast and not too complicated: the actions are generally limited to moving left, right, forward, backward, up or down, grabbing an object, shooting, running, jumping, etc. Interactions with people are generally few.
  • The described technique does not have at least some of these drawbacks of the prior art.
  • The described technique relates to a module for simulating an interventional procedure within a virtual environment, said virtual environment being implemented via a virtual environment data processing device.
  • the simulation module includes:
  • a module configured to obtain a data structure representative of a set of objects of said virtual environment, said data structure further comprising at least one interaction relation between at least two objects of said set of objects;
  • a module configured to receive at least one scenario representative of a procedure to be simulated, said scenario being defined from a descriptive ontology, said at least one scenario being obtained from at least one predetermined procedural model;
  • a module configured to render, within said virtual environment, at least a portion of said scenario as a function of said set of objects of said virtual environment, of said at least one procedural model and of at least one action performed by at least one real user of said virtual environment.
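  • Purely by way of illustration, the following Python sketch shows one possible way the three sub-modules listed above could fit together; all names (ObjectSet, obtain_object_structure, receive_scenario, render_step) are hypothetical and do not come from the described technique.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional, Tuple

# All names below are hypothetical; this only illustrates how the three
# sub-modules listed above could be composed.

@dataclass
class ObjectSet:
    objects: Dict[str, Any]                       # objects of the virtual environment
    interactions: List[Tuple[str, str, str]] = field(default_factory=list)  # (object_a, relation, object_b)

class SimulationModule:
    def __init__(self) -> None:
        self.object_set: Optional[ObjectSet] = None
        self.scenario: List[dict] = []

    def obtain_object_structure(self, object_set: ObjectSet) -> None:
        """Obtain the data structure of the objects and their interaction relations."""
        self.object_set = object_set

    def receive_scenario(self, scenario: List[dict]) -> None:
        """Receive a scenario (steps expressed with the descriptive ontology),
        obtained beforehand from a procedural model."""
        self.scenario = scenario

    def render_step(self, user_action: dict) -> dict:
        """Render the next portion of the scenario, as a function of the object set,
        of the procedural model and of the action performed by the real user."""
        expected = self.scenario[0] if self.scenario else None
        if expected is not None and user_action.get("action") == expected.get("action"):
            return self.scenario.pop(0)           # this step is played out in the virtual environment
        return {"action": "wait", "reason": "action outside the current scenario step"}
```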
  • The described technique relates to a system for simulating an interventional procedure within a virtual environment, said virtual environment being implemented via a virtual environment data processing device.
  • The system includes:
  • an environment description module configured to deliver at least one description of a real procedure in the real environment according to at least one domain ontology
  • a procedural model creation module configured to create, on the basis of at least one description of the real environment to be modeled, a procedural model materializing conditions and transitions between procedural steps.
  • the procedural environment description module comprises: access to a storage space comprising a physical description of an environment in which a procedure to be described takes place;
  • an input module comprising input means and a graphical interface comprising description display means, said input module being configured to implement said physical description of the environment and said domain ontology to produce at least one description of a real procedure;
  • the procedural model creation module comprises:
  • an analysis module configured to analyze, according to said domain ontology, each real procedure description of a plurality of real procedure descriptions, in order to create a procedural model materializing conditions and transitions between procedural steps.
  • the simulation system furthermore comprises:
  • a so-called virtual reality immersive device, including an immersive room and/or at least one virtual reality headset.
  • The described technique also relates to a method for simulating an interventional procedure within a virtual environment, said virtual environment being implemented via a virtual environment data processing device.
  • A simulation method comprises: a step of obtaining a data structure representative of a set of objects of said virtual environment, said set of objects being defined from a descriptive ontology or a formal language, said data structure further comprising at least one interaction relation between at least two objects of said set of objects;
  • The various steps of the methods according to the invention are implemented by one or more software programs or computer programs, comprising software instructions intended to be executed by a data processor of a relay module according to the invention and designed to control the execution of the various method steps.
  • The invention is also directed to a computer program, capable of being executed by a computer or a data processor, which program includes instructions for controlling the execution of the steps of a method as mentioned above.
  • This program can use any programming language, and be in the form of source code, object code, or intermediate code between source code and object code, such as in a partially compiled form, or in any other desirable form.
  • the invention also relates to a data carrier readable by a data processor, and including instructions of a program as mentioned above.
  • the information carrier may be any entity or device capable of storing the program.
  • The medium may comprise storage means, such as a ROM, for example a CD-ROM or a microelectronic circuit ROM, or a magnetic recording medium, for example a floppy disk, a hard disk, an SSD, etc.
  • the information medium may be a transmissible medium such as an electrical or optical signal, which may be conveyed via an electrical or optical cable, by radio or by other means.
  • the program according to the invention can be downloaded in particular on an Internet type network.
  • the information medium may be an integrated circuit (ASIC or FPGA type) in which the program is incorporated, the circuit being adapted to execute or to be used in the execution of the method in question.
  • ASIC integrated circuit
  • FPGA field-programmable gate array
  • the invention is implemented by means of software and / or hardware components.
  • In this document, a module may correspond to a software component as well as to a hardware component, or to a set of hardware and software components.
  • a software component corresponds to one or more computer programs, one or more subroutines of a program, or more generally to any element of a program or software capable of implementing a function or a set of functions, as described below for the module concerned.
  • Such a software component is executed by a data processor of a physical entity (terminal, server, gateway, router, etc.) and is capable of accessing the hardware resources of this physical entity (memories, recording media, communication buses, electronic input/output cards, user interfaces, etc.).
  • A hardware component corresponds to any element of a hardware set able to implement a function or a set of functions, as described below for the module concerned. It may be a programmable hardware component, or a component with an integrated processor for executing software, for example an integrated circuit, a smart card, a memory card, an electronic card for executing firmware, etc.
  • FIG. 2 schematically shows a simulation system
  • FIG. 3 schematically represents a device implemented to obtain a scenario simulation.
  • the technique aims to train personnel in a virtual environment reproducing real conditions of intervention.
  • The invention relates more specifically to a method of simulating an interventional procedure within a virtual environment, said virtual environment being implemented via a virtual environment data processing device (SYST).
  • SYST virtual environment data processing device
  • Such a method comprises:
  • a step of obtaining (10) a data structure (SDREO) representative of a set of objects (EO) of said virtual environment (EV), said data structure comprising at least one interaction relation between at least two objects of said set of objects;
  • Such a system comprises, in one embodiment:
  • DESCR description module
  • This module makes it possible to describe unit procedures which are actually implemented.
  • This module can also be called the procedural/event description module; it takes as input one or more descriptions of the real environment to be modeled. It is based on an initial physical description, including a number of descriptions of objects in the virtual environment, and on a grammar of procedural description, also called an ontology;
  • The description module also takes as input a domain ontology. It produces, as output, a description of one or more real or ideal procedures, called described procedures (PrDecs).
  • the description module comprises:
  • a storage space comprising a physical description of an environment in which a procedure to be described takes place; access to a storage space comprising a grammar of procedural description, called a domain ontology;
  • an input module comprising input means and a graphical interface comprising description display means, said input module being configured to implement said physical description of the environment and said domain ontology to produce at least one description of a real procedure;
  • Storage spaces may be shared or pooled with other modules.
  • a procedural model creation module (MODP). This module uses one or more actual or ideal procedure descriptions; it delivers a procedural model materializing conditions and transitions between procedural steps; this model depends on the ontology; the procedural model creation module thus comprises:
  • an analysis module configured to analyze, according to said domain ontology, each real procedure description of a plurality of real procedure descriptions, in order to create a procedural model (MP) materializing conditions and transitions between procedural steps.
  • MP procedural model
  • such a system furthermore comprises: a simulation module (SIMU) implementing the previously described method;
  • SIMU simulation module
  • a so-called virtual reality immersive system, comprising for example an immersive room and/or virtual reality headsets and/or other immersion devices; this immersive system (IMER) is driven by a virtual reality engine (ENG).
  • ENG virtual reality engine
  • Such an engine manages the immersive display (using the aforementioned devices) and the actions of the users; it takes as input the physical model of the virtual environment as well as the outputs produced by the simulation module.
  • An intervention (or training) scenario is typically based on a procedural model or a subset of the procedural model, and includes the execution of a series of steps that must be carried out by a user (be it real or virtual) as part of the simulation of the interventional procedure (work procedure, intervention procedure in a specific environment, surgical procedure, etc.).
  • the proposed technique also relates to various methods used in the aforementioned devices.
  • the first phase of the method is a descriptive phase in which one or more procedure descriptions are performed.
  • the second phase of the method is an analytical phase in which the descriptions are analyzed.
  • The technique consists in creating, and providing to the virtual reality engine, elementary tasks representative of coherent and realistic actions, brought together in the procedural model. These elementary tasks can then be used to create, in a virtual environment, a simulation of a real (or at least realistic) situation using a pre-established scenario, and to understand or follow the scenario followed by the real actor or actors.
  • the purpose of the system is to enable real users to learn, in a virtual environment, various situations that may occur within a given real environment.
  • The system of the present technique finds, for example, an application in the training of teams operating in confined and controlled environments, such as intervention teams in sensitive sites (nuclear power plant, chemical treatment plant). This system also finds application for surgical teams, so that they can train in intervention procedures on patients.
  • This system also finds an application in the training of security personnel and/or technical intervention or maintenance personnel (mechanics, electricians, etc.).
  • The basic tasks are provided alongside the physical models of the virtual environment. More specifically, the physical models of the virtual environment are notably managed by a physics engine whose objective is to represent the physical objects of the virtual environment and their interactions. From this point of view, the virtual reality engine and the physics engine differ little from the application engine and physics engine of a consumer application.
  • Elementary tasks represent interactions between individuals or between individuals and objects, whether real or virtual.
  • An elementary task can for example represent a type of action produced by a real user and the reaction of a virtual user.
  • An elementary task may represent a type of action produced by a virtual user and the (expected) response of a real user.
  • An elementary task can still represent a type of action produced by a virtual user and the reaction of a virtual user.
  • An elementary task can still represent a type of action produced by a real user and the reaction of one or more virtual objects.
  • Elementary tasks are organized into a coherent set of elementary tasks, representing a set of actions and possible interactions.
  • The objective is to have an overall representation of the different elementary tasks and their sequences, so that the virtual reality engine can, on the basis of an action (corresponding to a given elementary task), compute the corresponding reactions.
  • In some forms, the reactions are weighted according to subsequent or previous observations.
  • Probabilistic, statistical or pseudo-random selection methods can be implemented to manage the sequences of actions and reactions based on the elementary tasks and their sequences.
  • the coherent set of elementary tasks represents the field of possibilities.
  • the procedural model is organized within a data structure.
  • The data structure includes several types of elements, including "elementary task" element types.
  • Elementary task element types include at least the following data: data representative of the action, data representative of the object of the action, data representative of a reaction, and data representative of one or more subsequent elementary tasks, possibly weighted according to the probability of occurrence of these subsequent elementary tasks.
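  • As a minimal illustration of this data structure, the following Python sketch uses hypothetical field names for the elements listed above (action, object of the action, reaction, weighted subsequent tasks) and a simple probabilistic selection of the next task, in the spirit of the selection methods mentioned earlier.

```python
import random
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ElementaryTask:
    action: str                                  # data representative of the action (e.g. "remove")
    target: str                                  # data representative of the object of the action
    reaction: str                                # data representative of the reaction in the virtual world
    next_tasks: List[Tuple[str, float]] = field(default_factory=list)  # (name of a subsequent task, weight)

def pick_next_task(task: ElementaryTask, tasks: Dict[str, ElementaryTask]) -> ElementaryTask:
    """Select the following elementary task, weighted by its probability of occurrence."""
    names = [name for name, _ in task.next_tasks]
    weights = [weight for _, weight in task.next_tasks]
    return tasks[random.choices(names, weights=weights, k=1)[0]]
```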
  • this procedural model is more or less correlated with the virtual world itself, and therefore with all the virtual objects that make up this world.
  • the degree of correlation of the set of elementary tasks depends on the number of elementary tasks involving the implementation of an object of the virtual world, and therefore the description of the virtual world itself.
  • The procedural model is strongly correlated with the virtual world when the number of elementary tasks that influence one or more objects of the virtual world is large.
  • The procedural model is weakly correlated with the virtual world when the number of elementary tasks having an influence on one or more objects of the virtual world is small. As will be noted later, this distinction is relatively important in terms of defining the procedural model. Simulations involving few interactions with the environment are easier to implement than simulations involving many interactions with the environment.
  • The system of the invention has a module for creating sets of elementary tasks.
  • This module uses one or more descriptions of the real environment to be modeled. This module for creating all the elementary tasks is described in more detail below.
  • Descriptions of the environment are made through a suitable description module.
  • This description module comprises an input module comprising input means and a graphical interface comprising description display means.
  • Such a description module takes for example the form of a dedicated application, installed for example on a tablet or a laptop, to enter the description of the environment.
  • A description of the environment, in general, comprises two types of elements: a physical description, including for example a description of the elements making up the physical environment, and a procedural (or event) description, including the description of how users interact with the environment to perform one or more given tasks.
  • the procedural model creation module uses, as input, a procedural description of the environment. In the context of the invention, this procedural description is based on a physical description previously produced and integrated within the description module.
  • In addition to a physical description of the environment, the description module also takes as input a procedural description grammar.
  • This grammar of procedural description makes it possible to define a suitable description reference frame.
  • This grammar can also be called an ontology. It defines the basis of the description, namely a common description language that can be used to describe how the procedures are actually implemented or how the events are chained together. These are axioms that are considered true for describing a given event or procedure.
  • The ontology, of course, is adapted to the procedures or events that one wishes to describe. In the case of a description of a part replacement procedure on a vehicle, the ontology defines the language used to describe the replacement procedure and is based on a physical description of a set or a subset of parts of an engine.
  • The description module, through the input module, is used to describe the different steps required to replace the part using the previously defined ontology, for example: remove the water pump with the aid of a 12 mm wrench; rotate the flywheel by hand; lock the flywheel in a blocking position by means of a locking device; remove the timing belt by hand; install a new timing belt; mount a new water pump; mount the belt on the new water pump; unlock the flywheel.
  • The procedure for replacing a timing belt is thus described, in procedural terms, based on a physical description (the elements of the engine, the necessary tools) and on a grammar (remove, rotate, lock, mount, unlock, using, etc.).
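  • Purely as an illustration of such a description, the timing-belt example above could be captured as a sequence of records combining a verb of the grammar, an element of the physical description and an optional tool; the representation below is a hypothetical sketch, not the actual format used by the description module.

```python
# Each step = (verb from the grammar, object from the physical description, optional tool).
timing_belt_replacement = [
    ("remove",  "water pump",             "12 mm wrench"),
    ("rotate",  "flywheel",               "hand"),
    ("lock",    "flywheel",               "locking device"),
    ("remove",  "timing belt",            "hand"),
    ("install", "new timing belt",        None),
    ("mount",   "new water pump",         None),
    ("mount",   "belt on new water pump", None),
    ("unlock",  "flywheel",               None),
]

# The grammar plays the role of the description reference frame: every verb must belong to it.
GRAMMAR_VERBS = {"remove", "rotate", "lock", "install", "mount", "unlock"}
assert all(verb in GRAMMAR_VERBS for verb, _, _ in timing_belt_replacement)
```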
  • The module for creating the procedural model takes as input a set of descriptive records made with the recording module.
  • The procedural model creation module, based on the descriptive records, determines elementary tasks for implementing real or realistic procedures.
  • The module for creating the set of elementary tasks relies on the descriptions to identify the phases, steps and sub-steps necessary for the implementation of the procedures. It proceeds by iterative analysis and classification of the recordings. In a complementary manner, it determines, in the procedural model produced, sequences of possible steps and sub-steps corresponding to procedures that are not necessarily recorded but are nevertheless "realistic". This may correspond, for example, to a sequence of steps for performing a procedure that does not correspond to any sequence recorded in one of the descriptions. Thus, it is on the basis of this procedural model that intervention scenarios can be constructed. They correspond to the implementation of a subset of the procedural model to achieve a given result.
  • The replacement of the timing belt may not include the locking of the flywheel, or may include locking the flywheel at the beginning of the operation (before any disassembly). If several different records of the timing belt disassembly procedure have been made, then the module for creating the set of elementary tasks can, for a descriptive scenario "replacement of the timing belt", indicate that (a) the step of locking the flywheel is optional; and (b) it can take place after the disassembly of the water pump.
  • The system of the invention is implemented to perform simulations of surgical procedures. More particularly, in this embodiment, the aim is to simulate the interaction between different members of a surgical intervention team for training purposes. It is thus possible to train nursing teams for a specific intervention performed by a virtual surgeon. It is also possible to train a surgeon to practice a specific procedure that he has not yet had the opportunity to perform (alone). In this embodiment, it is proposed to automate the production of virtual training scenarios and to facilitate their exploitation in virtual reality, in fields where the scenarios are complex and variable but must nevertheless be realistic.
  • a scenario is therefore a procedural description that can describe processes at different levels of granularity (typically “complete procedure”, “phase”, “procedural step”, “elementary action”).
  • The model makes it possible to specify the type of action performed (for example, incising), the object on which it is performed (for example, the patient's skin), the instrument used (for example, a scalpel), the actor who intervenes and, if it is a human actor, the part of the body brought into play (for example, the surgeon's right hand).
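  • An elementary action of such a scenario could thus be recorded with the facets just mentioned; the record below is a hypothetical illustration of those fields (granularity, action type, object, instrument, actor, body part), not the model's actual encoding.

```python
# Hypothetical record for one elementary action of a surgical scenario.
elementary_action = {
    "granularity": "elementary action",   # vs. "procedural step", "phase", "complete procedure"
    "action_type": "incising",
    "target": "patient's skin",
    "instrument": "scalpel",
    "actor": "surgeon",
    "body_part": "right hand",            # only relevant for human actors
}
```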
  • the first phase of the method is a descriptive phase in which one or more descriptions of surgical procedures are performed.
  • Surgical procedure descriptions are based on an ontology as a description language. These descriptions can be made by observation, by interviews, from post-operative reports, or automatically. By observation, an operator observes what the actor does and describes his understanding of the scenario. This observation can be made during the surgical procedure or afterwards, from video recordings of the procedure.
  • In interviews, the actor or a group of actors (human, in this case) explains what they usually do, and the interviewer records this description on a suitable device that makes it possible to transcribe the description using the previously defined ontology. Descriptions can also be generated automatically by the use of sensors on actions, actors, instruments or other effectors.
  • the interpretation of the sensor signals generates a description of the procedure.
  • The descriptions can relate to the preoperative view of the procedure (what one plans to do), the intraoperative view (what one is doing) or the postoperative view (what one has done). One can also describe what should have been done (the ideal procedure).
  • the second phase of the method is an analytic phase in which the descriptions are analyzed.
  • The description corpus (corresponding to a single theoretical procedure) is analyzed to create a generic, generative model capable of generating new fictitious examples reproducing the patterns observed in the real examples. This can be done, for example, with a Test & Flip method.
  • Test & Flip nets are a restriction of Petri nets in which all places are Boolean. This excludes processes in which an action is followed by another a number of times that varies from one example to another.
  • There are six possible types of connection between a state (place) and a transition (event): the orthogonal connection (nothing is done), the flip connection (whatever the state of the place, 0 or 1, the state is complemented), the plus connection (if the place is at 0, it is complemented), the minus connection (if the place is at 1, it is complemented), the 0 connection (the state is tested for 0, but nothing is done) and the 1 connection (the state is tested for 1, but nothing is done).
  • The principle of synthesis is to find the Test & Flip net that represents the smallest language containing the language represented by the examples provided as input; that is, to find a set of places and transitions that best represents the process. This is what leads to generalization: when it is not possible to solve the problem directly, states are added to the set of starting processes, which amounts to enlarging the language.
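  • The following Python sketch gives one possible reading of the six connection types and of the firing of a transition over Boolean places; the names, and the convention that the plus, minus and test connections are preconditions for firing, are assumptions, and the synthesis algorithm itself is not reproduced here.

```python
from enum import Enum
from typing import Dict, Optional

class Conn(Enum):
    ORTHOGONAL = "orthogonal"  # nothing is done on the place
    FLIP = "flip"              # complement the place whatever its state
    PLUS = "plus"              # fires only if the place is 0, then complements it (0 -> 1)
    MINUS = "minus"            # fires only if the place is 1, then complements it (1 -> 0)
    TEST0 = "test0"            # fires only if the place is 0, leaves it unchanged
    TEST1 = "test1"            # fires only if the place is 1, leaves it unchanged

def fire(marking: Dict[str, int], connections: Dict[str, Conn]) -> Optional[Dict[str, int]]:
    """Try to fire one transition; `marking` maps each place to 0/1, and `connections`
    gives the connection type between each place and this transition.
    Returns the new marking, or None if the transition is not enabled."""
    new = dict(marking)
    for place, conn in connections.items():
        value = marking.get(place, 0)
        if conn is Conn.ORTHOGONAL:
            continue
        if conn is Conn.FLIP:
            new[place] = 1 - value
        elif conn is Conn.PLUS:
            if value != 0:
                return None
            new[place] = 1
        elif conn is Conn.MINUS:
            if value != 1:
                return None
            new[place] = 0
        elif conn is Conn.TEST0 and value != 0:
            return None
        elif conn is Conn.TEST1 and value != 1:
            return None
    return new
```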
  • This model includes a number of phases, steps and sub-steps that correspond to a possible procedure, possibly with feedback loops and complementary steps (depending on the number of descriptive records used and on the discrepancies between records). For example, based on a set of descriptions of a corneal transplant procedure, a generic corneal transplant model is created.
  • The proposed modification, which exploits the ontology, makes it possible to insert (and then to use) complementary data within a generic model: for example the fact that an action class can be performed only by certain actor classes, or that some elementary actions require that an elementary action of a certain type has been carried out beforehand, or that certain actions require the use of particular instruments, etc.
  • the instances are connected to classes of the ontology, themselves located in a complete taxonomy, organized by means of axioms expressed in a logical language.
  • entities associated with instances are no longer simple labels but convey rich semantics specific to the domain under consideration.
  • The procedural model obtained thus carries the meaning of the actions that compose it.
  • such a procedural model makes it possible to manage hierarchical aspects and multi-actor aspects, in particular by associating, for a given action or class of action, a particular actor class or actor.
  • The ontology can be formalized in a knowledge representation language belonging to the Description Logic (DL) family, which constitutes a decidable fragment of first-order logic. It can be expressed using OWL (Web Ontology Language), a language that makes it easy to share ontologies on the web.
  • DL Description Logic
  • OWL Web Ontology Language
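  • By way of illustration, a few classes and properties of such an ontology could be declared with the owlready2 Python library as sketched below; the class and property names, as well as the ontology IRI, are hypothetical, and the real domain ontology is of course far richer.

```python
from owlready2 import get_ontology, Thing, ObjectProperty

onto = get_ontology("http://example.org/surgical-procedures.owl")  # placeholder IRI

with onto:
    class Action(Thing): pass
    class Incising(Action): pass          # an action class
    class Actor(Thing): pass
    class Surgeon(Actor): pass            # an actor class allowed to perform Incising
    class Instrument(Thing): pass
    class Scalpel(Instrument): pass

    class performed_by(ObjectProperty):   # links an action to the actor who performs it
        domain = [Action]
        range = [Actor]

    class uses_instrument(ObjectProperty):
        domain = [Action]
        range = [Instrument]
```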
  • The “virtual reality” implementation of a scenario in the form of a virtual training session is based, on the one hand, on a representation, in a virtual environment, of the elements belonging to the ontology classes. This step expresses, in the form of three-dimensional virtual objects and of interaction relations between these objects, the elements involved in the elementary actions of the scenario model (for example objects, instruments, actors).
  • On the other hand, it is based on the direct use of the scenario (produced by the Test & Flip extension) as an interactive simulation control tool, making it possible to place one or several real or virtual users in a collaborative virtual training situation.
  • the method may also include a step of creating a scenario chosen from the procedural model.
  • This scenario will then be followed by the virtual actor or actors.
  • In free mode, the real actor can perform any action and any scenario freely; the virtual actors react in accordance and in coherence with the procedural model.
  • The method thus allows the execution of the whole by establishing a chain which, starting from a 3D interaction of the actor, triggers the advancement of the domain process through a change of state of the scenario engine, which in turn triggers the virtual realization of the action in the 3D environment.
  • The change of state of the scenario engine is effective when the action of the real user is valid, i.e. when it is represented by a transition contained in the space of possible scenarios. In the case of an invalid action, specific handling can be integrated according to the requirements of the domain.
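  • The chain just described (3D interaction, validity check against the possible transitions, state change, virtual realization) could look, very schematically, like the following sketch; all names are hypothetical and the validity test is reduced to membership of the set of currently enabled transitions.

```python
from typing import Callable, Set

class ScenarioEngine:
    """Hypothetical sketch of the state change triggered by a valid user action."""

    def __init__(self, enabled: Callable[[dict], Set[str]], initial_state: dict):
        self.state = initial_state
        self.enabled = enabled                    # returns the transitions allowed in a given state

    def on_user_action(self, action: str,
                       apply_transition: Callable[[dict, str], dict],
                       realize_in_3d: Callable[[str], None]) -> bool:
        """Advance the domain process if the 3D interaction matches an enabled transition."""
        if action not in self.enabled(self.state):
            # invalid action: domain-specific handling (warning, guidance, ...) would go here
            return False
        self.state = apply_transition(self.state, action)   # change of state of the scenario engine
        realize_in_3d(action)                                # virtual realization of the action in 3D
        return True
```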
  • A device used to obtain a simulation of a scenario, in accordance with the previously described method, is described below.
  • a device may be in the form of a module as previously described.
  • the device comprises a memory 31 consisting of a buffer memory, a processing unit 32, equipped for example with a microprocessor, and driven by the computer program 33, implementing a simulation method.
  • the code instructions of the computer program 33 are for example loaded into a memory before being executed by the processor of the processing unit 32.
  • the processing unit 32 receives as input a description of the virtual environment, a procedural model and / or a scenario.
  • The microprocessor of the processing unit 32 implements the steps of the method, according to the instructions of the computer program 33, to generate data representative of the actions to be performed or already performed by real or virtual users of the simulation in order to conduct the scenario or the planned intervention.
  • the device comprises, in addition to the buffer memory 31, communication means, such as network communication modules, data transmission means and possibly an encryption processor.
  • communication means such as network communication modules, data transmission means and possibly an encryption processor.
  • These means may take the form of a particular processor implemented within the device, said processor being a secure processor. According to one particular embodiment, this device implements a particular application which is in charge of the calculations.
  • such a device comprises:
  • a module configured to obtain (10) a data structure (SDREO) representative of a set of objects (EO) of said virtual environment (EV), said data structure further comprising at least one interaction relation between at least two objects of said set of objects;
  • SDREO representative data structure
  • EO set of objects
  • EV virtual environment
  • a module configured to receive (20) at least one scenario (S) representative of a procedure to be simulated, said scenario (S) being defined from a descriptive ontology, said at least one scenario being obtained from at least one previously determined procedural model (PM);
  • the rendering module (30) is driven by a virtual reality engine (31).
  • a virtual reality engine manages the immersive display (using the aforementioned devices) and the actions of the users; it takes as input the physical model of the virtual environment as well as a procedural model and / or a scenario.
  • The system also includes interaction devices (sensors, cameras, etc.) and sensory reproduction devices (image generators, screens, sound cards, speakers, etc.).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Health & Medical Sciences (AREA)
  • Pure & Applied Mathematics (AREA)
  • Medicinal Chemistry (AREA)
  • Medical Informatics (AREA)
  • Computational Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)
EP15804471.9A 2014-12-03 2015-12-02 Système de simulation, dispositifs, méthodes et programmes correspondants Ceased EP3227879A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1461848A FR3029662A1 (fr) 2014-12-03 2014-12-03 Systeme de simulation, dispositifs, methodes et programmes correspondants.
PCT/EP2015/078431 WO2016087555A1 (fr) 2014-12-03 2015-12-02 Système de simulation, dispositifs, méthodes et programmes correspondants

Publications (1)

Publication Number Publication Date
EP3227879A1 true EP3227879A1 (fr) 2017-10-11

Family

ID=53298426

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15804471.9A Ceased EP3227879A1 (fr) 2014-12-03 2015-12-02 Système de simulation, dispositifs, méthodes et programmes correspondants

Country Status (5)

Country Link
US (1) US10282910B2 (zh)
EP (1) EP3227879A1 (zh)
CN (1) CN107209789B (zh)
FR (1) FR3029662A1 (zh)
WO (1) WO2016087555A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10520581B2 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Sensor fusion for autonomous or partially autonomous vehicle control
US20170242443A1 (en) 2015-11-02 2017-08-24 Peloton Technology, Inc. Gap measurement for vehicle convoying
US8744666B2 (en) 2011-07-06 2014-06-03 Peloton Technology, Inc. Systems and methods for semi-autonomous vehicular convoys
US10474166B2 (en) 2011-07-06 2019-11-12 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US10254764B2 (en) 2016-05-31 2019-04-09 Peloton Technology, Inc. Platoon controller state machine
US11294396B2 (en) 2013-03-15 2022-04-05 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US10369998B2 (en) 2016-08-22 2019-08-06 Peloton Technology, Inc. Dynamic gap control for automated driving
EP3500940A4 (en) 2016-08-22 2020-03-18 Peloton Technology, Inc. AUTOMATED CONNECTED VEHICLE CONTROL SYSTEM ARCHITECTURE
WO2019051492A1 (en) * 2017-09-11 2019-03-14 Cubic Corporation TOOLS AND ARCHITECTURE OF IMMERSIVE VIRTUAL ENVIRONMENT (IVE)
CN109543349B (zh) * 2018-12-21 2023-10-24 核动力运行研究所 一种核电模拟机多外挂集成方法
CN111613122A (zh) * 2020-05-19 2020-09-01 威爱医疗科技(中山)有限公司 虚实融合的血管介入手术模拟系统
CN111833462B (zh) * 2020-07-14 2024-05-17 深圳市瑞立视多媒体科技有限公司 基于虚幻引擎的切割方法、装置、设备及存储介质
IT202000020347A1 (it) * 2020-08-24 2022-02-24 Biagi Lorenzo Metodo e relativo strumento per personalizzazione di riesecuzione di sequenze video in un mondo virtuale

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1605420A3 (fr) * 2004-06-09 2010-05-05 Nexter Training Système de formation à l'exploitation, l'utilisation ou la maintenance d'un cadre de travail dans un environnement de realité virtuelle
US8615383B2 (en) * 2008-01-18 2013-12-24 Lockheed Martin Corporation Immersive collaborative environment using motion capture, head mounted display, and cave
CN101777271A (zh) * 2009-01-13 2010-07-14 深圳市科皓信息技术有限公司 应急培训演练方法和系统
US9345957B2 (en) * 2011-09-30 2016-05-24 Microsoft Technology Licensing, Llc Enhancing a sport using an augmented reality display
US10824310B2 (en) * 2012-12-20 2020-11-03 Sri International Augmented reality virtual personal assistant for external representation
CN103678873A (zh) * 2013-09-30 2014-03-26 广州供电局有限公司 化学实验室危险品管理应急预案协同仿真方法及其系统
CN103838922A (zh) * 2014-02-17 2014-06-04 国网吉林省电力有限公司培训中心 安全行为仿真培训系统

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2016087555A1 *

Also Published As

Publication number Publication date
US20170372521A1 (en) 2017-12-28
CN107209789A (zh) 2017-09-26
US10282910B2 (en) 2019-05-07
CN107209789B (zh) 2021-06-04
FR3029662A1 (fr) 2016-06-10
WO2016087555A1 (fr) 2016-06-09

Similar Documents

Publication Publication Date Title
EP3227879A1 (fr) Système de simulation, dispositifs, méthodes et programmes correspondants
EP2593909A1 (fr) Processeur d'analyse situationnelle
Bonnail et al. Memory manipulations in extended reality
Petre Mental imagery and software visualization in high-performance software development teams
CN111886608A (zh) 以用户为中心的人工智能知识库
Bollinger et al. Multimodel ecologies: cultivating model ecosystems in industrial ecology
Scherr et al. Acceptance testing of mobile applications: Automated emotion tracking for large user groups
KR20220156870A (ko) 확장 현실 레코더
White et al. Memory Tracer & Memory Compass: Investigating Personal Location Histories as a Design Material for Everyday Reminiscence
WO2023286087A1 (en) Providing personalized recommendations based on users behavior over an e-commerce platform
Fleck et al. Teegi, he's so cute: example of pedagogical potential testing of an interactive tangible interface for children at school
Kotrotsios Data, New Technologies, and Global Imbalances: Beyond the Obvious
Gjøsæter Interaction with mobile augmented reality: An exploratory study using design research to investigate mobile and handheld augmented reality
Aminolroaya et al. Watch the videos whenever you have time: Asynchronously involving neurologists in vr prototyping
Gregor et al. Designing knowledge interface systems: past, present, and future
Jabri A Novel Approach for Generating SPARQL Queries from RDF Graphs
Bill et al. URRBP: Workflow for AI Accessibility Through Expert-Driven Prompt Engineering and Intelligent Model Management
Carrascosa et al. Virtual environments 4 mas
Hao Empirically Validating the Threshold Distribution Assumption in Complex Contagion
Caffiau et al. Natural language generation to support the understanding of task models: a preliminary study
Caballero Emotion detection, processing and response through human-machine interaction system modelling
Roh et al. SIG-Net: GNN based dropout prediction in MOOCs using Student Interaction Graph
Azubuike et al. West African Journal of Industrial & Academic Research Vol. 19 No. 2. April 31, 2018 Engineering Research and Production Page No
Heldal et al. Limits of Model Transformations for Embedded Software
FR3118815A1 (fr) Estimation de la progression de l'exécution d'une tâche logicielle

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20170529

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: CAILLAUD, BENOIT

Inventor name: JANNIN, PIERRE

Inventor name: LES AUTRES INVENTEURS ONT RENONCE AU DROIT D'ETRE

Inventor name: GIBAUD, BERNARD

Inventor name: ARNALDI, BRUNO

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20180327

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20200501