EP4349006A1 - Mixed reality communication method, communication system, computer program and information carrier - Google Patents

Mixed reality communication method, communication system, computer program and information carrier

Info

Publication number
EP4349006A1
Authority
EP
European Patent Office
Prior art keywords
virtual
user
terminal
communication
virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22732605.5A
Other languages
English (en)
French (fr)
Inventor
Guillaume Bataille
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orange SA
Original Assignee
Orange SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orange SA filed Critical Orange SA
Publication of EP4349006A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Definitions

  • TITLE: Mixed reality communication method, communication system, computer program and information carrier
  • the present invention relates to a mixed reality remote communication method and an associated communication system.
  • the invention also relates to a computer program and information carrier.
  • Mixed reality remote collaboration tools generally use a first-person view of the remote person's environment, which degrades the quality of interpersonal communication because the users cannot see their collaborator. Indeed, interpersonal communication is considered optimal when the interlocutors are face to face and can see each other's facial expressions.
  • an example of a virtual reality headset is the headset marketed under the "HoloLens" trademark.
  • This virtual reality headset makes it possible to generate avatars which are reproduced in a virtual environment superimposed on the real environment of the physical person wearing the virtual reality headset.
  • This headset captures the position of the head, hands and eyes and generates holographic images that make the avatar move according to the captured positions.
  • the faces of the avatars, however, remain fixed.
  • a first object of the present invention is to propose a communication method facilitating remote communication.
  • a second object of the present invention is to propose a communication method facilitating the training of at least one person at a distance with respect to an object to be manipulated.
  • the subject of the invention is a method of communication between at least a first mixed reality terminal and a second mixed reality terminal via a communication network, the method comprising at least the following steps:
  • the first terminal transmits to the second terminal an object which can be modified by the latter.
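  • By way of illustration only, this exchange can be modeled as follows: the first terminal sends a virtual object that the second terminal may then modify. This is a minimal sketch; all names (VirtualObject, Terminal, send, modify) are assumptions for the example, not elements of the patent.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    object_id: str
    frames: list[bytes]      # series of holographic images (opaque payloads)
    editable: bool = True    # the receiving terminal may modify the object

class Terminal:
    def __init__(self, name: str):
        self.name = name
        self.received: dict[str, VirtualObject] = {}

    def send(self, obj: VirtualObject, peer: "Terminal") -> None:
        # In the patent the transfer goes through a communication network;
        # a direct call stands in for the network hop here.
        peer.received[obj.object_id] = obj

    def modify(self, object_id: str, new_frames: list[bytes]) -> VirtualObject:
        obj = self.received[object_id]
        if not obj.editable:
            raise PermissionError("object is not modifiable")
        obj.frames = new_frames
        return obj

# the first terminal transmits an object; the second terminal modifies it
t1, t2 = Terminal("first"), Terminal("second")
t1.send(VirtualObject("I2", frames=[b"frame-0"]), t2)
t2.modify("I2", new_frames=[b"frame-0", b"frame-1"])
```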
  • the features set out in the following paragraphs can optionally be implemented. They can be implemented independently of each other or in combination with each other.
  • the first and second mixed reality terminals respectively comprise at least a first and a second mixed reality headset and at least a first and a second sensor, the method comprising the steps of:
  • the second user P2 can remotely manipulate the virtual object representing the real object.
  • the fact that the second user P2 can manipulate a virtual representation of the real object helps the first user P1 acquire the gesture in the same way as if the two people were physically in the same room.
  • the second user P2 can intervene remotely and prevent a possible accident.
  • the second user can teach the first user even if the second user does not have the real object.
  • the method further comprises the following steps:
  • the first user P1 can watch the second user P2 manipulate the real object and adapt his learning according to what he sees. For example, an apprentice sees the manipulations of his master on the third virtual object and can try to reproduce them on the real object.
  • the second virtual object is projected between the at least one first virtual object and the second virtual reality headset.
  • the second virtual object I2 is projected within easy reach of the second user P2 so that the second user can manipulate the virtual representation of the real object if necessary.
  • the method further comprises the steps of:
  • the first user sees the second user P2, who is remote.
  • the first virtual character represents a person from the front.
  • communication is improved. As the first user is viewed from the front, the second user can see his facial expressions, understand his emotional state, and adapt his communication accordingly.
  • the second remote user does not view an avatar but holographic images representing the first user.
  • the first user can replay the training that she has received in order to memorize it.
  • the method further comprises the steps of:
  • the second user can review manipulations previously carried out by the first user in order to indicate, for example, the moment when the manipulation must be improved.
  • At least one virtual object consists of a series of holographic images.
  • the invention also relates to a communication system comprising a communication network, a first communication terminal and at least one second communication terminal, the communication system being capable of implementing the method mentioned above.
  • the invention also relates to a computer program comprising instructions for the implementation of a communication method mentioned above, when it is loaded and executed by a microprocessor.
  • the subject of the invention is an information medium readable by a microprocessor, comprising the instructions of a computer program for implementing a communication method mentioned above.
  • FIG. 1 is a diagram showing a communication system according to the present invention.
  • FIG. 2 is a diagram representing the steps of the communication method of the invention.
  • FIG. 3 is a diagram representing optional additional steps of the communication method according to the invention.
  • FIG. 4 is a diagram representing optional additional steps of the communication method according to the invention.
  • the communication system and the communication method have been represented and described with two users.
  • the invention is applicable to all data exchanges with a larger number of users.
  • the first user is the user to be trained and the second user is the trainer.
  • the communication method can also be applied in a context in which the first user is the trainer and the second user is the trained user.
  • the communication system also allows an exchange of audio data through a microphone mounted in the virtual reality headsets. This exchange of audio data is known per se and is not described in this patent application.
  • an example of a communication system 2 capable of implementing the invention comprises a first communication terminal 6 or communication server, a communication network 8, and at least one second communication terminal 12 or a communications server.
  • the first communication terminal 6 is for example constituted by at least a first sensor 4 and a first virtual reality headset 7 worn by the first user P1.
  • the second communication terminal 12 is for example constituted by at least a second sensor 10 and a second virtual reality headset 13 worn by the at least one second remote user P2.
  • the first communication terminal 6 and the second communication terminal 12 respectively comprise first and second sensors.
  • the first sensors 4 are suitable for capturing images representing the first user P1 and a real object 14 located close to the first user P1 and which he may possibly manipulate.
  • the first sensors are able to capture very accurately the movements of the chest, arms, hands and fingers, as well as the facial expression of the person wearing them.
  • the first sensors may optionally comprise at least one camera capable of capturing images representing the second user P2.
  • the first sensors may further comprise other cameras with other viewing angles and/or motion sensors.
  • the first sensors further comprise a communication interface which allows them to transmit the digital data captured to the first communication terminal 6.
  • the second sensors 10 are suitable for capturing images representing the second user P2 and data representing the gestures and movements of the second user. They are also suitable for transmitting the digital data picked up to the second communication terminal 12, via the communication network 8. They are identical or similar to the first sensors and will not be described again.
  • the first virtual reality headset 7 comprises on the one hand a processor and, on the other hand, a camera, a memory, a coherent light source and microphones connected to the processor.
  • the processor comprises a synthetic holographic image generator, a network interface and a communication interface capable of receiving data from the first sensors.
  • the synthetic holographic image generator is suitable for generating virtual objects which consist of a series of holographic images.
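  • As a hedged sketch of this idea, a virtual object can be modeled as a timestamped series of holographic frames produced by a generator; the HolographicFrame type and the function signature below are illustrative assumptions, not the patent's interfaces.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HolographicFrame:
    timestamp_ms: int   # display time of this frame within the series
    payload: bytes      # encoded holographic image

def generate_virtual_object(samples: list[bytes],
                            frame_interval_ms: int = 16) -> list[HolographicFrame]:
    """Turn captured samples into a regularly timed series of holographic frames."""
    return [HolographicFrame(i * frame_interval_ms, s)
            for i, s in enumerate(samples)]

frames = generate_virtual_object([b"capture-0", b"capture-1", b"capture-2"])
```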
  • the first virtual reality headset 7 is similar to the virtual reality headsets marketed under the registered trademark "HoloLens". It will not be described in detail.
  • the second virtual reality headset 13 is similar to the first virtual reality headset. It has its own communication interface to receive data from the second sensors.
  • Mixed reality is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects coexist and interact in real time.
  • Mixed reality does not take place exclusively in the physical or virtual world, but is a hybrid of reality and virtual reality, encompassing both augmented reality and augmented virtuality through immersive technology.
  • the communication network 8 is for example an Internet network.
  • the exchanges comply with the Internet protocol.
  • a user is a real person, as opposed to a virtual person, who is either an avatar of the real person or a hologram of the real person.
  • the first user P1 and the second user P2 are physical persons, adults or children, capable of wearing a virtual reality headset and the first or second sensors.
  • the first user P1 can manipulate a real object 14.
  • This real object can for example be a new machine tool to be implemented in a company or in a production site.
  • This real object 14 can also be a musical instrument such as a violin or any other accessory.
  • the second user is located remotely from the first user.
  • the term "remote" can be understood as a sufficiently large distance so that the first and the second user cannot hear and see each other.
  • the first user can be located in one country and the second user in a different country several thousand kilometers from the first user.
  • the first user and the second user can be in the same building but in different rooms.
  • the communication method begins with a step 18 during which the first virtual reality headset 7 is parameterized with respect to the vision and the optical characteristics of the first user P1 who wears it.
  • the manner of implementing this parameter setting is known per se. It essentially consists of detecting the pupil of the wearer of the virtual reality headset in order to be able to generate and project virtual characters and virtual objects in his field of view at the desired distance.
  • Similarly, the second virtual reality headset 13 is parameterized with respect to the vision and optical characteristics of the second user P2 who wears it.
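  • The patent only states that the pupils are detected so that content can be projected at the desired distance; the following sketch shows one standard way such a parameterization could be computed (interpupillary distance and vergence angle). The formula and all names are assumptions, not taken from the description.

```python
import math

def vergence_angle_rad(ipd_m: float, focus_distance_m: float) -> float:
    """Angle each eye rotates inward to fixate a point at the given distance."""
    return math.atan2(ipd_m / 2.0, focus_distance_m)

def calibrate_headset(pupil_left_x_m: float, pupil_right_x_m: float,
                      desired_distance_m: float) -> dict:
    """Derive per-wearer projection parameters from the detected pupil positions."""
    ipd = abs(pupil_right_x_m - pupil_left_x_m)
    return {
        "ipd_m": ipd,
        "vergence_rad": vergence_angle_rad(ipd, desired_distance_m),
        "projection_distance_m": desired_distance_m,
    }

print(calibrate_headset(-0.032, 0.032, desired_distance_m=1.0))
```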
  • the method continues with a step 20 during which the first sensors 4 capture data relating to the first user P1, in particular data representing the first user P1 and his movements, as well as data representing the real object 14 that the first user P1 may be manipulating.
  • the second sensors 10 capture data relating to the second user P2, in particular data representing the second user P2.
  • the data captured by the first sensors 4 are transmitted to the first virtual reality headset 7.
  • the first virtual reality headset 7 generates, at a given instant T1 and from the data captured by the first sensors, a first virtual character SP1 representing the first user P1, a first virtual object I1 representing the real object 14, and a second virtual object I2 also representing the real object 14.
  • the second virtual object I2 is identical to the first virtual object I1 at the given instant T1.
  • the first virtual character SP1 represents the first user P1.
  • the first virtual character SP1 represents the first user P1 from the front.
  • the first virtual character SP1 represents an avatar prerecorded in the memory and selected by the first user.
  • the data captured by the first sensors 4 are transmitted to the second virtual reality headset 13, and it is the second virtual reality headset 13 which generates the first virtual character SP1, the first virtual object I1 and the second virtual object I2.
  • the first virtual character SP1, the first virtual object I1 and the second virtual object I2 are transmitted from the first communication terminal 6, and in particular from the first headset 7, to the second communication terminal 12.
  • the second sensors 10 transmit the data captured to the second virtual reality headset 13.
  • the second virtual reality headset 13 generates a second virtual character SP2 representing the second user P2.
  • the second terminal transmits to the first terminal 6, and in particular to the first headset 7, the second generated virtual character SP2.
  • the data captured by the second sensors 10 are transmitted to the first virtual reality headset 7, and it is the first virtual reality headset 7 which generates the second virtual character SP2.
  • the second virtual reality headset 13 projects the first virtual character SP1 representing the first user, the first virtual object I1 and the second virtual object I2 representing the real object at a given instant T1.
  • the second virtual object I2 is projected between the first virtual object I1 and the second virtual reality headset.
  • the second virtual object I2 is projected at a distance less than or equal to 1 meter from the second virtual reality headset.
  • the second virtual object I2 is projected within reach of the second user so that the second user (the trainer) sees very clearly and can, if necessary, manipulate the virtual representation of the real object.
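  • A possible geometric reading of this placement, as a sketch only (the vector math below is an assumption, not the patent's algorithm): I2 is positioned on the line from the second headset toward the first virtual object I1, at a distance clamped to at most 1 meter from the headset.

```python
import numpy as np

def place_second_object(headset_pos: np.ndarray,
                        first_object_pos: np.ndarray,
                        max_reach_m: float = 1.0) -> np.ndarray:
    """Place I2 between the headset and I1, no farther than max_reach_m away."""
    direction = first_object_pos - headset_pos
    dist = float(np.linalg.norm(direction))
    if dist == 0.0:
        return headset_pos.copy()
    step = min(max_reach_m, dist / 2.0)  # stay between the headset and I1
    return headset_pos + direction / dist * step

# headset at eye height, I1 three meters in front of the second user
i2_pos = place_second_object(np.array([0.0, 1.6, 0.0]),
                             np.array([0.0, 1.2, 3.0]))
```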
  • the second sensors 10 capture data representative of the movements of the second user P2.
  • the second communication terminal 12, and in particular the second headset 13, generates at a later instant T2 a third virtual object I3 by modifying the second virtual object I2 to represent the real object as manipulated by the second user P2.
  • the third virtual object I3 is generated from data captured by the first sensors 4 and data captured by the second sensors 10.
  • the second user is, in this example, the trainer.
  • the second sensors pick up the movements of the second user P2, and in particular the movements of his arms and fingers.
  • the data representative of his gestures and movements are transmitted to the processor of the second virtual reality headset 13. From these data, the processor calculates the successive positions the real object would have if it were manipulated by these movements.
  • the second virtual reality headset 13 thus generates the third virtual object I3, representing the real object manipulated by the second user P2, from the data captured by the second sensors, so as to virtually represent the movements the real object would make if it were subjected to the manipulations picked up by the second sensors 10.
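  • One way this calculation could work, sketched under the assumption that poses are 4x4 homogeneous transforms and that a grasp detector is available (neither is specified in the patent): while the hand is grasping, the virtual object rigidly inherits the hand's frame-to-frame motion.

```python
import numpy as np

def follow_hand(object_pose: np.ndarray,
                hand_poses: list[np.ndarray],
                grasping: list[bool]) -> list[np.ndarray]:
    """Successive poses of the virtual object while it follows the grasping hand."""
    poses, current = [], object_pose
    for prev, cur, grab in zip(hand_poses, hand_poses[1:], grasping[1:]):
        if grab:
            # rigid attachment: apply the hand's motion (cur relative to prev)
            current = cur @ np.linalg.inv(prev) @ current
        poses.append(current.copy())
    return poses
```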
  • the second virtual object I2 represents the real object manipulated by the second user P2.
  • the second communication terminal 12, and in particular the second headset 13, transmits to the first communication terminal 6, and in particular to the first headset 7, the data making it possible to generate the third virtual object I3 representing the virtual object manipulated by the second user P2.
  • the second virtual reality headset 13 projects the third virtual object I3 generated during step 34, as well as the first virtual character SP1 representing the first user and the first virtual object I1 representing the real object.
  • the first virtual reality headset 7 projects the third virtual object I3 representing the real object, from the data transmitted by the second communication terminal during step 36, as well as the second virtual character SP2 representing the second user manipulating the real object, from the data generated during step 25.
  • the first user P1 thus sees the second user P2 (the trainer), who is at a distance and who manipulates the real object.
  • the second user P2 can remotely manipulate the second virtual object I2 representing the real object.
  • the fact that the second user can manipulate a virtual representation of the real object helps the first user P1 acquire the gesture in the same way as if the two people were physically in the same room.
  • the second user P2 can intervene remotely and prevent a possible accident. Steps 20 to 40 all take place simultaneously during the communication and the exchanges between the first user P1 and the second user P2.
  • the communication method optionally includes a step 48 in which the first user P1 requests a recording.
  • This request can be made by a voice command from the first user or by the first user pressing a button on the first communication terminal 6.
  • Step 48 continues with a step 50 of recording, in the memory of the first communication terminal 6, the second virtual character SP2 representing the second user and the third virtual object I3 representing the moving object.
  • Step 50 can subsequently be followed by a step 52 of projection, by the first virtual reality headset 7, of the recorded second virtual character SP2 and virtual object.
  • the first user can thus replay the training that she has received in order to memorize it.
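  • The recording and replay of steps 48 to 52 could be sketched as follows; the storage format and pacing logic are assumptions for illustration. Frames of the second virtual character SP2 and of the third virtual object I3 are appended during the session, then replayed with their original timing.

```python
import time

class SessionRecorder:
    def __init__(self) -> None:
        self._track: list[tuple[float, object, object]] = []

    def record(self, sp2_frame: object, i3_frame: object) -> None:
        """Store one timestamped pair of SP2 and I3 frames."""
        self._track.append((time.monotonic(), sp2_frame, i3_frame))

    def replay(self, project) -> None:
        """Call project(sp2_frame, i3_frame) for each pair, keeping the pacing."""
        if not self._track:
            return
        first_stamp = self._track[0][0]
        start = time.monotonic()
        for stamp, sp2, i3 in self._track:
            delay = (stamp - first_stamp) - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            project(sp2, i3)
```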
  • the communication method optionally includes a step 54 in which the second user P2 requests a recording.
  • This request can be made by a voice command from the second user or by the second user pressing a button on the second communication terminal 12.
  • Step 54 continues with a step 56 of recording, in the memory of the second communication terminal 12, the first virtual character SP1 representing the first user P1 and the first virtual object I1 representing the real object.
  • Step 56 can optionally continue with a step 58 of projection, by the second communication terminal 12, of the recorded first virtual character SP1 and virtual object.
  • the second user P2 can review manipulations previously carried out by the first user P1 in order to indicate, for example, the moment when the manipulation must be improved.
  • at least the first virtual character SP1 or the second virtual character SP2 consists of a series of holographic images.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Information Transfer Between Computers (AREA)
EP22732605.5A 2021-05-31 2022-05-30 Mixed reality communication method, communication system, computer program and information carrier Pending EP4349006A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2105727A FR3123532A1 (fr) 2021-05-31 2021-05-31 Mixed reality communication method, communication system, computer program and information carrier
PCT/FR2022/051009 WO2022254135A1 (fr) 2021-05-31 2022-05-30 Mixed reality communication method, communication system, computer program and information carrier

Publications (1)

Publication Number Publication Date
EP4349006A1 2024-04-10

Family

ID=77411808

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22732605.5A Pending EP4349006A1 (de) Mixed reality communication method, communication system, computer program and information carrier

Country Status (3)

Country Link
EP (1) EP4349006A1 (de)
FR (1) FR3123532A1 (de)
WO (1) WO2022254135A1 (de)

Also Published As

Publication number Publication date
WO2022254135A1 (fr) 2022-12-08
FR3123532A1 (fr) 2022-12-02

Similar Documents

Publication Publication Date Title
CN107924584B (zh) Augmented reality
EP3096208B1 (de) Image processing for head-mounted display devices
CN105264460B (zh) Holographic object feedback
JP2022502800A (ja) Systems and methods for augmented reality
CA2942652C (fr) Virtual three-dimensional simulation system for generating a virtual environment bringing together a plurality of users, and associated method
WO2014168994A1 (en) Personal holographic billboard
JP6550522B1 (ja) Video distribution system, video distribution method and video distribution program
TW201228332A (en) Mobile electronic device
TW202141120A (zh) Head-mounted device with adjustable image sensing module and system thereof
CN114207557A (zh) Position synchronization of virtual and physical cameras
CN115413353A (zh) Extended reality recorder
EP3651057B1 (de) Method for facial recognition of a wristwatch wearer
EP3983870B1 (de) Digital mission preparation system
EP4349006A1 (de) Mixed reality communication method, communication system, computer program and information carrier
WO2018178560A1 (fr) Method for selecting at least one image part to be downloaded in advance in order to render an audiovisual stream
WO2017187095A1 (fr) Device and method for sharing immersion in a virtual environment
FR3081572A1 (fr) Method and system for authenticating a user wearing an immersive device
FR3056770A1 (fr) Device and method for sharing immersion in a virtual environment
FR3043818B1 (fr) System for creating, transmitting and interpreting a composite stream, and associated methods
WO2019138186A1 (fr) Improved device and method for communicating sound information to a user in augmented reality
EP1124212B1 (de) 3D visual presentation method and apparatus for a car simulator
US11924317B1 (en) Method and system for time-aligned media playback
WO2018011497A1 (fr) System and method for on-board capture and 3D/360° reproduction of the movement of an operator in his environment
US11671254B2 (en) Extended reality authentication
US20240104871A1 (en) User interfaces for capturing media and manipulating virtual objects

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231116

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR