WO2020101197A1 - Method and system for sharing augmented reality content - Google Patents

Method and system for sharing augmented reality content

Info

Publication number
WO2020101197A1
Authority
WO
WIPO (PCT)
Prior art keywords
user terminal
coordinates
shared
coordinate system
augmented reality
Prior art date
Application number
PCT/KR2019/013848
Other languages
English (en)
Korean (ko)
Inventor
서석은
정연운
최필균
Original Assignee
유엔젤주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 유엔젤주식회사
Publication of WO2020101197A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Definitions

  • The present invention relates to an augmented reality content sharing method and system.
  • Augmented reality (AR) is a field of virtual reality: a computer graphics technique that synthesizes virtual objects or information into a real environment so that they appear to be objects existing in that environment. Because it shows the virtual world together with additional information in a single real-time image, it is also referred to as mixed reality (MR).
  • In Korean Registered Patent No. 1,788,248, when many users gather in a real space in which augmented reality content is reproduced and share the augmented reality content using their own user terminals, each terminal applies the content without a shared world coordinate system for that space, so the augmented reality content could appear at different locations for each user.
  • The present invention has been devised in consideration of the above situation. The technical problem of the present invention is to provide an augmented reality content sharing method and system that allow a shared object, which is augmented reality content, to be placed at the same location in a real space, and that can synchronize the direction in which the shared object faces.
  • The augmented reality content sharing method according to the present invention for solving the above technical problem comprises: a step in which a first user terminal and a second user terminal recognize a reference object located in a real space in which augmented reality content is reproduced; a step of obtaining a first reference coordinate corresponding to the location of the reference object from the reference point of the local coordinate system of the first user terminal; a step of obtaining a second reference coordinate corresponding to the location of the reference object from the reference point of the local coordinate system of the second user terminal; a step of obtaining sharing coordinates by converting the coordinates of the object to be shared in the local coordinate system of the first user terminal based on the first reference coordinate; and a step in which the second user terminal places the object to be shared at coordinates obtained by applying the second reference coordinate to the sharing coordinates transmitted from the first user terminal.
  • The method may further include obtaining a first reference rotation value corresponding to the direction of the reference object in the local coordinate system of the first user terminal, and obtaining a second reference rotation value corresponding to the direction of the reference object in the local coordinate system of the second user terminal.
  • The step of obtaining the sharing coordinates may include obtaining coordinates by applying the first reference rotation value to the coordinates of the object to be shared in the local coordinate system of the first user terminal, and obtaining the sharing coordinates by subtracting the first reference coordinate from the coordinates to which the first reference rotation value has been applied.
  • The step of placing the object to be shared may include obtaining coordinates by applying the second reference rotation value to the sharing coordinates in the local coordinate system of the second user terminal, and placing the object to be shared at the coordinates obtained by adding the second reference coordinate to the coordinates to which the second reference rotation value has been applied.
  • The step in which the first user terminal and the second user terminal recognize the reference object located in the real space in which the augmented reality content is reproduced may include recognizing an object of a predetermined shape, displaying an indication of the recognized object of the predetermined shape on the screen, and receiving confirmation from the user that the recognized object of the predetermined shape is the reference object.
  • The augmented reality content sharing system for solving the above technical problem includes a first user terminal that recognizes a reference object located in a real space in which augmented reality content is reproduced, and a second user terminal that recognizes the reference object.
  • The first user terminal obtains a first reference coordinate corresponding to the location of the reference object from the reference point of the local coordinate system of the first user terminal, and obtains sharing coordinates by converting the coordinates of an object to be shared in its local coordinate system based on the first reference coordinate. The second user terminal obtains a second reference coordinate corresponding to the location of the reference object from the reference point of the local coordinate system of the second user terminal, and places the object to be shared at coordinates obtained by applying the second reference coordinate to the sharing coordinates transmitted from the first user terminal.
  • A computer-readable recording medium may contain a program for performing the augmented reality content sharing method.
  • The program may include an instruction set by which the first user terminal and the second user terminal recognize a reference object located in a real space in which augmented reality content is reproduced; an instruction set for obtaining the first reference coordinate corresponding to the location of the reference object from the reference point of the local coordinate system of the first user terminal and for obtaining sharing coordinates by converting the coordinates of the object to be shared based on the first reference coordinate; and an instruction set by which the second user terminal places the object to be shared at coordinates obtained by applying the second reference coordinate to the sharing coordinates transmitted from the first user terminal.
  • According to the present invention, a shared object, which is augmented reality content, can be placed at the same location in a real space, and the direction in which the shared object faces can be synchronized.
  • FIG. 1 is a block diagram showing the configuration of an augmented reality content sharing system according to an embodiment of the present invention.
  • FIG. 2 illustrates that a target object, which is augmented reality content, is displayed at different locations for each user terminal.
  • FIG. 3 is a flowchart provided to describe the operation of the augmented reality content sharing system according to an embodiment of the present invention.
  • FIG. 4 illustrates that an object to be shared, which is augmented reality content, is displayed at the same location for each user terminal in the augmented reality content sharing system according to an embodiment of the present invention.
  • FIG. 5 is a view provided to explain matters to be considered for the direction of a reference object according to another embodiment of the present invention.
  • FIG. 6 is a flowchart provided to describe the operation of the augmented reality content sharing system according to another embodiment of the present invention.
  • FIG. 1 is a block diagram showing the configuration of an augmented reality content sharing system according to an embodiment of the present invention.
  • The augmented reality content sharing system may include a plurality of user terminals 100a, 100b, ..., 100n capable of sharing augmented reality content, and a server 200 capable of providing augmented reality content to the plurality of user terminals 100a, 100b, ..., 100n.
  • the plurality of user terminals 100a, 100b, ..., 100n and the server 200 may exchange various information and data through the communication network 10.
  • The communication network 10 may include a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), the Internet, and the like; it does not matter which communication method is used.
  • The plurality of user terminals may be implemented as information communication terminals, such as smartphones, tablet PCs, and head-mounted display (HMD) terminals, that support the computer graphics technique of synthesizing virtual objects or information into the real space so that they appear to be objects existing in that environment.
  • The server 200 provides content based on virtual reality and augmented reality and, at the same time, accepts each user's motion according to the scenario and changes the state of the content accordingly, thereby providing a service using augmented reality and virtual reality.
  • An image of the object to be shared, corresponding to augmented reality content, may be provided according to a request from the user terminals 100a, 100b, ..., 100n.
  • FIG. 2 illustrates that a target object, which is augmented reality content, is displayed at different locations for each user terminal.
  • When a plurality of user terminals 100a, 100b, ..., 100n place augmented reality content based on their own local coordinate systems, without sharing a world coordinate system for the real space, the augmented reality content appears at different locations in the real space, as illustrated in FIG. 2.
  • The reference point of local coordinate system 1 of the user terminal 100a and the reference point of local coordinate system 2 of the user terminal 100b have the same coordinates (0, 0, 0) but occupy different positions in the real space. Therefore, if the user terminal 100a and the user terminal 100b each apply (10, 0, 0) as the coordinates of the object to be shared in their own local coordinate systems, the shared objects S1 and S2 appear to the user of the user terminal 100a (hereinafter 'user 1') and the user of the user terminal 100b (hereinafter 'user 2') to be placed at different locations in the real space.
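The mismatch can be sketched in a few lines of Python; the real-space positions of the two terminal origins are made-up illustrative values, not figures from the patent:

```python
# FIG. 2 situation: both terminals use local coordinates (10, 0, 0) for the object,
# but their local origins sit at different points of the real space, so the object
# lands at two different real-space positions.

def local_to_world(origin_world, p_local):
    # Ignoring rotation for simplicity: world position = terminal origin + local offset.
    return tuple(o + p for o, p in zip(origin_world, p_local))

origin_a = (0, 0, 0)   # hypothetical real-space position of terminal 100a's origin
origin_b = (5, 0, 3)   # hypothetical real-space position of terminal 100b's origin

s1 = local_to_world(origin_a, (10, 0, 0))  # where user 1 sees the object: (10, 0, 0)
s2 = local_to_world(origin_b, (10, 0, 0))  # where user 2 sees the object: (15, 0, 3)
```

The same local coordinates land at two different real-space spots, which is exactly the problem the shared reference object resolves.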
  • FIG. 3 is a flowchart provided to describe the operation of the augmented reality content sharing system according to an embodiment of the present invention. FIG. 4 illustrates that the object to be shared, which is augmented reality content, is displayed at the same location on each user terminal in the augmented reality content sharing system according to an embodiment of the present invention.
  • The user terminal 100a and the user terminal 100b may recognize a reference object R located in a real space in which augmented reality content is reproduced (S305).
  • FIG. 4 shows a case where a trash can is the reference object R.
  • The reference object R may be determined by the users of the user terminal 100a and the user terminal 100b selecting one of the objects located in the real space between them.
  • For example, an object displayed on the screens of both the user terminal 100a and the user terminal 100b may be set as the reference object R.
  • Depending on the embodiment, the shape of the reference object may be determined in advance.
  • In this case, the user terminal 100a and the user terminal 100b recognize an object of the predetermined shape and display an indication on the screen, so that the user can confirm whether it is the reference object.
  • The user terminal 100a may obtain a first reference coordinate corresponding to the position of the reference object R from the reference point of its local coordinate system (hereinafter 'local coordinate system 1') (S310). FIG. 3 shows a case where (10, -10, 10) is obtained as the first reference coordinate.
  • The user terminal 100b may obtain a second reference coordinate corresponding to the position of the reference object R from the reference point of its local coordinate system (hereinafter 'local coordinate system 2') (S315). FIG. 3 shows a case where (10, -10, -10) is obtained as the second reference coordinate.
  • The user terminal 100a may obtain the sharing coordinates by converting the coordinates of the object S to be shared in local coordinate system 1 based on the first reference coordinate, applying Equation 1 (S320).
  • Equation 1: sharing coordinates = coordinates of the object to be shared in local coordinate system 1 - first reference coordinate
  • In the example of FIG. 3, the sharing coordinates (10, 0, 0) can be obtained.
  • The user terminal 100a may transmit the sharing coordinates obtained for the object S to be shared to the user terminal 100b (S325).
  • The sharing coordinates may be transmitted directly through short-range communication between the user terminal 100a and the user terminal 100b, or may be transmitted through the server 200.
  • Graphic image information for rendering the object S to be shared on the screen of the user terminal may also be transmitted together with the sharing coordinates.
  • Alternatively, the sharing coordinates may be exchanged directly between the user terminals while the graphic image information is shared through the server 200.
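As a rough sketch of what such a transmission could carry, the message below bundles the sharing coordinates with a reference to the graphic image. The field names are hypothetical; the patent does not specify a message format:

```python
import json

# Hypothetical payload that terminal 100a could send to terminal 100b, either
# directly over short-range communication or routed through the server 200.
payload = {
    "sharing_coords": [10, 0, 0],      # result of Equation 1
    "object_ref": "shared-object-S",   # key the receiver uses to fetch/render the image
}
message = json.dumps(payload)

# Receiver side: decode and read back the sharing coordinates.
received = json.loads(message)
```

The receiver would then apply its own second reference coordinate to `received["sharing_coords"]` before placing the object.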
  • The user terminal 100b may place the object to be shared at the coordinates obtained by applying the second reference coordinate to the sharing coordinates transmitted from the user terminal 100a, according to Equation 2 (S330).
  • Equation 2: coordinates of the object to be shared in local coordinate system 2 = sharing coordinates + second reference coordinate
  • Since the sharing coordinates are (10, 0, 0) and the second reference coordinate is (10, -10, -10), the coordinates of the object S to be shared in local coordinate system 2 are (20, -10, -10).
  • Accordingly, the location in the real space of the shared object S viewed by user 1 and the location in the real space of the shared object S viewed by user 2 are the same.
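Equations 1 and 2 can be checked with a short sketch. The function names are illustrative, and the object's local-1 coordinates (20, -10, 10) are back-derived from the figures' values rather than stated in the text:

```python
def to_sharing_coords(obj_local1, ref1):
    # Equation 1: sharing coordinates =
    #   coordinates of the object in local coordinate system 1 - first reference coordinate
    return tuple(o - r for o, r in zip(obj_local1, ref1))

def place_in_local2(sharing, ref2):
    # Equation 2: coordinates of the object in local coordinate system 2 =
    #   sharing coordinates + second reference coordinate
    return tuple(s + r for s, r in zip(sharing, ref2))

ref1 = (10, -10, 10)        # first reference coordinate (FIG. 3, user terminal 100a)
ref2 = (10, -10, -10)       # second reference coordinate (FIG. 3, user terminal 100b)
obj_local1 = (20, -10, 10)  # back-derived so that Equation 1 yields (10, 0, 0)

sharing = to_sharing_coords(obj_local1, ref1)  # -> (10, 0, 0)
placed = place_in_local2(sharing, ref2)        # -> (20, -10, -10)
```

In effect, the sharing coordinates express the object's position relative to the reference object, which both terminals can locate.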
  • FIG. 5 is a view provided to explain considerations regarding the direction of a reference object according to another embodiment of the present invention. FIG. 6 is a flowchart provided to describe the operation of the augmented reality content sharing system according to another embodiment of the present invention.
  • The reference object not only occupies a different location but also faces a different direction relative to each user.
  • In FIG. 5, seen from the direction in which user 1 is looking, the reference object R is on the right side and its vertex RP points in the 12 o'clock direction.
  • Seen from user 2's side, the reference object R is on the left side and the vertex RP faces a direction midway between 12 o'clock and 3 o'clock.
  • The user terminal 100a and the user terminal 100b may recognize a reference object R located in a real space in which augmented reality content is reproduced (S605).
  • Here, the first user terminal is the user terminal 100a, and the second user terminal is the user terminal 100b.
  • The user terminal 100a may obtain a first reference coordinate corresponding to the position of the reference object R from the reference point of its local coordinate system (hereinafter 'local coordinate system 1') (S610).
  • The user terminal 100b may obtain a second reference coordinate corresponding to the position of the reference object R from the reference point of its local coordinate system (hereinafter 'local coordinate system 2') (S615).
  • The user terminal 100a may obtain a first reference rotation value corresponding to the direction of the reference object R in local coordinate system 1 (S620).
  • Step S620 may be performed simultaneously with step S610.
  • For example, the direction of the vertex RP in local coordinate system 1 may be obtained as the first reference rotation value.
  • The user terminal 100b may obtain a second reference rotation value corresponding to the direction of the reference object R in local coordinate system 2 (S625).
  • Step S625 may be performed simultaneously with step S615.
  • For example, the direction of the vertex RP in local coordinate system 2 may be obtained as the second reference rotation value.
  • The user terminal 100a obtains the coordinates to which the first reference rotation value is applied to the coordinates of the object to be shared in local coordinate system 1 (S630), and obtains the sharing coordinates by subtracting the first reference coordinate from the coordinates obtained in step S630 (S635). Steps S630 and S635 may be performed according to Equation 3.
  • Equation 3: sharing coordinates = (first reference rotation value × coordinates of the object to be shared in local coordinate system 1) - first reference coordinate
  • Converting the coordinates of the object to be shared in local coordinate system 1 into the sharing coordinates by Equation 3 proceeds as follows: the shared object is rotated by the amount needed to rotate the reference object R so that the vertex RP points in the z-axis direction, then moved by the amount needed to move the reference object R to the reference point, yielding the sharing coordinates.
  • The user terminal 100a may transmit the sharing coordinates obtained for the object to be shared to the user terminal 100b (S640).
  • The user terminal 100b obtains the coordinates to which the second reference rotation value is applied to the sharing coordinates transmitted from the user terminal 100a in local coordinate system 2 (S645), and places the object to be shared at the coordinates obtained by adding the second reference coordinate to the coordinates to which the second reference rotation value has been applied (S650). Steps S645 and S650 may be performed according to Equation 4.
  • Equation 4: coordinates of the object to be shared in local coordinate system 2 = (-second reference rotation value × sharing coordinates) + second reference coordinate
  • Accordingly, to user 1 and user 2, the object to be shared not only appears at the same position in the real space, but the direction in which the object to be shared faces is also reflected consistently.
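Equations 3 and 4 can also be sketched. The patent does not fix a representation for the "reference rotation value"; the sketch below reads it as a yaw angle about the y axis, and the sign conventions (rotating by -theta1 to point the vertex RP along the z axis, and the leading minus in Equation 4 as the inverse rotation) are assumptions. With zero rotation values the sketch reduces to Equations 1 and 2:

```python
import math

def rot_y(theta, p):
    # Rotate point p = (x, y, z) by theta radians about the y axis.
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

def to_sharing_coords_eq3(obj_local1, ref1, theta1):
    # Equation 3: sharing = (first reference rotation value applied to object coords)
    #             - first reference coordinate.
    # Assumption: the rotation that points the vertex RP along the z axis is -theta1
    # about y, where theta1 is the reference object's yaw in local coordinate system 1.
    rx, ry, rz = rot_y(-theta1, obj_local1)
    return (rx - ref1[0], ry - ref1[1], rz - ref1[2])

def place_in_local2_eq4(sharing, ref2, theta2):
    # Equation 4: object coords in local coordinate system 2 =
    #             (-second reference rotation value applied to sharing coords)
    #             + second reference coordinate.
    # Assumption: the leading minus denotes the inverse rotation, i.e. +theta2 here.
    rx, ry, rz = rot_y(theta2, sharing)
    return (rx + ref2[0], ry + ref2[1], rz + ref2[2])

# With zero rotation values, Equations 3 and 4 reduce to Equations 1 and 2:
sharing = to_sharing_coords_eq3((20, -10, 10), (10, -10, 10), 0.0)
placed = place_in_local2_eq4(sharing, (10, -10, -10), 0.0)
# sharing equals (10, 0, 0) and placed equals (20, -10, -10), matching FIG. 3.
```

In practice each terminal would measure its own theta from the recognized pose of the reference object, so the two angles need not be equal.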
  • The embodiments described above may be implemented by hardware components, software components, and/or combinations of hardware and software components.
  • The devices, methods, and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor (DSP), a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • The processing device may run an operating system (OS) and one or more software applications running on the operating system.
  • The processing device may access, store, manipulate, process, and generate data in response to the execution of the software.
  • For convenience of understanding, the processing device may be described as a single device being used, but a person having ordinary skill in the art will appreciate that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • For example, the processing device may include a plurality of processors, or a processor and a controller.
  • Other processing configurations, such as parallel processors, are also possible.
  • The software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or instruct the processing device independently or collectively.
  • Software and/or data may be permanently or temporarily embodied in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
  • The software may be distributed over networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored in one or more computer-readable recording media.
  • The method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium.
  • The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and usable by those skilled in computer software.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Program instructions include not only machine language code produced by a compiler but also high-level language code that can be executed by a computer using an interpreter.
  • The hardware device described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Architecture (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method and system for sharing augmented reality content. The method according to the present invention comprises: a step in which a first user terminal and a second user terminal recognize a reference object located in a real space in which augmented reality content is reproduced; a step of obtaining, from a reference point of a local coordinate system of the first user terminal, a first reference coordinate corresponding to the position of the reference object; a step of obtaining, from a reference point of a local coordinate system of the second user terminal, a second reference coordinate corresponding to the position of the reference object; a step of obtaining sharing coordinates by converting the coordinates of an object to be shared based on the first reference coordinate, in the local coordinate system of the first user terminal; and a step in which the second user terminal places the object to be shared at coordinates obtained by applying the second reference coordinate to the sharing coordinates transmitted from the first user terminal.
PCT/KR2019/013848 2018-11-15 2019-10-22 Method and system for sharing augmented reality content WO2020101197A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0140587 2018-11-15
KR20180140587 2018-11-15

Publications (1)

Publication Number Publication Date
WO2020101197A1 true WO2020101197A1 (fr) 2020-05-22

Family

ID=70730849

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/013848 WO2020101197A1 (fr) 2018-11-15 2019-10-22 Method and system for sharing augmented reality content

Country Status (1)

Country Link
WO (1) WO2020101197A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022045819A1 (fr) * 2020-08-27 2022-03-03 Samsung Electronics Co., Ltd. Dispositif électronique permettant de fournir un contenu de réalité augmentée et son procédé de fonctionnement

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101292463B1 * 2011-01-27 2013-07-31 주식회사 팬택 Augmented reality system and method for remotely sharing an augmented reality service
KR101720132B1 * 2016-03-17 2017-03-27 주식회사 엔토소프트 Method and system for sharing the same augmented reality image among multiple users based on positioning information
KR20170036704A * 2014-07-25 2017-04-03 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Multi-user gaze projection using head-mounted display devices
WO2017054421A1 * 2015-09-30 2017-04-06 深圳多新哆技术有限责任公司 Method and device for adjusting virtual reality image
US20180286122A1 (en) * 2017-01-30 2018-10-04 Colopl, Inc. Information processing method and apparatus, and program for executing the information processing method on computer

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022045819A1 (fr) * 2020-08-27 2022-03-03 Samsung Electronics Co., Ltd. Dispositif électronique permettant de fournir un contenu de réalité augmentée et son procédé de fonctionnement
US11475587B2 (en) 2020-08-27 2022-10-18 Samsung Electronics Co., Ltd. Electronic device for providing augmented-reality content and operation method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19884658

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19884658

Country of ref document: EP

Kind code of ref document: A1