WO2024005473A1 - Real-time virtual reality (VR) conference system and method capable of sharing a specific location indication on a 3D object in a virtual space - Google Patents

Real-time virtual reality (VR) conference system and method capable of sharing a specific location indication on a 3D object in a virtual space

Info

Publication number
WO2024005473A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
conference
user
specific location
meeting
Prior art date
Application number
PCT/KR2023/008831
Other languages
English (en)
Korean (ko)
Inventor
이종오
박인규
김종현
변상우
김성민
Original Assignee
주식회사 와이엠엑스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 와이엠엑스 filed Critical 주식회사 와이엠엑스
Publication of WO2024005473A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/157Conference systems defining a virtual conference space and using avatars or agents
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g. 3D video
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems

Definitions

  • The present invention relates to a non-face-to-face conference system and method, and more particularly to a conference system and method based on virtual reality (VR).
  • Videoconferencing involves the reception and transmission of audio-visual signals by users at different locations for real-time communication. Such video conferences are available on a variety of video conferencing services, including the ZOOM service, through multiple computing devices.
  • Recently, online conference systems using virtual reality have emerged, and attempts are being made to solve the lack of realism and the communication difficulties of existing video conferences.
  • Korean Patent No. 10-2292606 proposes a conference service provision system using virtual reality.
  • With such systems, the offline meeting experience can be reproduced online.
  • Accordingly, one object of the present invention is to provide a VR conference system and method that allow multiple users to accurately share a specific location indication on a 3D object in a virtual space in real time.
  • Another object of the present invention is to provide a VR conference system and method that facilitate communication between users in a non-face-to-face meeting by enabling multiple users to accurately share a specific location indication on a 3D object in a virtual space in real time.
  • A further object of the present invention is to provide a VR conference system and method that, through this real-time sharing, improve the convenience and efficiency of the VR conference system.
  • To achieve the above objects, a real-time VR conference system of the present invention includes: a server that includes a virtual conference platform and provides a virtual conference room for a virtual VR conference through the virtual conference platform;
  • a virtual conference operation terminal that transmits and receives data to and from the server through the virtual conference platform and transmits conference information for the virtual VR conference to the server;
  • a user terminal connected to the server through the virtual conference platform and capable of accessing a virtual conference room for the virtual VR conference.
  • the meeting information for the virtual VR meeting may include at least one of participant information for the virtual VR meeting, meeting room location and size, and a 3D object for the meeting.
  • the server may create the virtual meeting room based on meeting information for the virtual VR meeting and reflect the 3D object for the meeting in the virtual meeting room.
  • The server may represent a conference management account connected to the virtual conference room through the virtual conference operation terminal as a conference operation avatar, and may represent a user account connected to the virtual conference room through the user terminal as a user avatar.
  • When a specific location on the conference 3D object is pointed to, the server may transmit and share that specific location with all user terminals to which a user account is logged in.
  • When a specific location on the conference 3D object is touched on one user terminal, that user terminal measures and stores the three-dimensional spatial coordinates (x, y, z) corresponding to the specific location while transmitting them to the server; the server transmits the received spatial 3D coordinates (x, y, z) to the user terminals of the other users who are viewing the conference 3D object; and each user terminal that receives the spatial 3D coordinates (x, y, z) may output and display them relative to the anchor of the conference 3D object.
  • The measurement by a user terminal of the spatial 3D coordinates (x, y, z) corresponding to a specific location on the conference 3D object may be performed using the raycasting technique.
  • the display method may use a laser point display.
  • The step of performing the predefined interaction of the user interaction element may include: when a specific location on the conference 3D object is touched on a user terminal, measuring and storing, by that user terminal, the three-dimensional spatial coordinates (x, y, z) corresponding to the specific location and transmitting them to the server; transmitting, by the server, the received spatial 3D coordinates (x, y, z) to the user terminals of the other users who are viewing the conference 3D object; and outputting and displaying the coordinates, by each user terminal that receives the spatial 3D coordinates (x, y, z), relative to the anchor of the conference 3D object.
  • The measurement by a user terminal of the spatial 3D coordinates (x, y, z) corresponding to a specific location on the conference 3D object may be performed using the raycasting technique.
  • the display method may use a laser point display.
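The claimed flow (measure on touch, store, transmit to the server, relay to the other viewers, display at the anchor) can be sketched as a minimal in-memory model. All class and method names here (`ConferenceServer`, `UserTerminal`, `share_point`) are illustrative assumptions, not part of the disclosure:

```python
# Minimal sketch of the claimed flow: a terminal measures a 3D point on the
# conference object, sends it to the server, and the server relays it to every
# other terminal viewing the same object. Names are illustrative, not claimed.

class ConferenceServer:
    def __init__(self):
        self.terminals = []                 # terminals viewing the 3D object

    def register(self, terminal):
        self.terminals.append(terminal)

    def share_point(self, sender, point):
        # Relay the spatial coordinates (x, y, z) to every other viewer.
        for terminal in self.terminals:
            if terminal is not sender:
                terminal.display_point(point)

class UserTerminal:
    def __init__(self, name, server):
        self.name = name
        self.server = server
        self.displayed = []                 # points shown at the object anchor
        server.register(self)

    def touch(self, point):
        # Measure and store the coordinates of the touched location,
        # then transmit them to the server.
        self.stored_point = point
        self.server.share_point(self, point)

    def display_point(self, point):
        # Output the received point relative to the 3D object's anchor.
        self.displayed.append(point)

server = ConferenceServer()
user1 = UserTerminal("User 1", server)
user2 = UserTerminal("User 2", server)
user3 = UserTerminal("User 3", server)

user1.touch((0.12, 0.80, -0.35))
print(user2.displayed)  # [(0.12, 0.8, -0.35)]
```

In a real deployment the relay would of course go over the network (e.g. WebSocket or RTC data channels), but the ordering of steps is the same.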
  • According to the present invention, a plurality of users can accurately share a specific location indication on a 3D object in a virtual space in real time.
  • With the real-time VR conference system and method of the present invention for sharing a specific location indication on a 3D object in a virtual space, multiple users can accurately share such an indication in real time, which facilitates communication between users in non-face-to-face meetings.
  • In addition, by enabling multiple users to accurately share a specific location indication on a 3D object in a virtual space in real time, the real-time VR conference system and method of the present invention improve the convenience and efficiency of the VR conference system.
  • Figure 1 is an example configuration of a real-time VR conference system capable of sharing a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • Figure 2 is a diagram for explaining the operation of a real-time VR conference system that can share a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • Figure 3 is a diagram to explain the principles of the raycasting technique used in the present invention to measure the specific position of a 3D object.
  • Figure 4 is a diagram illustrating a method of sharing a specific location indication of a 3D object in a real-time VR conference system capable of sharing a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • Figure 5 is an operation flowchart showing an operation of sharing a specific location indication of a 3D object in a real-time VR conference system capable of sharing a specific location indication of a 3D object in a virtual space according to an embodiment of the present invention.
  • Terms such as "first" and "second" may be used to describe various components, but the components are not limited by these terms; the terms serve only to distinguish one component from another. For example, a first component may be referred to as a second component, and similarly, a second component may be referred to as a first component, without departing from the scope of the present invention.
  • Figure 1 is an example configuration of a real-time VR conference system capable of sharing a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • the server 110, the virtual meeting operation terminal 120, and the user terminals 131-133 are configured to exchange data through wired and wireless communication.
  • the VR conference system of the present invention can provide a VR conference through VR (Virtual reality) technology and can configure a virtual space for a 360° VR virtual conference.
  • The server 110 may be an in-house server or a cloud server operated by the provider of the service using the server 110, and may be configured to perform all or some of the calculation, storage/reference, input/output, and control functions of a typical computer.
  • the server 110 is equipped with a virtual meeting platform 140 and can provide a virtual space for a virtual VR meeting through the virtual meeting platform 140. That is, the server 110 can configure a virtual space including a 360° VR virtual meeting room by incorporating VR technology through the virtual meeting platform 140.
  • The virtual space may be constructed by combining 3D modeling and VR technology.
  • the server 110 may create a virtual 3D conference room and a virtual 3D object that is the subject of meeting discussion.
  • the virtual 3D conference room may implement a specific concept unique to each conference room.
  • For example, conference room A can be rendered with general conference-room graphics, conference room B can be rendered with laboratory graphics, and conference room C can be rendered with manufacturing-plant graphics.
  • the virtual conference operation terminal 120 and the user terminals 131-133 may exchange data with the server 110 through the virtual conference platform 140.
  • The virtual meeting operation terminal 120 may be a desktop computer, laptop, tablet, smartphone, VR device, etc. In Figure 1 the virtual meeting operation terminal 120 is shown as a desktop, but depending on the embodiment it may be a laptop or tablet. Additionally, the virtual meeting operation terminal 120 may be configured to perform all or some of the calculation, storage/reference, input/output, and control functions of a typical computer.
  • The meeting management accounts 151-153 can access the virtual meeting platform 140 through the virtual meeting operation terminal 120, and may be represented as meeting management avatars on the virtual meeting platform 140.
  • The meeting management avatar may interact with the user avatars linked to the user accounts 161-163. When an interaction occurs, a predefined action may be performed; for example, a chat may open between the meeting management account linked to the meeting management avatar and the user account linked to the user avatar.
  • the user terminals 131-133 may be desktop computers, laptops, tablets, smartphones, VR devices, etc.
  • For example, in Figure 1, the first user terminal 131 may be a smartphone, the second user terminal 132 may be a desktop, and the third user terminal 133 may be a laptop.
  • the user terminals 131-133 may be configured to perform all or part of the calculation function, storage/reference function, input/output function, and control function of a typical computer.
  • user accounts 161-163 may access the virtual conference platform 140 through user terminals 131-133. That is, each user terminal 131-133 may be connected to the server 110 through the virtual conference platform 140.
  • User accounts 161-163 may be represented by user avatars in the virtual conference platform 140, and the user avatars may interact with the conference management avatars described above.
  • a predefined operation may be performed. For example, when an interaction occurs, a chat may occur between a meeting management account linked to a meeting management avatar and a user account linked to the user avatar.
  • One virtual conference operation terminal 120 and three user terminals 131-133 are shown in Figure 1, but the number of each type of terminal may vary; as long as the processing capacity of the server 110 allows, the number of terminals is not particularly limited.
  • The virtual conference operation terminal 120 and the user terminals 131-133 are shown as separate devices in Figure 1, but depending on the implementation, a virtual conference operation terminal and a user terminal may also be integrated into a single terminal.
  • Figure 2 is a diagram for explaining the operation of a real-time VR conference system that can share a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • the server 110 may obtain meeting information from the virtual meeting operation terminal 120 (S201).
  • the meeting information may include information on attendees who wish to participate in the VR meeting, conference room size and location, etc.
  • A meeting management account 151-153 may be logged in to the virtual meeting operation terminal 120, and the server 110 may transmit a visual configuration including a meeting information input UI (User Interface) to the virtual meeting operation terminal 120 connected to the virtual meeting platform 140. The server 110 may then receive the meeting information entered by the meeting management account 151-153 through the meeting information input UI.
  • the server 110 may create a virtual conference room based on the conference information (S202).
  • the server 110 may determine the location and size of the virtual conference room based on the conference information entered by the conference management accounts 151-153. Additionally, the server 110 may create the basic appearance of the virtual conference room using a predefined template according to the characteristics and requirements of the virtual conference room.
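Steps S202 through S204 can be illustrated with a small template-driven sketch. The template table and field names (`concept`, `objects_3d`) are hypothetical, chosen only to mirror the conference-room examples described in this document:

```python
# Sketch of creating a virtual conference room from meeting information using
# a predefined template, then attaching the conference 3D objects to the room.
# Template keys and meeting-info fields are assumed for illustration.

ROOM_TEMPLATES = {
    "general": {"graphics": "general conference room"},
    "lab":     {"graphics": "laboratory"},
    "factory": {"graphics": "manufacturing plant"},
}

def create_conference_room(meeting_info):
    # Pick the basic appearance from a predefined template (S202).
    template = ROOM_TEMPLATES[meeting_info.get("concept", "general")]
    return {
        "location": meeting_info["location"],
        "size":     meeting_info["size"],
        "graphics": template["graphics"],
        # 3D objects to be discussed are reflected into the room (S203/S204).
        "objects":  list(meeting_info.get("objects_3d", [])),
    }

room = create_conference_room({
    "location": "hall-1",
    "size": (10, 6, 3),
    "concept": "lab",
    "objects_3d": ["vehicle.glb"],
})
print(room["graphics"])  # laboratory
```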
  • the server 110 may obtain at least one 3D object for a meeting from the virtual meeting operation terminal 120 (S203).
  • the 3D object for the meeting may be a 3D modeling resource (3D Model resource) to be discussed by users in the corresponding VR meeting.
  • For example, if the meeting concerns a vehicle, the conference 3D object may be a 3D modeling resource for the vehicle; if the meeting concerns a mobile phone, the conference 3D object may be a 3D modeling resource for the mobile phone.
  • the server 110 may reflect at least one conference 3D object in the virtual conference room (S204).
  • the server 110 may perform predefined rendering on the 3D object according to the characteristics and requirements of the virtual meeting room.
  • The predefined rendering can include colors, textures, lighting tones, etc. matched to the virtual meeting room, so that 3D objects included in the virtual meeting room do not look out of place.
  • the server 110 may represent each user account 161 - 163 connected to the virtual conference room through each user terminal 131 - 133 as each user avatar (S205).
  • the user accounts 161-163 can log in from the user terminals 131-133 and access the virtual conference room through the virtual conference platform 140, and the server 110 can access the virtual conference room using user accounts ( 161-163) can be rendered as a user avatar.
  • the user avatar may be a 3D humanoid avatar capable of performing actions such as moving, looking around, talking, and making selections.
  • the appearance of the user avatar may be modeled or characterized based on real-life images, or may be customizable.
  • The server 110 may perform the predefined interaction of the corresponding user interaction element when a user avatar in the VR meeting performs an action that matches the predefined action of at least one user interaction element (S206).
  • the predefined actions of the user interaction element may include various actions. For example, when clicking on another user avatar, chatting with the user account of that user avatar may be possible.
  • the predefined actions of these user interaction elements are linked to 3D objects for meetings.
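The action matching described above can be sketched as a simple lookup from a predefined action to its interaction handler. The action names and handlers below are hypothetical, not from the disclosure:

```python
# Sketch of matching a user avatar's action against the predefined actions of
# user interaction elements and performing the linked interaction (step S206).
# Action names and handler payloads are illustrative assumptions.

interaction_elements = [
    {"action": "click_avatar",
     "interaction": lambda actor, target: f"chat:{actor}->{target}"},
    {"action": "point_object",
     "interaction": lambda actor, target: f"share_point:{actor}@{target}"},
]

def handle_action(action, actor, target):
    for element in interaction_elements:
        if element["action"] == action:     # matches a predefined action
            return element["interaction"](actor, target)
    return None                             # no predefined interaction to run

print(handle_action("click_avatar", "User1", "User2"))  # chat:User1->User2
```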
  • For example, when a specific location on the conference 3D object is pointed to, the server 110 transmits that specific location to all user terminals 131-133 to which the user accounts 161-163 are logged in, and shares it with them.
  • The operation in which the server 110 of the present invention transmits and shares a specific location on a pointed 3D object with all user terminals 131-133 to which user accounts 161-163 are logged in is described below with reference to Figures 3 to 5.
  • Figure 3 is a diagram to explain the principles of the raycasting technique used in the present invention to measure the specific position of a 3D object.
  • In the present invention, the raycasting technique is used to obtain coordinates in the virtual three-dimensional space.
  • Raycasting is a technique that projects an invisible ray of light into the virtual space and identifies the surface that the ray touches.
  • the coordinates are three-dimensional coordinates in space and can be expressed as (x,y,z).
  • a predetermined effect can be created and displayed at the coordinates of the location indicated by the user on the surface of the 3D object.
  • a predetermined effect may be a laser point display.
  • The present invention can also indicate which user generated the event by connecting, with a line, the user avatar that generated the event and the point of occurrence at the corresponding coordinates in the virtual space.
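The patent does not disclose a particular raycasting algorithm. As one common concrete instance, the Möller-Trumbore ray/triangle intersection test below returns the spatial coordinates (x, y, z) where a ray cast from the camera hits one triangle of the object's mesh, or None on a miss:

```python
# Möller-Trumbore ray/triangle intersection: one standard way to realize the
# raycasting step. A full implementation would test the ray against every
# triangle of the 3D object's mesh and keep the nearest hit.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def raycast(origin, direction, triangle, eps=1e-9):
    """Return the (x, y, z) hit point of the ray on the triangle, or None."""
    v0, v1, v2 = triangle
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:                 # ray is parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:           # outside the triangle (barycentric u)
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:       # outside the triangle (barycentric v)
        return None
    t = f * dot(e2, q)
    if t <= eps:                     # intersection behind the ray origin
        return None
    return tuple(origin[i] + t * direction[i] for i in range(3))

# Ray from a "camera" straight down the -z axis onto a triangle in the z=0 plane.
hit = raycast((0.2, 0.2, 1.0), (0.0, 0.0, -1.0),
              ((0, 0, 0), (1, 0, 0), (0, 1, 0)))
print(hit)  # (0.2, 0.2, 0.0)
```

Game engines expose this as a built-in (e.g. a screen-point-to-ray call followed by a physics raycast), so an implementation would typically not hand-roll the math.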
  • FIG. 4 is a diagram illustrating a method of sharing a specific location indication of a 3D object in a real-time VR conference system capable of sharing a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • Figure 5 is an operation flowchart showing an operation of sharing a specific location indication of a 3D object in a real-time VR conference system capable of sharing a specific location indication on a 3D object in a virtual space according to an embodiment of the present invention.
  • User 1 may be a user located in Korea, as shown in Figure 4.
  • The method of measuring the coordinates (x, y, z) corresponding to a specific position on the 3D object applies the raycasting technique of Figure 3 as described above.
  • That is, a ray is cast from the touch point on the screen touched by the user in the forward direction in which the camera is looking, and the collision point on the 3D object is taken as the indicated location.
  • the coordinates are three-dimensional coordinates in space and can be expressed as (x,y,z).
  • User 1's user terminal transmits the stored spatial 3D coordinates (x, y, z) to the server 110, and the server 110 receives and stores them (S520).
  • the server 110 transmits the received spatial 3D coordinates (x, y, z) to other users (User 2 and User 3) who are sharing the 3D object (S530).
  • User 2 and User 3 may be remote users located in LA, USA, as shown in Figure 4.
  • the user terminals of other users (User 2 and User 3) that have received the 3D coordinates (x, y, z) in the corresponding space output and display them based on the anchor of the 3D object (S540).
  • a laser point display can be used as an example of a display method.
  • the present invention can provide information on which user generated the event by connecting the user avatar that generated the event to the coordinates and the point of occurrence with a line.
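Outputting the received coordinates "based on the anchor of the 3D object" (S540) can be sketched as a conversion between world space and anchor-relative space. For brevity the sketch handles translation only; a real engine would apply the anchor's full transform, including rotation and scale:

```python
# Sketch of anchor-relative display: the sender converts the measured
# world-space point into coordinates relative to the object's anchor, and each
# receiver adds its own local anchor position back, so the laser point lands on
# the same spot of the object even if the object sits elsewhere in each scene.

def to_anchor_space(point, anchor):
    return tuple(p - a for p, a in zip(point, anchor))

def from_anchor_space(point, anchor):
    return tuple(p + a for p, a in zip(point, anchor))

sender_anchor   = (5.0, 0.0, 2.0)    # object anchor in User 1's scene
receiver_anchor = (-3.0, 1.0, 0.0)   # same object's anchor in User 2's scene

touched_world = (5.5, 1.0, 2.25)                          # point User 1 touched
relative = to_anchor_space(touched_world, sender_anchor)  # sent to the server
shown = from_anchor_space(relative, receiver_anchor)      # drawn at User 2
print(relative, shown)  # (0.5, 1.0, 0.25) (-2.5, 2.0, 0.25)
```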
  • Because the spatial 3D coordinate values (x, y, z) map to exact positions on the 3D object, the indication can be displayed accurately and without error. This allows multiple users of a VR meeting, even far apart from one another, to accurately share the specific location indication on a 3D object in the virtual space that one user is referring to.
  • The real-time VR conference system and method of the present invention for sharing a specific position indication on a 3D object in a virtual space, configured as described above, allow multiple users to accurately share a specific position indication on a 3D object in a virtual space, thereby facilitating communication between users in non-face-to-face meetings. In addition, by sharing specific location indications on 3D objects in a virtual space accurately and in real time among multiple users, the convenience and efficiency of the VR conference system can be improved.
  • the embodiments described above may be implemented with hardware components, software components, and/or a combination of hardware components and software components.
  • The devices, methods, and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • A processing device may run an operating system (OS) and one or more software applications that run on the operating system. Additionally, a processing device may access, store, manipulate, process, and generate data in response to the execution of software.
  • For convenience of description, a single processing device is sometimes described as being used; however, those skilled in the art will understand that a processing device may include multiple processing elements and/or multiple types of processing elements.
  • a processing device may include a plurality of processors or one processor and one controller. Additionally, other processing configurations, such as parallel processors, are possible.
  • the method according to the embodiment may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium.
  • Computer-readable media may include program instructions, data files, data structures, etc., singly or in combination. Program instructions recorded on the medium may be specially designed and configured for the embodiment, or may be known and available to those skilled in the art of computer software. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include machine language code, such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter, etc.
  • a hardware device may be configured to operate as one or more software modules to perform the operations of an embodiment, and vice versa.
  • Software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired or may command a processing device independently or collectively.
  • Software and/or data may be embodied, permanently or temporarily, in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by a processing device or to provide instructions or data to a processing device.
  • Software may be distributed over networked computer systems and stored or executed in a distributed manner.
  • Software and data may be stored on one or more computer-readable recording media.
  • the present invention relates to a conference system and method based on virtual reality (VR), and can be used in the field of virtual reality.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Information Transfer Between Computers (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The present invention relates to a real-time virtual reality (VR) conference system capable of sharing a specific location indication on a 3D object in a virtual space, which system comprises: a server that includes a virtual conference platform and provides a virtual conference room for a virtual VR conference through the virtual conference platform; a virtual conference operation terminal that transmits/receives data to/from the server through the virtual conference platform and transmits conference information for the virtual VR conference to the server; and a user terminal that is connected to the server through the virtual conference platform and is capable of accessing the virtual conference room for the virtual VR conference.
PCT/KR2023/008831 2022-06-30 2023-06-26 Real-time virtual reality (VR) conference system and method capable of sharing a specific location indication on a 3D object in a virtual space WO2024005473A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0080912 2022-06-30
KR1020220080912A KR20240002829A (ko) 2022-06-30 2022-06-30 Real-time VR conference system and method capable of sharing a specific location indication on a 3D object in a virtual space

Publications (1)

Publication Number Publication Date
WO2024005473A1 true WO2024005473A1 (fr) 2024-01-04

Family

ID=89380853

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/008831 WO2024005473A1 (fr) 2022-06-30 2023-06-26 Real-time virtual reality (VR) conference system and method capable of sharing a specific location indication on a 3D object in a virtual space

Country Status (2)

Country Link
KR (1) KR20240002829A (fr)
WO (1) WO2024005473A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120026899A (ko) * 2010-09-10 2012-03-20 엘지전자 주식회사 가상 메모기능을 갖는 이동 단말기 및 그 가상 메모 관리방법
KR20190028238A (ko) * 2017-09-08 2019-03-18 삼성전자주식회사 가상 현실에서의 포인터 제어 방법 및 전자 장치
KR20200017373A (ko) * 2019-10-16 2020-02-18 김보언 유니크베뉴를 포함하는 가상현실 이벤트 제공방법, 장치 및 프로그램
KR102259350B1 (ko) * 2019-12-24 2021-06-01 주식회사 브이알미디어 Ar 기반 원격 지원 시스템 및 그 동작 방법
KR20220029451A (ko) * 2020-08-28 2022-03-08 티엠알더블유 파운데이션 아이피 에스에이알엘 가상 환경에서 상호작용을 가능하게 하는 시스템 및 방법

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102292606B1 (ko) 2021-03-30 2021-08-24 주식회사 에이드소프트 Mice 산업 기반 가상현실을 이용한 컨퍼런스 서비스 제공 시스템


Also Published As

Publication number Publication date
KR20240002829A (ko) 2024-01-08

Similar Documents

Publication Publication Date Title
EP3769509B1 Multi-endpoint mixed reality meetings
US20180376104A1 (en) Method and apparatus for sharing information during video call
US20120115603A1 (en) Single user multiple presence in multi-user game
KR20120118019A Web browser interface for a spatial communication environment
WO2016173357A1 Method and device for implementing a multimedia conference
WO2023128308A1 Method for controlling user image data in a metaverse-based office environment, recording medium on which a program executing same is recorded, and user image data control system comprising same
WO2023128305A1 Method for matching user image data in a metaverse-based office environment, recording medium on which a program for executing same is recorded, and user image data matching system comprising a recording medium
US20180123816A1 (en) Collaboration environments and views
WO2015008932A1 Digilog space creator for remote teamwork in augmented reality and digilog space creation method using same
KR100611255 Remote conference method sharing a work space
CN110111241A Method and apparatus for generating a dynamic image
CN116310062A Three-dimensional scene construction method and apparatus, storage medium, and electronic device
US11799925B2 (en) Communication system, communication terminal, and screen sharing method
WO2021187646A1 Method and system for conducting a conference using an avatar
WO2022048428A1 Method and apparatus for controlling a target object, electronic device, and storage medium
CN112423142B Image processing method and apparatus, electronic device, and computer-readable medium
WO2024005473A1 Real-time virtual reality (VR) conference system and method capable of sharing a specific location indication on a 3D object in a virtual space
US20230370686A1 (en) Information display method and apparatus, and device and medium
WO2024005472A1 Real-time VR conference system and method capable of sharing a memo linked to a specific location on a 3D object in a virtual space
WO2023128309A1 Method for controlling a display device in a metaverse-based office environment, recording medium on which a program for executing same is recorded, and display device control system comprising same
WO2021187647A1 Method and system for expressing an avatar imitating a user's movement in a virtual space
CN111696214A House display method and apparatus, and electronic device
US20230061662A1 (en) Method, apparatus, medium and electronic device for generating round-table video conference
JP7229628B1 Program, information processing device, information processing system, and information processing method
WO2024101590A1 Electronic device for acquiring the location of a virtual object, and method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23831833

Country of ref document: EP

Kind code of ref document: A1