KR20200072584A - System for remote collaboration and the method thereof - Google Patents

System for remote collaboration and the method thereof Download PDF

Info

Publication number
KR20200072584A
Authority
KR
South Korea
Prior art keywords
map
deep learning
remote
information
image
Prior art date
Application number
KR1020180151814A
Other languages
Korean (ko)
Inventor
노진송
Original Assignee
(주)익스트리플
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)익스트리플 filed Critical (주)익스트리플
Priority to KR1020180151814A priority Critical patent/KR20200072584A/en
Publication of KR20200072584A publication Critical patent/KR20200072584A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/103Workflow collaboration or project management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Operations Research (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Disclosed are a remote collaboration system capable of quickly and accurately reflecting a changing surrounding environment, and a method thereof. The remote collaboration system comprises: an image input unit that receives surrounding images from a smart device; a deep learning processing unit that performs deep learning on the received surrounding images; an object recognition processing unit that recognizes objects in the received surrounding images based on the deep-learned image information; a map management unit that updates a pre-stored 3D map with the deep-learned surrounding image information; and a collaboration control unit that controls sharing of the 3D map and object information between a remote worker and an expert, based on the updated 3D map information, so that images are shared between the worker and the expert.

Description

Remote Collaboration System and Method Thereof {SYSTEM FOR REMOTE COLLABORATION AND THE METHOD THEREOF}

The present invention relates to a remote collaboration system and a method thereof, and more particularly, to a system and method for supporting remote collaboration so that a worker at a remote site and an expert located far from the worker can perform work together.

Various attempts have been made to realize communication and collaboration between a worker at a remote site and an expert located far away from that worker.

Such remote collaboration is broadly divided into video-based and space-based approaches, and depending on the implementation it can be classified as video-based augmented reality remote collaboration, video-based mixed reality remote collaboration, virtual reality space-based remote collaboration, mixed reality space-based remote collaboration, or virtual-reality-to-mixed-reality space-based remote collaboration.

However, prior-art approaches, such as the wearable-display-based remote collaboration apparatus and method of Korean Patent Publication No. 10-2014-0108428, are often limited to a specific space (an indoor or preset space), making remote collaboration impossible in relatively large indoor areas or in harsh environments such as industrial sites.

Therefore, there is a need for research on a remote collaboration system and method that can operate effectively even in harsh field environments, without spatial constraints.

Korean Patent Publication No. 10-2014-0108428

The present invention provides a remote collaboration system and method that can reflect changing surrounding conditions quickly and accurately by acquiring surrounding images through a smart device (wearable glasses, smartphone, tablet, etc.) and rapidly updating surrounding-environment information through deep learning.

The present invention also provides a remote collaboration system and method that, by supporting remote collaboration in connection with 3D spatial map generation technology, can support more accurate remote collaboration over a wide area.

The remote collaboration system according to an embodiment of the present invention includes: an image input unit that receives surrounding images from a smart device; a deep learning processing unit that performs deep learning on the received surrounding images; an object recognition processing unit that recognizes objects in the received surrounding images based on the deep-learned image information; a map management unit that updates a pre-stored 3D map with the deep-learned surrounding image information; and a collaboration control unit that controls sharing of the 3D map and object information between a remote worker and an expert, based on the updated 3D map information, so that images are shared between the worker and the expert.
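The flow through the five units described above can be sketched as a simple pipeline. The following is an illustrative Python sketch only; the class and method names, and the stubbed deep-learning and recognition results, are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class RemoteCollaborationSystem:
    # Pre-stored 3D map: region name -> scene information
    map_3d: dict = field(default_factory=dict)

    def receive_image(self, frame):
        """Image input unit: accept a surrounding image from a smart device."""
        return frame

    def run_deep_learning(self, frame):
        """Deep learning processing unit: extract features from the image.
        Stubbed here; a real system would run a trained network."""
        return {"features": frame.get("pixels"), "region": frame["region"]}

    def recognize_objects(self, learned):
        """Object recognition processing unit: label objects from features."""
        return [{"label": "valve", "region": learned["region"]}]  # stub result

    def update_map(self, learned, objects):
        """Map management unit: merge new scene info into the stored 3D map."""
        self.map_3d[learned["region"]] = {"objects": objects}

    def share(self, worker, expert):
        """Collaboration control unit: give both parties the same map view."""
        return {who: self.map_3d for who in (worker, expert)}

system = RemoteCollaborationSystem()
frame = {"region": "plant-floor-A", "pixels": [0, 1, 2]}
learned = system.run_deep_learning(system.receive_image(frame))
objects = system.recognize_objects(learned)
system.update_map(learned, objects)
shared = system.share("worker", "expert")
```

Because the worker's and the expert's views reference the same updated map object, any change made by one side is immediately visible to the other, which mirrors the shared-map behavior the embodiment describes.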

According to one aspect of the present invention, the collaboration control unit may share the 3D spatial map and the object information within it, and may fuse data from the camera and IMU sensor of the smart device, so that the direction the worker is facing and the worker's gaze are shared.
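One way the IMU contribution to this fusion can work is to rotate the device's forward axis by the IMU orientation quaternion, yielding a world-space direction that can be streamed to the expert. The axis convention and function names below are assumptions for illustration, not details from the patent.

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z),
    using v' = v + w*t + q_vec x t with t = 2*(q_vec x v)."""
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def gaze_direction(imu_quat, forward=(0.0, 0.0, -1.0)):
    """World-space direction the worker's device is facing,
    assuming -Z is the device's forward axis."""
    return quat_rotate(imu_quat, forward)

# Identity orientation: the worker looks straight down the -Z axis.
print(gaze_direction((1.0, 0.0, 0.0, 0.0)))  # → (0.0, 0.0, -1.0)

# A 90-degree yaw about +Y swings the gaze from -Z toward -X.
half = math.sqrt(0.5)
yawed = gaze_direction((half, 0.0, half, 0.0))
```

In a full implementation this IMU-derived direction would be refined with camera-based tracking against the 3D map; here only the orientation step is shown.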

The remote collaboration method according to an embodiment of the present invention includes: receiving surrounding images from a smart device; performing deep learning on the received surrounding images; recognizing objects in the received surrounding images based on the deep-learned image information; updating a pre-stored 3D map with the deep-learned surrounding image information; and controlling sharing of the 3D map and object information between a remote worker and an expert, based on the updated 3D map information, so that images are shared between the worker and the expert.

According to the present invention, there are provided a remote collaboration system and method that can reflect changing surrounding conditions quickly and accurately by acquiring surrounding images through a smart device (wearable glasses, smartphone, tablet, etc.) and rapidly updating surrounding-environment information through deep learning.

According to the present invention, there are also provided a remote collaboration system and method that, by supporting remote collaboration in connection with 3D spatial map generation technology, can support more accurate remote collaboration over a wide area.

FIG. 1 is a block diagram showing a remote collaboration system according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram for explaining a process of performing remote collaboration according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a process of remote collaboration on a shared virtual object through a network according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. However, the present invention is not restricted or limited by these embodiments. The same reference numerals in each drawing denote the same elements.

FIG. 1 is a block diagram showing a remote collaboration system according to an embodiment of the present invention, FIG. 2 is a conceptual diagram for explaining a process of performing remote collaboration according to an embodiment of the present invention, and FIG. 3 is a diagram illustrating a process of remote collaboration on a shared virtual object through a network according to an embodiment of the present invention.

Referring to FIG. 1, the remote collaboration system 100 may include: an image input unit 110 that receives surrounding images from a smart device; a deep learning processing unit 120 that performs deep learning on the received surrounding images; an object recognition processing unit 130 that recognizes objects in the received surrounding images based on the deep-learned image information; a map management unit 140 that updates a pre-stored 3D map with the deep-learned surrounding image information; and a collaboration control unit 150 that controls sharing of the 3D map and object information between a remote worker and an expert, based on the updated 3D map information, so that images are shared between the worker and the expert.

As shown in FIG. 2, when an on-site worker requests remote assistance from the remote collaboration system 100, an expert is matched, and the worker and the expert can be configured to collaborate remotely.

For example, a worker and an expert at remote sites can collaborate through video-based augmented reality remote collaboration using a smartphone, smart tablet, or smart glasses, or through video-based mixed reality remote collaboration using devices such as HoloLens.

In addition, virtual reality remote collaboration may be achieved through a sealed-type HMD, as in the example of FIG. 3.

In this way, a collaborative virtual and augmented reality system for conference-room remote collaboration and interaction can be implemented. By combining augmented reality (AR) and virtual reality (VR) technologies to reinforce the strengths of each platform, representing the remote user with a virtual head and hands, and then reconstructing the AR user's real environment and sharing it with the remote VR user, both users can be made to feel as if they are sharing the same space.
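Representing the remote user with a virtual head and hands implies exchanging pose state between the AR and VR clients. The following sketch shows one plausible way to serialize that avatar state as JSON for network transport; the message schema and field names are assumptions for illustration, not part of the patent.

```python
import json

def encode_avatar_state(user_id, head_pose, left_hand, right_hand):
    """Pack one user's head and hand poses into a JSON message.
    Each pose is (position xyz, orientation quaternion wxyz)."""
    def pose(p):
        position, orientation = p
        return {"pos": list(position), "quat": list(orientation)}
    return json.dumps({
        "user": user_id,
        "head": pose(head_pose),
        "hands": {"left": pose(left_hand), "right": pose(right_hand)},
    })

def decode_avatar_state(message):
    """Unpack a received message so the client can render the avatar."""
    return json.loads(message)

# Identity pose at the origin, for demonstration.
identity = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
msg = encode_avatar_state("expert-01", identity, identity, identity)
state = decode_avatar_state(msg)
```

Each client would broadcast such a message every frame and render the other party's head and hands from the decoded poses, giving both sides the impression of occupying the same space.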

In addition, the remote collaboration method according to an embodiment of the present invention may be recorded on a computer-readable medium including program instructions for performing various computer-implemented operations. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and constructed for the present invention, or may be known to and usable by those skilled in computer software. Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include not only machine code produced by a compiler but also high-level language code that can be executed by a computer using an interpreter.

Although the present invention has been described above with reference to a limited number of embodiments and drawings, the present invention is not limited to the embodiments described, and those of ordinary skill in the art to which the present invention pertains can make various modifications and variations from this description. Therefore, the scope of the present invention should be determined only by the claims set forth below, and all equivalent or equivalently modified forms thereof fall within the scope of the spirit of the present invention.

100: remote collaboration system
110: image input unit
120: deep learning processing unit
130: object recognition processing unit
140: map management unit
150: collaboration control unit

Claims (3)

1. A remote collaboration system comprising:
an image input unit that receives surrounding images from a smart device;
a deep learning processing unit that performs deep learning on the received surrounding images;
an object recognition processing unit that recognizes objects in the received surrounding images based on the deep-learned image information;
a map management unit that updates a pre-stored 3D map with the deep-learned surrounding image information; and
a collaboration control unit that controls sharing of the 3D map and object information between a remote worker and an expert, based on the updated 3D map information, so that images are shared between the worker and the expert.

2. The remote collaboration system of claim 1, wherein the collaboration control unit shares the 3D spatial map and the object information within it, and fuses data from the camera and IMU sensor of the smart device, so that the direction the worker is facing and the worker's gaze are shared.

3. A remote collaboration method comprising:
receiving surrounding images from a smart device;
performing deep learning on the received surrounding images;
recognizing objects in the received surrounding images based on the deep-learned image information;
updating a pre-stored 3D map with the deep-learned surrounding image information; and
controlling sharing of the 3D map and object information between a remote worker and an expert, based on the updated 3D map information, so that images are shared between the worker and the expert.
KR1020180151814A 2018-11-30 2018-11-30 System for remote collaboration and the method thereof KR20200072584A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020180151814A KR20200072584A (en) 2018-11-30 2018-11-30 System for remote collaboration and the method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020180151814A KR20200072584A (en) 2018-11-30 2018-11-30 System for remote collaboration and the method thereof

Publications (1)

Publication Number Publication Date
KR20200072584A true KR20200072584A (en) 2020-06-23

Family

ID=71137996

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020180151814A KR20200072584A (en) 2018-11-30 2018-11-30 System for remote collaboration and the method thereof

Country Status (1)

Country Link
KR (1) KR20200072584A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140108428A (en) 2013-02-27 2014-09-11 한국전자통신연구원 Apparatus and method for remote collaboration based on wearable display

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140108428A (en) 2013-02-27 2014-09-11 한국전자통신연구원 Apparatus and method for remote collaboration based on wearable display

Similar Documents

Publication Publication Date Title
US10452133B2 (en) Interacting with an environment using a parent device and at least one companion device
CN107850779B (en) Virtual position anchor
US20180293798A1 (en) Context-Based Discovery of Applications
US9424239B2 (en) Managing shared state information produced by applications
EP3769509B1 (en) Multi-endpoint mixed-reality meetings
CN109923509B (en) Coordinated manipulation of objects in virtual reality
US11004256B2 (en) Collaboration of augmented reality content in stereoscopic view in virtualized environment
JP2020024752A (en) Information processing device, control method thereof, and program
EP2974509B1 (en) Personal information communicator
JP2016515239A (en) Method and apparatus for providing augmented reality using optical character recognition
US20140320404A1 (en) Image processing device, image processing method, and program
US20190130599A1 (en) Systems and methods for determining when to provide eye contact from an avatar to a user viewing a virtual environment
US20220044439A1 (en) Real-Time Pose Estimation for Unseen Objects
KR20160015972A (en) The Apparatus and Method for Wearable Device
US20220012923A1 (en) Techniques for enabling multiple mutually untrusted applications to concurrently generate augmented reality presentations
CN105511620A (en) Chinese three-dimensional input device, head-wearing device and Chinese three-dimensional input method
WO2022252688A1 (en) Augmented reality data presentation method and apparatus, electronic device, and storage medium
KR102103399B1 (en) System for offering virtual-augmented information using object recognition based on artificial intelligence and the method thereof
US20190130631A1 (en) Systems and methods for determining how to render a virtual object based on one or more conditions
US10591986B2 (en) Remote work supporting system, remote work supporting method, and program
KR20200072584A (en) System for remote collaboration and the method thereof
US20230196697A1 (en) Augmented reality (ar) visual display to save
WO2021182124A1 (en) Information processing device and information processing method
KR20210086860A (en) Inspection and management support system and method thereof using real-time special space scan and 3d space modeling
CN110908509B (en) Multi-augmented reality equipment cooperation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
E902 Notification of reason for refusal
E601 Decision to refuse application