WO2021075878A1 - Method for providing an augmented reality recording service, and user terminal - Google Patents

Method for providing an augmented reality recording service, and user terminal

Info

Publication number
WO2021075878A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
marker
feature
user
user terminal
Application number
PCT/KR2020/014096
Other languages
English (en)
Korean (ko)
Inventor
이재만
Original Assignee
주식회사 도넛
Application filed by 주식회사 도넛
Publication of WO2021075878A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 - Services
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 - Processing image signals
    • H04N 13/172 - Image signals comprising non-image signal components, e.g. headers or format information
    • H04N 13/183 - On-screen display [OSD] information, e.g. subtitles or menus
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/76 - Television signal recording
    • H04N 5/91 - Television signal processing therefor
    • H04N 5/92 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 5/9201 - Transformation of the television signal for recording involving the multiplexing of an additional signal and the video signal
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30204 - Marker

Definitions

  • The present invention relates to a method of providing an augmented reality recording service and to a user terminal.
  • Augmented reality (AR) refers to a field of technology that provides a new paradigm for human-computer interaction and communication.
  • Augmented reality is a form of virtual reality that combines the real world, as seen with the user's own eyes, with a virtual world carrying additional information into a single image.
  • Augmented reality, a concept that complements the real world with a virtual one, uses a virtual environment created with computer graphics, but differs from general virtual reality in that it is based on the real environment.
  • The computer graphics serve to provide additional information for the real environment; that is, by overlaying a 3D virtual image on the real image viewed by the user, the boundary between the real environment and the virtual screen becomes blurred.
  • A marker is created using a feature image captured at a location chosen by the user together with the geographic location information of the feature, and an augmented reality record is created by linking the content produced by the user to the generated marker and storing it.
  • Disclosed are an augmented reality recording service providing method and a user terminal that display markers on a map using the geographic location information so that augmented reality records created by the user or by other users can be viewed later, and that output the augmented reality records in response to user input.
  • A method for providing an augmented reality (AR) recording service, performed by a user terminal, is disclosed.
  • The method for providing an augmented reality recording service includes the steps of: generating a feature image by photographing a feature chosen by the user of the user terminal and acquiring current location information; extracting the feature selected by the user from the generated feature image; generating a marker by combining the extracted feature with the acquired location information; receiving personal content created by the user; creating an augmented reality record by linking the input personal content with the generated marker; and setting a viewing condition for the generated augmented reality record.
  • The generating of the marker may include: outputting the feature image; receiving a selection of the feature, input by dragging an edge or a region of the feature on the output feature image; and extracting the selected feature from the feature image.
  • The personal content includes at least one of text, photos, audio, and video containing content that the user wants to leave at the corresponding place; the user either creates and inputs the personal content using the user terminal or inputs previously stored personal content.
  • The viewing conditions include a viewing permission range, a viewing start and end date and time, and photographing direction information.
  • The viewing permission range may be set to public so that the generated augmented reality record is shared with everyone, to personal so that it is not shared with others at all, or to friends so that it is shared only with designated people.
  • The photographing direction information is calculated when the current location information is acquired while photographing the feature, and includes the azimuth, pitch, and roll of the user terminal at the time of photographing.
  • The setting of the viewing condition includes configuring augmented reality screen setting information by setting up an augmented reality screen based on the generated feature image.
  • The augmented reality screen setting information includes information on an icon overlaid on the feature image and information on a special effect played when the icon appears or when the linked personal content is output.
  • The augmented reality recording service providing method further includes the steps of: outputting a marker map indicating the locations where markers have been generated; activating the camera when the user moves to the location of a desired marker and inputs a scan command, and scanning whether a feature corresponding to the desired marker is being photographed; outputting an augmented reality screen according to the augmented reality screen setting information when a feature corresponding to the desired marker is detected; and outputting the personal content when an execution command is input through the output augmented reality screen.
  • The outputting of the marker map includes enlarging and displaying the marker map at a preset ratio around a selected location when the user selects one of the marker locations displayed on the marker map. On the marker map, each marker location is displayed as an icon whose color or shape differs according to the viewing permission range of that marker, i.e. personal, friends, or public.
  • The outputting of the marker map includes outputting a notification message together with a marker map indicating the location of the desired marker when the user comes within a preset distance of the location where the desired marker exists.
  • In the scanning step, when a feature corresponding to the desired marker is photographed in a direction that matches the photographing direction information of the desired marker, it is determined that the feature corresponding to the desired marker has been detected.
  • A user terminal providing an augmented reality (AR) recording service is also disclosed.
  • The user terminal includes a memory storing instructions and a processor that executes the instructions, wherein the instructions cause the processor to perform the augmented reality recording service providing method, which comprises: generating a feature image by photographing a feature chosen by the user of the user terminal and acquiring current location information; extracting the feature selected by the user from the generated feature image; creating a marker by combining the extracted feature with the acquired location information; receiving personal content created by the user; creating an augmented reality record by linking the input personal content with the generated marker; and setting viewing conditions for the generated augmented reality record.
  • The augmented reality recording service providing method and the user terminal generate a marker using a feature image captured at a place chosen by the user and the geographic location information of the feature, create an augmented reality record by linking the content created by the user to the generated marker and storing it, display markers on a map using the geographic location information, and output the augmented reality record in response to user input, so that the user or other users can later view the created augmented reality records.
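  • The data handled by the method and terminal summarized above can be pictured with the following minimal, non-authoritative sketch. All class and field names (Marker, ViewingCondition, ARRecord, and so on) are illustrative assumptions rather than identifiers from the disclosure; they simply mirror the structure described here: a marker combines an extracted feature with geographic location information, and an augmented reality record links personal content and a viewing condition to that marker.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional


@dataclass
class Direction:
    """Photographing direction of the terminal, in degrees (assumed representation)."""
    azimuth: float
    pitch: float
    roll: float


@dataclass
class Marker:
    """A marker combines the feature extracted from the photographed image
    with the geographic location where it was captured."""
    feature_image: bytes       # pixel data of the extracted feature region
    latitude: float
    longitude: float
    direction: Direction       # photographing direction at capture time


@dataclass
class ViewingCondition:
    """Viewing permission range, viewing period, and designated friends."""
    permission: str            # "public", "friends", or "personal"
    start: datetime
    end: datetime
    allowed_friends: List[str] = field(default_factory=list)


@dataclass
class ARRecord:
    """An augmented reality record links personal content to a marker."""
    marker: Marker
    personal_content: List[bytes]   # text, photo, audio, or video payloads
    viewing_condition: ViewingCondition
    icon: Optional[str] = None      # AR screen setting: icon overlaid on the feature
```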
  • FIG. 1 is a flow chart showing a method of providing an augmented reality (AR) recording service according to an embodiment of the present invention.
  • FIGS. 2 and 3 are diagrams for explaining the method of providing an augmented reality recording service according to the embodiment of the present invention of FIG. 1.
  • FIG. 4 is a flow chart showing a method of providing an augmented reality recording service according to another embodiment of the present invention.
  • FIGS. 5 and 6 are views for explaining a method of providing an augmented reality recording service according to another embodiment of the present invention of FIG. 4.
  • FIG. 7 is a diagram schematically illustrating the configuration of a user terminal that performs a method of providing an augmented reality recording service according to an embodiment of the present invention.
  • FIG. 1 is a flowchart illustrating a method of providing an augmented reality recording service according to an embodiment of the present invention, and FIGS. 2 and 3 are drawings for explaining the method of FIG. 1.
  • The method of providing an augmented reality recording service according to an embodiment of the present invention may be implemented in software, such as an application, and run on an electronic device such as a PC or a smartphone.
  • Hereinafter, an electronic device that performs the method of providing an augmented reality recording service according to an embodiment of the present invention is referred to collectively as a user terminal.
  • The user terminal may perform the method of providing an augmented reality recording service in cooperation with a server.
  • The application that performs the method of providing an augmented reality recording service on the user terminal may be provided by, installed from, and managed through the server; operations too heavy to be performed on the user terminal may be requested from the server, and various generated data may be uploaded to and stored on the server.
  • In step S110, the user terminal generates a feature image by photographing a feature chosen by the user, according to the user's manipulation.
  • The user terminal acquires the current geographic location information at the time of photographing the feature.
  • The user terminal may acquire the current geographic location information using a GPS module.
  • In step S120, the user terminal extracts the feature selected by the user from the generated feature image and creates a marker by combining the extracted feature with the acquired geographic location information.
  • For example, the user terminal may output an intro screen as shown in FIG. 2(a) when the user inputs an application launch command.
  • The intro screen of FIG. 2(a) may include a first touch button 10 for inputting a marker creation command and a second touch button 20 for inputting a command for checking generated markers.
  • The user terminal may then activate the camera and output a camera view screen as shown in FIG. 2(b).
  • As shown in FIG. 2(b), the camera view screen outputs the image fed from the activated camera and may include a third touch button 30 for inputting a marker generation command.
  • The user terminal captures and outputs a feature image including the feature 40.
  • The user terminal receives the user's selection of the desired feature 40 by way of a drag along the edge or over the area of the feature 40 in the output feature image; then, when the third touch button 30 is input, the user terminal may generate a marker by extracting the selected feature 40 from the feature image.
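  • As an illustration only, the drag-to-select step above can be approximated by cropping the dragged rectangle out of the captured image and attaching the acquired coordinates. The sketch below uses the Pillow library; the file name, drag rectangle, and dictionary layout are hypothetical and are not taken from the disclosure.

```python
from PIL import Image  # pip install pillow


def extract_feature(image_path: str, drag_box: tuple) -> Image.Image:
    """Crop the region the user dragged, given as (left, top, right, bottom) in pixels."""
    with Image.open(image_path) as img:
        return img.crop(drag_box).copy()  # copy so the data survives closing the file


def create_marker(image_path: str, drag_box: tuple,
                  latitude: float, longitude: float) -> dict:
    """Combine the extracted feature with the acquired GPS coordinates."""
    feature = extract_feature(image_path, drag_box)
    return {
        "feature_size": feature.size,        # (width, height) of the selected region
        "feature_pixels": feature.tobytes(),
        "latitude": latitude,
        "longitude": longitude,
    }


# Hypothetical usage (the image path and drag box are placeholders):
# marker = create_marker("feature_photo.jpg", (120, 80, 420, 280), 37.5665, 126.9780)
```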
  • In step S130, the user terminal receives the personal content created by the user.
  • The personal content may include text, photos, audio, video, and the like, containing content that the user wants to leave at the corresponding place.
  • The user can create and input the personal content using the user terminal, or can input previously stored personal content.
  • In step S140, the user terminal creates an augmented reality record by linking the input personal content with the marker.
  • In step S150, the user terminal sets a viewing condition for the generated augmented reality record.
  • The viewing conditions may include a viewing permission range, a viewing start and end date and time, and photographing direction information.
  • The viewing permission range may be set to public so that the created augmented reality record is shared with everyone, to personal so that it is not shared with others at all, or to friends so that it is shared only with designated people such as family or friends.
  • The photographing direction information may be calculated at the moment the current geographic location information is obtained using the GPS module while photographing the feature.
  • The photographing direction information may include the azimuth, pitch, and roll of the user terminal at the time of photographing the feature.
  • The user terminal may include a magnetic field sensor, an acceleration sensor, a gyroscope sensor, a motion sensor, and the like, and the photographing direction information may be calculated using these sensors, as sketched below.
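  • One way to obtain the azimuth, pitch, and roll mentioned above is to combine accelerometer and magnetometer readings. The function below is a simplified, hedged sketch: it assumes raw sensor vectors are already available, picks one common axis convention, and omits tilt compensation of the compass heading, so signs and ranges would have to be adapted to the actual device and sensor framework.

```python
import math


def photographing_direction(accel, mag):
    """Estimate (azimuth, pitch, roll) in degrees from an accelerometer vector
    accel = (ax, ay, az) and a magnetometer vector mag = (mx, my, mz).
    The axis convention is an assumption, and tilt compensation is omitted,
    so the azimuth is only meaningful when the device is held roughly level."""
    ax, ay, az = accel
    mx, my, mz = mag

    # Pitch and roll from the direction of gravity.
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))

    # Heading from the horizontal components of the magnetic field,
    # normalized to the range 0..360 degrees.
    azimuth = math.degrees(math.atan2(my, mx)) % 360.0
    return azimuth, pitch, roll


# Example with made-up readings: device lying flat, magnetic field pointing "east".
print(photographing_direction((0.0, 0.0, 9.8), (0.0, 30.0, -40.0)))  # approx. (90.0, 0.0, 0.0)
```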
  • The user terminal may output a screen as shown in FIG. 3(a) in order to create an augmented reality record and set a viewing condition. That is, the screen of FIG. 3(a) may include a plurality of touch buttons 50 for inputting commands for creating and inputting personal content and a plurality of touch buttons 60 for inputting commands for setting and checking viewing conditions.
  • The user terminal may configure the augmented reality screen setting information by setting up the augmented reality screen based on the generated feature image.
  • The user terminal may output a screen as shown in FIG. 3(b), which includes a plurality of touch buttons 70 for inputting commands for setting up the augmented reality screen.
  • According to the user input, the user terminal may set a ribbon-shaped icon 71, as shown in FIG. 3(b), to be overlaid on the feature image when output.
  • The user terminal may also set, according to the user input, a special effect that plays when the icon appears or when the linked personal content is output; for example, the icon may be set to appear while approaching from a distance, or an effect such as text being handwritten in the air may be produced.
  • In step S160, once the viewing condition has been set, the user terminal uploads the augmented reality record, including the viewing condition, to the server.
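  • A record upload such as the one performed in step S160 could look roughly like the following. The endpoint URL, payload fields, and the use of the requests library are assumptions for illustration; the disclosure does not specify a server API.

```python
import json

import requests  # pip install requests


def upload_ar_record(record: dict,
                     server_url: str = "https://example.com/api/ar-records") -> dict:
    """POST the augmented reality record, including its viewing condition, to the server.
    Both the URL and the payload layout are hypothetical."""
    payload = {
        "marker": record["marker"],                      # feature + location information
        "personal_content": record["personal_content"],  # e.g. text entries
        "viewing_condition": record["viewing_condition"],
    }
    response = requests.post(
        server_url,
        data=json.dumps(payload),
        headers={"Content-Type": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()
```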
  • FIG. 4 is a flowchart illustrating a method of providing an augmented reality recording service according to another embodiment of the present invention, and FIGS. 5 and 6 are diagrams for explaining the method of providing an augmented reality recording service according to the other embodiment of the present invention of FIG. 4.
  • The method of providing an augmented reality recording service according to another embodiment of the present invention shown in FIG. 4 describes how the user or another user can later view the augmented reality record generated as in FIG. 1 and uploaded to the server.
  • Hereinafter, the method of providing an augmented reality recording service according to another embodiment of the present invention is described with reference to FIG. 4, with additional reference to FIGS. 5 and 6.
  • In step S410, the user terminal outputs a marker map indicating the locations where markers have been generated, in response to the user's input of a marker search command.
  • When the second touch button 20 for inputting a marker checking command is input on the intro screen of FIG. 2(a), the user terminal can output a search screen, as shown in FIG. 5(a), on which marker locations are displayed as icons.
  • The search screen may include touch buttons 80, 90, and 100 for inputting a search command for markers whose viewing permission range is set to personal, friends, or public, respectively.
  • The user terminal may display each marker location with an icon whose color or shape differs according to the viewing permission range of that marker, i.e. personal, friends, or public, as sketched below. Thereafter, when the user selects one of the icons displayed on the marker map of FIG. 5(a), the user terminal may enlarge and display the marker map at a preset ratio around the selected icon, as shown in FIG. 5(b).
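  • Showing only the markers a given viewer is allowed to see amounts to a simple filter over the stored records; the function below is a hedged sketch with assumed field names, not the actual implementation.

```python
def markers_visible_to(records, viewer_id):
    """Return (marker, permission) pairs that the given viewer may see on the marker map.
    Each record is assumed to be a dict with 'owner', 'permission',
    'allowed_friends', and 'marker' keys (an illustrative layout)."""
    visible = []
    for rec in records:
        perm = rec["permission"]
        if perm == "public":
            visible.append((rec["marker"], perm))
        elif perm == "friends" and viewer_id in rec["allowed_friends"]:
            visible.append((rec["marker"], perm))
        elif perm == "personal" and viewer_id == rec["owner"]:
            visible.append((rec["marker"], perm))
    return visible


# Example with two illustrative records.
records = [
    {"owner": "alice", "permission": "public", "allowed_friends": [], "marker": "M1"},
    {"owner": "alice", "permission": "friends", "allowed_friends": ["bob"], "marker": "M2"},
]
print(markers_visible_to(records, "bob"))  # [('M1', 'public'), ('M2', 'friends')]
```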
  • The user can check the marker map output on the user terminal and move to the location where the desired marker was created.
  • When the user comes within a preset distance of the location where the desired marker exists, the user terminal may output a notification message together with a marker map indicating the location where that marker was created.
  • In this case, the user terminal may output a marker map enlarged at a preset ratio around the icon indicating the location of the corresponding marker.
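  • The "within a preset distance" check can be implemented with the haversine formula over the GPS coordinates. The sketch below is illustrative; the 50 m threshold is an arbitrary example, not a value taken from the disclosure.

```python
import math


def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS-84 coordinates (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def should_notify(user_pos, marker_pos, preset_distance_m=50.0):
    """True when the user has come within the preset distance of the marker location."""
    return distance_m(*user_pos, *marker_pos) <= preset_distance_m


# Example: a user roughly 20 m away from a marker.
print(should_notify((37.56650, 126.97800), (37.56668, 126.97800)))  # True
```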
  • In step S420, when the user has moved to the location where the desired marker exists and inputs a scan command, the user terminal activates the camera and scans whether a feature corresponding to the desired marker is being photographed.
  • The user terminal may determine that the feature corresponding to the desired marker has been detected when that feature is photographed in a direction matching the photographing direction information included in the viewing conditions of the augmented reality record corresponding to the desired marker; a simple angle comparison of this kind is sketched below.
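  • Matching the current photographing direction against the direction stored with the marker amounts to comparing angles modulo 360 degrees. The tolerance values below are arbitrary illustrative choices, not values from the disclosure.

```python
def angle_diff(a, b):
    """Smallest absolute difference between two angles in degrees (result in 0..180)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)


def direction_matches(current, stored, azimuth_tol=15.0, tilt_tol=10.0):
    """Return True when the current (azimuth, pitch, roll) is close enough to the
    photographing direction stored with the marker. Tolerances are assumptions."""
    cur_az, cur_pitch, cur_roll = current
    sto_az, sto_pitch, sto_roll = stored
    return (angle_diff(cur_az, sto_az) <= azimuth_tol
            and abs(cur_pitch - sto_pitch) <= tilt_tol
            and abs(cur_roll - sto_roll) <= tilt_tol)


# Example: the terminal points almost the same way as when the marker was created.
print(direction_matches((92.0, -1.5, 0.5), (90.0, 0.0, 0.0)))  # True
```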
  • In step S430, when a feature corresponding to the marker desired by the user is detected, the user terminal outputs an augmented reality screen according to the preset augmented reality screen setting information.
  • In step S440, when an execution command is input by the user through the output augmented reality screen, the user terminal outputs the personal content included in the augmented reality record.
  • For example, when a feature corresponding to the marker of the selected icon is detected, the user terminal may output an augmented reality screen in which a ribbon-shaped icon 71 is overlaid on the feature image according to the augmented reality screen setting information, as shown in FIG. 6(b). Subsequently, the user terminal may output the personal content 72 when the user inputs the execution command.
  • FIG. 7 is a diagram schematically illustrating the configuration of a user terminal that performs a method of providing an augmented reality recording service according to an embodiment of the present invention.
  • The user terminal includes a processor 710, a memory 720, a camera unit 730, a location information acquisition unit 740, a sensor unit 750, a display unit 760, a communication unit 770, and an interface unit 780.
  • The processor 710 may be a CPU or another semiconductor device that executes the processing instructions stored in the memory 720.
  • The memory 720 may include various types of volatile or nonvolatile storage media.
  • The memory 720 may include ROM, RAM, or the like.
  • The memory 720 may store instructions for performing the method of providing an augmented reality recording service according to an embodiment of the present invention.
  • The camera unit 730 is activated in response to the input of a driving command signal and captures an object, thereby obtaining an image.
  • The camera unit 730 may generate a feature image by photographing a feature chosen by the user according to the user's manipulation.
  • The location information acquisition unit 740 acquires the current location information of the user terminal.
  • The location information acquisition unit 740 may acquire the current geographic location information of the user terminal using a GPS module.
  • The sensor unit 750 measures the orientation of the user terminal.
  • The sensor unit 750 may include a magnetic field sensor, an acceleration sensor, a gyroscope sensor, a motion sensor, and the like, and may calculate the photographing direction information of the user terminal using these sensors.
  • The display unit 760 displays information received or processed by the user terminal.
  • The display unit 760 may be implemented as a liquid crystal display or the like.
  • The display unit 760 may also serve as a user input unit when implemented as a touch screen in which sensors for detecting touch motions form a layered structure.
  • The communication unit 770 is a means for transmitting data to and receiving data from other devices over a communication network.
  • The interface unit 780 may include a network interface for accessing a network and a user interface.
  • Each component described above can be identified as a respective process, and the processes of the above-described embodiment can be easily understood from the viewpoint of the components of the device.
  • The technical content described above may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium.
  • The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • The program instructions recorded on the medium may be specially designed and configured for the embodiments, or may be known and available to those skilled in computer software.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine language code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • The hardware device may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Tourism & Hospitality (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method for providing an augmented reality recording service and to a user terminal. The method for providing an augmented reality recording service comprises the steps of: generating a feature image by a user of the user terminal photographing a desired feature, and acquiring current location information; generating a marker by extracting a feature selected by the user from the generated feature image and combining the extracted feature with the acquired location information; receiving input personal content created by the user; generating an augmented reality record by linking the input personal content with the generated marker; and setting a viewing condition for the generated augmented reality record.
PCT/KR2020/014096 2019-10-18 2020-10-15 Method for providing an augmented reality recording service, and user terminal WO2021075878A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190129879A KR102204721B1 (ko) 2019-10-18 2019-10-18 Method for providing an augmented reality recording service, and user terminal
KR10-2019-0129879 2019-10-18

Publications (1)

Publication Number Publication Date
WO2021075878A1 (fr) 2021-04-22

Family

ID=74237389

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/014096 WO2021075878A1 (fr) 2019-10-18 2020-10-15 Method for providing an augmented reality recording service, and user terminal

Country Status (2)

Country Link
KR (1) KR102204721B1 (fr)
WO (1) WO2021075878A1 (fr)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101253644B1 (ko) 2012-12-28 2013-04-11 주식회사 맥스트 Apparatus and method for outputting augmented reality content using location information

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050078136A (ko) * 2004-01-30 2005-08-04 삼성전자주식회사 Method for providing regional information using augmented reality, and regional information service system therefor
KR20150126289A (ko) * 2014-05-02 2015-11-11 한국전자통신연구원 Navigation device providing augmented reality based social network service information, metadata processing device, and method therefor
KR20180106811A (ko) * 2017-03-20 2018-10-01 ㈜라이커스게임 Method for implementing an augmented reality image using vectors

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LEE, Seunghyeon (IT columnist): "Graffiti in the Space? Social Apps in the AR Era" (non-official translation), Venture Square, 21 November 2017, pages 1-2. [online]. [Retrieved on 28 December 2020]. Retrieved from <URL: https://www.venturesquare.net/756192>. Y 4-6 *
ANONYMOUS: "Graffiti art now draws in AR... clean...", 16 October 2019 (2019-10-16), pages 1 - 1, XP055802237, Retrieved from the Internet <URL:https://blog.naver.com/tech-plus/221679212598> [retrieved on 20210506] *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115439635A (zh) * 2022-06-30 2022-12-06 亮风台(上海)信息科技有限公司 Method and device for presenting marker information of a target object
CN115439635B (zh) * 2022-06-30 2024-04-26 亮风台(上海)信息科技有限公司 Method and device for presenting marker information of a target object

Also Published As

Publication number Publication date
KR102204721B1 (ko) 2021-01-19

Similar Documents

Publication Publication Date Title
US9699375B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
US9558559B2 (en) Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
JP2017187850A Image processing system, information processing apparatus, and program
WO2015174729A1 Augmented reality providing method and system for providing spatial information, and recording medium and file distribution system
WO2011139115A2 Method for accessing information on characters by using augmented reality, server, and computer-readable recording medium
US8094242B2 (en) Object management apparatus, mobile terminal, and object management method
WO2014163422A1 Method and apparatus for creating and editing an image into which an object is inserted
EP2676186A2 Method and mobile apparatus for displaying augmented reality content
CN110869888A Cloud-based system and method for creating a virtual tour
WO2019017582A1 Method and system for collecting augmented reality content templates based on a cloud source and automatically generating augmented reality content
EP3070681A1 Display control device, method, and program
WO2011040710A2 Method, terminal, and computer-readable recording medium for performing a visual search according to the movement or position of the terminal
WO2015102126A1 Method and system for managing an electronic album using face recognition technology
WO2019156543A2 Method for determining a representative image of a video, and electronic device for implementing the method
WO2017200153A1 Method and system for correcting a playback area using tilt information of a user terminal when playing back a 360-degree image
WO2020054978A1 Image generation device and method
WO2021075878A1 Method for providing an augmented reality recording service, and user terminal
JP2010011076A Image processing apparatus and image processing method
WO2014148691A1 Mobile device and method for controlling the same
Zhang et al. Annotating and navigating tourist videos
JP6617547B2 Image management system, image management method, and program
JP2013097773A Information processing apparatus, information processing method, and program
WO2020045909A1 Apparatus and method for integrated user interface software for multiple selection and operation of non-consecutive segmented information
CN111242107B Method and electronic device for setting a virtual object in space
KR101934799B1 Method and system for generating new content using panoramic images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20875880

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08.09.2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20875880

Country of ref document: EP

Kind code of ref document: A1