WO2019098450A1 - System for guiding a user's gaze by means of a mobile device, and associated method - Google Patents

System for guiding a user's gaze by means of a mobile device, and associated method

Info

Publication number
WO2019098450A1
WO2019098450A1 (PCT/KR2017/014852; KR2017014852W)
Authority
WO
WIPO (PCT)
Prior art keywords
user
image data
wide
mobile device
virtual reality
Prior art date
Application number
PCT/KR2017/014852
Other languages
English (en)
Korean (ko)
Inventor
우운택
김기홍
전준기
Original Assignee
한국과학기술원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국과학기술원
Publication of WO2019098450A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30: Image reproducers
    • H04N 13/366: Image reproducers using viewer tracking
    • H04N 13/383: Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81: Monomedia components thereof
    • H04N 21/816: Monomedia components thereof involving special video data, e.g. 3D video

Definitions

  • An object of the present invention is to provide virtual reality content to a user by complementarily using a high-definition camera and a wide-angle camera mounted on a mobile device.
  • the user gaze guidance system using a mobile device includes: a mobile device that acquires an image including virtual reality content using a high-definition camera and a wide-angle camera; an acquisition unit that generates composite data by synthesizing the high-definition image data captured by the high-definition camera and the wide-angle image data captured by the wide-angle camera, and that tracks the motion of the mobile device; and a user interface (UI) providing unit that provides a user interface for guiding the user's gaze to the area of the high-definition image data based on the composite data and the motion tracking result.
  • the mobile device may include the wide-angle camera for photographing the surrounding environment including the virtual reality content, and the high-definition camera for intensively photographing the important area of the virtual reality content.
  • the mobile device may acquire, according to the movement of the user, the high-definition image data captured by the high-definition camera, the wide-angle image data captured by the wide-angle camera, and the image including the virtual reality content.
  • the acquisition unit may track the movement of the mobile device based on the virtual reality content, through the high-definition image data and the wide-angle image data received from the mobile device.
  • the user interface providing unit may provide the user interface, including visual information and auditory information, when the user's gaze deviates beyond a predetermined range from the area of the high-definition image data captured by the high-definition camera.
  • the user interface providing unit may provide at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, so that the user's gaze follows the movement of the high-definition image data and is guided to stay in its area.
  • the user interface providing unit may provide the user interface including at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, to present the trajectory of the high-definition image data over time within the virtual reality content.
  • in another embodiment, the user gaze guidance system using a mobile device includes: a mobile device that acquires an image including virtual reality content using a high-definition camera and a wide-angle camera; an acquisition unit that generates composite data by synthesizing the high-definition image data captured by the high-definition camera and the wide-angle image data captured by the wide-angle camera, and that tracks the movement of the mobile device; a user interface providing unit that provides a user interface (UI) for guiding the user's gaze; and a display unit that displays the user interface on the virtual reality content.
  • the display unit may display the virtual reality content and the user interface on a display device located in the space in which the virtual reality content is provided.
  • the display device may be at least one of a head-mounted display (HMD), a digital display, a liquid crystal display, and a monitor.
  • the present invention provides a method of operating a user gaze guidance system using a mobile device according to an embodiment of the present invention, the method comprising: receiving, from a mobile device that acquires an image including virtual reality content using a high-definition camera and a wide-angle camera, the high-definition image data captured by the high-definition camera and the wide-angle image data captured by the wide-angle camera; generating composite data by synthesizing the received data and tracking the movement of the mobile device; and providing, based on the composite data and the motion tracking result, a user interface (UI) for guiding the user's gaze to the area of the high-definition image data.
  • synthesizing the composite data and tracking the motion of the mobile device may combine the high-definition image data and the wide-angle image data using calibration information between the high-definition camera and the wide-angle camera.
  • the step of providing the user interface may provide the user interface, including visual information and auditory information, when the user's gaze deviates beyond a predetermined range from the area of the high-definition image data captured by the high-definition camera.
  • the step of providing the user interface may provide at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, so that the user's gaze follows the movement of the high-definition image data and is guided to stay in its area.
  • the providing of the user interface may provide, within the virtual reality content, the user interface including at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, to present the trajectory of the high-definition image data over time.
  • in another embodiment, a method of operating a user gaze guidance system using a mobile device includes the steps of: receiving the high-definition image data and the wide-angle image data captured by the wide-angle camera to synthesize composite data, and tracking the motion of the mobile device; providing, based on the composite data and the motion tracking result, a user interface (UI) for guiding the user's gaze to the area of the high-definition image data; and displaying the user interface on the virtual reality content.
  • the step of displaying the user interface may display the virtual reality content and the user interface on at least one display device, such as a head-mounted display (HMD), a digital display, a liquid crystal display, or a monitor.
  • a high-definition camera and a wide-angle camera mounted on a mobile device can be used complementarily to provide virtual reality content to a user.
  • a user interface and virtual reality content can be provided to a user simultaneously in order to solve the problem caused by the mismatch between the motion of the mobile device and the motion of the user's gaze.
  • FIG. 1 is a block diagram illustrating the detailed configuration of a user gaze guidance system using a mobile device according to an embodiment of the present invention.
  • FIG. 2 illustrates an example of a process of providing a user interface for guiding a user's gaze using a user's gaze guidance system according to an embodiment of the present invention.
  • FIG. 3 illustrates an example of photographing of wide-angle image data and high-quality image data according to an embodiment of the present invention.
  • FIG. 4 illustrates an example of synthesis of wide-angle image data and high-quality image data according to an embodiment of the present invention.
  • FIG. 5 illustrates a display example of a virtual reality content and a user interface according to an embodiment of the present invention.
  • FIG. 6 illustrates an example of providing a user interface according to a user's gaze.
  • FIG. 7 illustrates an example of a user's gaze guidance system for providing 360-degree virtual reality content according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a method of guiding a user's gaze using a mobile device according to an embodiment of the present invention.
  • hereinafter, a user gaze guidance system and method using a mobile device including a mobile device, an acquisition unit, and a user interface providing unit according to an embodiment of the present invention, and a user gaze guidance system and method using a mobile device further including a display unit, will be described in more detail with reference to FIGS. 1 to 8.
  • a user's gaze guidance system using a mobile device guides a user's gaze to a virtual reality content using a high-definition camera and a wide-angle camera included in a mobile device.
  • a user's gaze guidance system 100 includes a mobile device 110, an acquiring unit 120, and a user interface providing unit 130.
  • the mobile device 110 acquires an image including virtual reality content using a high-definition camera and a wide-angle camera.
  • the mobile device 110 used in the present invention may be at least one of a personal computer (PC), a laptop computer, a smartphone, a tablet, and a wearable computer, and is a device including a wide-angle camera for photographing the surrounding environment including the virtual reality content and a high-definition camera for intensively photographing the important area of the virtual reality content.
  • a user in the space can acquire video of the virtual reality content according to his or her movement while carrying the mobile device 110.
  • the mobile device 110 can acquire the high-definition image data and the wide-angle image data captured by the high-definition camera and the wide-angle camera according to the movement of the user, together with the image including the virtual reality content.
  • here, the high-definition image data refers to image data in which the important areas of the virtual reality content are intensively photographed using the high-definition camera, and the wide-angle image data refers to image data of the surrounding environment including the virtual reality content.
  • the acquisition unit 120 generates composite data obtained by synthesizing the high-definition image data photographed by the high-definition camera and the wide-angle image data photographed by the wide-angle camera, and tracks the motion of the mobile device 110.
  • the acquisition unit 120 can synthesize the high-definition image data and the wide-angle image data by using the calibration information between the high-definition camera and the wide-angle camera.
  • the calibration information represents calibration data for at least two cameras (the wide-angle camera and the high-definition camera) at multi-view positions, and the wide-angle image data and the high-definition image data can each be aligned through image analysis, homographic transformation, and feature detection processes.
  • the above-described calibration information and camera correction are not limited, since existing image correction and composition techniques may be used.
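Although the patent leaves the calibration and composition techniques open, the homographic transformation it mentions can be sketched in a few lines. Everything below is illustrative and not from the document: a 3x3 homography H maps pixel coordinates of the high-definition image into the wide-angle frame, which is the geometric core of placing the high-definition region inside the composite data.

```python
def apply_homography(H, point):
    """Map an (x, y) pixel through a 3x3 homography in homogeneous coordinates."""
    x, y = point
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)

def hd_region_in_wide(H, hd_width, hd_height):
    """Project the HD image corners into wide-angle coordinates to find
    where the high-definition region lands in the composite frame."""
    corners = [(0, 0), (hd_width, 0), (hd_width, hd_height), (0, hd_height)]
    return [apply_homography(H, c) for c in corners]

# Hypothetical calibration: the HD image is scaled by 0.5 and shifted by
# (480, 270) into a 1920x1080 wide-angle frame.
H_CALIB = [[0.5, 0.0, 480.0],
           [0.0, 0.5, 270.0],
           [0.0, 0.0, 1.0]]
```

In practice such a homography would be estimated from matched features between the two cameras (the feature detection step mentioned above), not hard-coded.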
  • the acquisition unit 120 may track the motion of the mobile device 110 through the high-definition image data and the wide-angle image data received from the mobile device 110 based on the virtual reality content.
  • the acquisition unit 120 may track the movement and current position of the mobile device 110 through the high-definition image data and the wide-angle image data captured by the mobile device 110 in the space providing the virtual reality content, and may also acquire the viewpoint that the user is watching.
  • the user interface providing unit 130 provides a user interface (UI) for guiding the user's gaze to the area of the high-definition image data based on the composite data and the motion tracking result.
  • the user interface providing unit 130 provides at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, so that the user's gaze follows the movement of the high-definition image data and is guided to stay in its area.
  • the user interface providing unit 130 may provide a user interface including at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, to present the trajectory of the high-definition image data over time within the virtual reality content.
  • the virtual reality content provides images over time, and there can be important areas (high-definition image data) on which the user's attention should be focused at each point in time.
  • the user interface providing unit 130 may display visual information or provide auditory information through an output module to provide a trajectory of the high-quality image data according to the passage of time.
  • the user interface providing unit 130 may provide, over the virtual reality content, a user interface including visual information, such as an arrow or an emphasis, and auditory or haptic information, such as a sound effect or vibration, so that the user's gaze stays in the area of the high-definition image data.
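As a rough illustration of the guidance decision described above (a sketch, not the patented implementation), the high-definition region can be modeled as a rectangle in the composite frame; once the gaze point drifts more than a tolerance outside it, a directional cue is emitted. All names and thresholds here are hypothetical:

```python
def guidance_cue(gaze, region, tolerance=50):
    """Return None while the gaze is inside (or within `tolerance` pixels of)
    the HD region; otherwise the side on which the gaze left the region,
    so the UI can draw an arrow pointing back toward it."""
    gx, gy = gaze
    left, top, right, bottom = region
    # Horizontal and vertical distances outside the rectangle (0 if inside).
    dx = (left - gx) if gx < left else (gx - right) if gx > right else 0
    dy = (top - gy) if gy < top else (gy - bottom) if gy > bottom else 0
    if max(dx, dy) <= tolerance:
        return None  # gaze is within the allowed range: no guidance needed
    if dx >= dy:
        return "left" if gx < left else "right"
    return "up" if gy < top else "down"
```

The same predicate could also gate the auditory cue, e.g. playing a sound effect only on the transition from `None` to a direction.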
  • the virtual reality content may include at least one of a 2D image, a 3D image, sound content, video content, virtual content, and uniform resource locator (URL) information.
  • in another embodiment, the user gaze guidance system 100 may further include a display unit 140.
  • FIG. 2 illustrates an example of a process of providing a user interface for guiding a user's gaze using a user's gaze guidance system according to an embodiment of the present invention.
  • FIG. 2 shows an example of acquiring high-definition image data and wide-angle image data through a mobile device, synthesizing them into composite data, and guiding the user's gaze in a user gaze guidance system according to an embodiment of the present invention.
  • in a first step 200, virtual reality content is captured using the mobile device.
  • the mobile device includes a wide-angle camera with a wide-angle lens and a high-definition camera with a high-resolution lens.
  • the wide-angle camera captures the surrounding environment including the virtual reality content, while the high-definition camera intensively captures the important area.
  • in a second step 300, the high-definition image data and the wide-angle image data captured by the mobile device may be synthesized (A). For example, composite data can be generated from the high-definition image data (HD) and the wide-angle image data (WIDE) by using calibration information (calibration data) between the high-definition camera and the wide-angle camera.
  • in a third step 400, a user interface (UI) for guiding the user's gaze to the area of the high-definition image data is provided (B) based on the composite data and the motion tracking result of the mobile device, and is displayed simultaneously with the virtual reality content.
  • a high-definition camera with a high-definition lens in the mobile device 210 can capture high-definition image data by intensively photographing an important region of the virtual reality content.
  • the wide-angle camera with the wide-angle lens can acquire the wide-angle image data by photographing the surrounding environment, including the important area, as a whole. At this time, the high-definition image data intensively capturing the important region and the wide-angle image data capturing the surrounding environment can change in real time according to the user's movement.
  • the set of the wide-angle camera and the high-definition camera included in the mobile device 210 shown in FIG. 3 is depicted as a single camera, but there may be a plurality of cameras depending on the embodiment.
  • the types of the wide angle camera and the high quality camera are not limited.
  • the wide-angle camera photographs the surrounding environment, as with conventional virtual reality content and images, while the high-definition camera captures the important areas to be provided to the user.
  • the gaze guidance system can use the two cameras complementarily.
  • FIG. 4 illustrates an example of synthesis of wide-angle image data and high-quality image data according to an embodiment of the present invention.
  • the user gaze guidance system using a mobile device generates composite data 330 by synthesizing the wide-angle image data 310 captured by the wide-angle camera and the high-definition image data 320 captured by the high-definition camera.
  • the user gaze guidance system utilizes calibration information between the wide-angle camera and the high-definition camera to synthesize, through various image processing techniques, the wide-angle image data 310 (WIDE), which captures a relatively wide environment including the virtual reality content, and the high-definition image data 320 (HD), which intensively captures the important area.
  • the image processing technique is not limited, since techniques commonly used in related fields may be employed.
  • the calibration information represents calibration data for at least two cameras (the wide-angle camera and the high-definition camera) at multi-view positions, and the wide-angle image data and the high-definition image data can each be aligned through image analysis, homographic transformation, and feature detection processes.
  • the above-described calibration information and camera correction are not limited, since existing image correction and composition techniques may be used.
  • FIG. 5 illustrates a display example of a virtual reality content and a user interface according to an embodiment of the present invention.
  • referring to FIG. 5, the user views the area 410 of the first high-definition image data in the virtual reality content, but movement 420 of the mobile device may occur according to the movement of the user, and the focused area (high-definition image data) may change accordingly. That is, according to the movement 420 of the mobile device, the area 430 of the second high-definition image data is generated, but the user's viewpoint may not follow the movement of the mobile device to the area 430 of the second high-definition image data.
  • in order to allow the user's gaze to stay in the area 430 of the second high-definition image data, which is the important area, the user gaze guidance system can highlight the edge of the area 430 in yellow and guide the gaze from the area 410 of the first high-definition image data to the area 430 of the second high-definition image data. Depending on the embodiment, it may further provide auditory information such as a sound effect or music.
  • further, when the user's gaze deviates from the area 430 of the second high-definition image data, the user gaze guidance system may guide the user's gaze to stay in the area 430 through a simple diagram (e.g., an arrow), a voice signal, or another user interface (UI), in addition to the highlight.
  • FIG. 6 illustrates an example of providing a user interface according to a user's gaze.
  • referring to FIG. 6, the user gaze guidance system can display the important area 640 together with the virtual reality content by using a user interface including visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music.
  • the user gaze guidance system can display the trajectory of the important area 620 over time with a user interface, so that the user 601 can be informed of where to look when moving to a desired image frame.
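One way to realize such a time-based trajectory display (a sketch under assumed data, not the patented method) is to keep per-keyframe centres of the important area and interpolate between them for the current playback time; the keyframe values below are purely illustrative:

```python
from bisect import bisect_right

def region_center_at(trajectory, t):
    """trajectory: sorted list of (time_sec, (x, y)) keyframes for the centre
    of the important area; returns the linearly interpolated centre at time t,
    clamped to the first/last keyframe outside the covered range."""
    times = [k[0] for k in trajectory]
    if t <= times[0]:
        return trajectory[0][1]
    if t >= times[-1]:
        return trajectory[-1][1]
    i = bisect_right(times, t)
    t0, (x0, y0) = trajectory[i - 1]
    t1, (x1, y1) = trajectory[i]
    a = (t - t0) / (t1 - t0)  # interpolation weight within the segment
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

# Hypothetical trajectory: the important area drifts right, then up.
TRAJ = [(0.0, (960.0, 540.0)), (2.0, (1200.0, 540.0)), (4.0, (1200.0, 300.0))]
```

The UI layer would query `region_center_at` each frame to position the guidance marker or arrow along the trajectory.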
  • in another embodiment, in the case of 360-degree video virtual reality content 610 combined with other interactions, the user gaze guidance system can provide an appropriate user interface that matches the interaction and the content.
  • the digital display may be a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, or an electronic paper display.
  • in a user gaze guidance system that provides virtual reality content through a display device, the user can capture the virtual reality content while carrying the mobile device.
  • referring to FIG. 7, a user can capture virtual reality content in a 360-degree space using the mobile device, and can acquire wide-angle image data and high-definition image data for the 360-degree virtual reality content through the mobile device.
  • as the user moves 710, the high-definition image data 720, which is the important area, can also change.
  • the user gaze guidance system can provide a user interface for guiding the user's gaze to the high-definition image data 720 according to the user's movement 710.
  • the user gaze guidance system can provide a user interface including at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, and can give the user a 360-degree virtual reality content experience by guiding the user's gaze to stay on the high-definition image data 720.
  • FIG. 8 is a flowchart illustrating a method of guiding a user's gaze using a mobile device according to an embodiment of the present invention.
  • the operation method shown in FIG. 8 is performed by the user gaze guidance system using a mobile device according to the embodiment of the present invention shown in FIG. 1.
  • in step 810, the high-definition image data captured by the high-definition camera and the wide-angle image data captured by the wide-angle camera are received from the mobile device, which acquires an image including the virtual reality content using the two cameras; composite data is synthesized; and the movement of the mobile device is tracked.
  • step 810 may synthesize the high-definition image data and the wide-angle image data by using the calibration information between the high-definition camera and the wide-angle camera.
  • step 810 may track the movement of the mobile device based on the virtual reality content, through the high-definition image data and the wide-angle image data received from the mobile device. For example, in step 810, the motion and current position of the mobile device can be tracked through the high-definition image data and the wide-angle image data captured by the mobile device in the space providing the virtual reality content, and the viewpoint that the user is watching can also be acquired.
  • in step 820, a user interface (UI) for guiding the user's gaze to the area of the high-definition image data is provided based on the composite data and the motion tracking result.
  • in step 820, when the user's gaze deviates beyond a predetermined range from the area of the high-definition image data captured by the high-definition camera, the user interface including visual information and auditory information may be provided based on the composite data and the motion tracking result obtained in step 810.
  • further, at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, may be provided so that the user's gaze follows the movement of the high-definition image data and stays within its area.
  • in addition, step 820 may provide a user interface including at least one of visual information, such as a diagram, an effect, or an emphasis, and auditory information, such as a sound effect or music, to present the trajectory of the high-definition image data over time within the virtual reality content.
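The flow of steps 810 to 830 can be summarized in a rough sketch; every name below is a hypothetical placeholder, and the synthesis and tracking are reduced to stand-ins so that only the control flow of the method remains visible:

```python
def run_pipeline(hd_frame, wide_frame, gaze, hd_region):
    """Toy end-to-end pass over one frame: synthesize, decide guidance, display."""
    # Step 810: synthesize composite data (here: just pair the frames) and
    # track motion (here: the HD region's position stands in for tracking).
    composite = {"hd": hd_frame, "wide": wide_frame, "hd_region": hd_region}

    # Step 820: decide whether gaze guidance is needed.
    left, top, right, bottom = hd_region
    gx, gy = gaze
    inside = left <= gx <= right and top <= gy <= bottom
    ui = None if inside else {"cue": "arrow", "target": hd_region}

    # Step 830: overlay the UI (if any) on the content for display.
    return {"content": composite, "overlay": ui}
```

A real system would run this per frame, with step 810 replaced by actual calibration-based image composition and device tracking.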
  • according to an embodiment, the method of guiding a user's gaze using a mobile device further includes step 830.
  • in step 830, the user interface is displayed on the virtual reality content.
  • step 830 may display the virtual reality content and the user interface on at least one display device, such as a head-mounted display (HMD), a digital display, a liquid crystal display, or a monitor. Depending on the embodiment, they may be displayed on at least one of the HMD and the display device.
  • the apparatus described above may be implemented as hardware components, software components, and/or a combination of hardware components and software components.
  • the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable array (FPA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to execution of the software.
  • for convenience of description, the processing device is sometimes described as being used singly, but those skilled in the art will recognize that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing unit may comprise a plurality of processors or one processor and one controller.
  • Other processing configurations are also possible, such as a parallel processor.
  • the software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device independently or collectively.
  • the software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
  • the software may be distributed over a networked computer system and stored or executed in a distributed manner.
  • the software and data may be stored on one or more computer readable recording media.
  • the method according to an embodiment may be implemented in the form of a program command that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • the program instructions recorded on the medium may be those specially designed and configured for the embodiments, or may be those known and available to those skilled in the art of computer software.
  • examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • program instructions include machine language code such as those produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a system for guiding a user's gaze toward virtual reality content by means of a high-definition camera and a wide-angle camera contained in a mobile device, and to an associated method. In order to solve the problem caused by the mismatch between the movement of the mobile device and the movement of the user's gaze, the system and method according to the invention can simultaneously provide the user with an appropriate user interface and virtual reality content.
PCT/KR2017/014852 2017-11-15 2017-12-15 System for guiding a user's gaze by means of a mobile device, and associated method WO2019098450A1 (fr)
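The compositing idea summarized in the abstract — presenting the high-definition camera's narrow, detailed view inside the wide-angle camera's frame around the guided gaze point — can be sketched as follows. This is a minimal illustration, not the patented method: the function name, the H × W × 3 `uint8` array layout, and the centring convention are all assumptions made for the example.

```python
import numpy as np

def composite_gaze_region(wide_frame, hires_patch, gaze_xy):
    """Paste a high-resolution patch into a wide-angle frame, centred on
    the user's gaze point (x, y). A sketch of foveated compositing:
    detail where the user is guided to look, context everywhere else."""
    H, W, _ = wide_frame.shape
    ph, pw, _ = hires_patch.shape
    # Clamp the placement so the patch stays fully inside the wide frame.
    x0 = min(max(gaze_xy[0] - pw // 2, 0), W - pw)
    y0 = min(max(gaze_xy[1] - ph // 2, 0), H - ph)
    out = wide_frame.copy()
    out[y0:y0 + ph, x0:x0 + pw] = hires_patch
    return out
```

In a real pipeline the two cameras would first be registered (their intrinsics and relative pose calibrated) so the patch lands on the matching region of the wide-angle image; the sketch skips that step entirely.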

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0152049 2017-11-15
KR1020170152049A KR101923322B1 (ko) 2017-11-15 2017-11-15 System for guiding a user's gaze using a mobile device, and method therefor

Publications (1)

Publication Number Publication Date
WO2019098450A1 true WO2019098450A1 (fr) 2019-05-23

Family

ID=64561282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/014852 WO2019098450A1 (fr) 2017-11-15 2017-12-15 System for guiding a user's gaze by means of a mobile device, and associated method

Country Status (2)

Country Link
KR (1) KR101923322B1 (fr)
WO (1) WO2019098450A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE1950971A1 (en) * 2019-08-23 2021-02-24 Your Speech Factory Ab Electronic device and method for conducting a users gaze direction

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160148623A (ko) * 2014-04-29 2016-12-26 Microsoft Technology Licensing, LLC Display device viewer gaze attraction
KR20170020069A (ko) * 2015-08-13 2017-02-22 LG Electronics Inc. Mobile terminal and image capturing method thereof
KR20170023491A (ko) * 2015-08-24 2017-03-06 LG Electronics Inc. Camera and virtual reality system including the same
KR20170035958A (ko) * 2014-07-25 2017-03-31 Microsoft Technology Licensing, LLC Gaze-based object placement within a virtual reality environment
KR101748401B1 (ko) * 2016-08-22 2017-06-16 강두환 Virtual reality attraction control method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6147709A (en) * 1997-04-07 2000-11-14 Interactive Pictures Corporation Method and apparatus for inserting a high resolution image into a low resolution interactive image to produce a realistic immersive experience

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160148623A (ko) * 2014-04-29 2016-12-26 Microsoft Technology Licensing, LLC Display device viewer gaze attraction
KR20170035958A (ko) * 2014-07-25 2017-03-31 Microsoft Technology Licensing, LLC Gaze-based object placement within a virtual reality environment
KR20170020069A (ko) * 2015-08-13 2017-02-22 LG Electronics Inc. Mobile terminal and image capturing method thereof
KR20170023491A (ko) * 2015-08-24 2017-03-06 LG Electronics Inc. Camera and virtual reality system including the same
KR101748401B1 (ko) * 2016-08-22 2017-06-16 강두환 Virtual reality attraction control method and system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE1950971A1 (en) * 2019-08-23 2021-02-24 Your Speech Factory Ab Electronic device and method for conducting a users gaze direction
WO2021040602A1 (fr) * 2019-08-23 2021-03-04 Your Speech Factory Ab Electronic device and method for eye contact training

Also Published As

Publication number Publication date
KR101923322B1 (ko) 2018-11-28

Similar Documents

Publication Publication Date Title
WO2020122488A1 (fr) Camera-based mixed reality glasses apparatus and mixed reality display method
AU2016324578B2 (en) Camera module including multi-lens and electronic device having the same
WO2018143770A1 (fr) Electronic device for creating a panoramic image or a moving image, and method therefor
EP3475924A1 (fr) Method, apparatus and system for sharing a virtual reality display region
WO2020111426A1 (fr) Method and system for presenting moving images or videos corresponding to still images
WO2013077643A1 (fr) Apparatus and method for providing an augmented reality service to a mobile terminal
JP6834976B2 (ja) Multi-camera system, method for controlling a multi-camera system, and camera
WO2015108232A1 (fr) Portable device and method for controlling the same
WO2018174505A1 (fr) Method and apparatus for generating video content
WO2018088730A1 (fr) Display apparatus and control method thereof
WO2018021707A1 (fr) VR video advertisement system and VR advertisement production system
WO2015030307A1 (fr) Head-mounted display (HMD) device and method for controlling the same
WO2013172636A1 (fr) Display apparatus and control method thereof
WO2016021925A1 (fr) Multi-view image display apparatus and control method thereof
WO2018084536A1 (fr) Server, method and user terminal for providing time-slice images
WO2019035581A1 (fr) Server, display device and control method therefor
WO2017026705A1 (fr) Electronic device for generating a 360-degree three-dimensional image, and method therefor
WO2021167374A1 (fr) Video search device and network surveillance camera system including the same
WO2019054611A1 (fr) Electronic device and operating method therefor
WO2017119575A1 (fr) Image capturing device and image capturing method
WO2018012727A1 (fr) Display apparatus and recording medium
WO2018030795A1 (fr) Camera device, display device and method for correcting motion in the device
WO2019098450A1 (fr) System for guiding a user's gaze by means of a mobile device, and associated method
WO2019160237A1 (fr) Electronic device and method for controlling the display of images
WO2023234532A1 (fr) Method, device and system for recording data for virtual production

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 17932331; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 17932331; Country of ref document: EP; Kind code of ref document: A1