EP3281058A1 - Appareil d'affichage de réalité virtuelle et procédé d'affichage associé - Google Patents

Appareil d'affichage de réalité virtuelle et procédé d'affichage associé

Info

Publication number
EP3281058A1
Authority
EP
European Patent Office
Prior art keywords
virtual reality
user
display apparatus
image
object information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP16842274.9A
Other languages
German (de)
English (en)
Other versions
EP3281058A4 (fr)
Inventor
Weiming Li
Do-Wan Kim
Jae-Yun Jeong
Yong-Gyoo Kim
Gengyu Ma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority claimed from PCT/KR2016/009711 (WO2017039308A1)
Publication of EP3281058A1
Publication of EP3281058A4

Classifications

    • G06T 19/006: Mixed reality (manipulating 3D models or images for computer graphics)
    • G02B 27/0101: Head-up displays characterised by optical features
    • G02B 27/017: Head-up displays, head mounted
    • G06F 3/011: Input arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T 7/70: Image analysis; determining position or orientation of objects or cameras
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344: Displays with head-mounted left-right displays
    • H04N 13/398: Synchronisation or control of stereoscopic image reproducers
    • G02B 2027/0132: Head-up displays comprising binocular systems
    • G02B 2027/0134: Head-up displays comprising binocular systems of stereoscopic type
    • G02B 2027/0138: Head-up displays comprising image capture systems, e.g. camera

Definitions

  • Apparatuses and methods consistent with exemplary embodiments relate to virtual reality or augmented reality.
  • a representative example of a virtual reality apparatus is a head-mounted display apparatus, which is also referred to as virtual reality glasses.
  • a head-mounted display apparatus generates and displays a virtual reality image
  • a user wears a virtual reality display apparatus and sees the generated virtual reality image.
  • the user may not be able to see an actual surrounding environment or an actual object while seeing the virtual reality image through the virtual reality display apparatus.
  • such a case may include an occurrence of a dangerous situation in a surrounding environment, an ingestion of food and drink, or the like.
  • such an interruption may decrease the user's sense of being immersed in the virtual environment.
  • One or more exemplary embodiments provide a virtual reality display apparatus and a display method thereof.
  • a virtual reality display apparatus that may be more convenient and enhance a sense of immersion and a display method thereof may be provided.
  • FIG. 1 illustrates an example of using a virtual reality apparatus
  • FIGS. 2A and 2B are block diagrams showing an internal configuration of a virtual reality display apparatus according to various exemplary embodiments
  • FIG. 3 is a flowchart showing a display method of a virtual reality display apparatus according to an exemplary embodiment
  • FIG. 4 is a flowchart showing a method of displaying a physical keyboard in a virtual reality display apparatus according to an exemplary embodiment
  • FIG. 5 illustrates an example of requiring a virtual reality display apparatus to display a physical keyboard to a user
  • FIG. 6 illustrates a screen for inducing a user to rotate in a direction of a keyboard according to an exemplary embodiment
  • FIGS. 7A, 7B, 7C, and 7D illustrate a binocular view of a physical keyboard in a virtual reality display apparatus according to an exemplary embodiment
  • FIGS. 8A, 8B, 8C, and 8D illustrate a physical keyboard in virtual reality according to an exemplary embodiment
  • FIG. 9 is a flowchart showing a method of displaying food in virtual reality by a virtual reality display apparatus according to an exemplary embodiment
  • FIG. 10 illustrates a button according to an exemplary embodiment
  • FIG. 11 illustrates a framing operation according to an exemplary embodiment
  • FIG. 12 illustrates a screen for selecting an object to be displayed to a user according to an exemplary embodiment
  • FIGS. 13A and 13B illustrate a method of avoiding interference between virtual reality and an actual object according to an exemplary embodiment
  • FIG. 14 illustrates a method of deleting an actual object displayed in virtual reality according to an exemplary embodiment
  • FIG. 15 illustrates a method of displaying a display item in a virtual reality display apparatus according to an exemplary embodiment
  • FIG. 16 illustrates a method of displaying a screen of an external apparatus in a virtual reality display apparatus according to an exemplary embodiment.
  • one or more exemplary embodiments provide a virtual reality display apparatus that may be more convenient and enhance a sense of immersion and a display method thereof.
  • a display method of a virtual reality display apparatus including: displaying a virtual reality image; acquiring object information regarding a real-world object based on a binocular view of a user; and displaying the acquired object information together with the virtual reality image.
  • a virtual reality display apparatus including: an object information acquisition unit configured to acquire object information regarding a real-world object based on a binocular view of a user; a display configured to display a virtual reality image and the acquired object information; and a controller configured to control the object information acquisition unit and the display to respectively acquire the object information and display the acquired object information together with the virtual reality image.
  • a virtual reality headset including: a camera configured to capture a real-world object around a user; a display configured to display a virtual reality image; and a processor configured to determine whether to display the real-world object together with the virtual reality image based on a correlation between a graphic user interface displayed on the display and a functionality of the real-world object.
  • the processor may be further configured to determine to overlay the real-world object on the virtual reality image in response to determining that the graphic user interface prompts the user to input data and the real-world object is an input device.
  • the processor may be further configured to determine to display the real-world object together with the virtual reality image in response to a type of the real-world object matching one of a plurality of predetermined types and a current time being within a predetermined time range.
  • the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
  • a virtual view refers to a view which a user sees in a virtual reality apparatus.
  • a binocular view refers to a view which the two eyes of a user who uses a virtual reality apparatus see.
  • FIG. 1 is a view showing an example of using a virtual reality apparatus.
  • a virtual reality display apparatus 100 provides a user 110 with an image 120 of a virtual space different from a real space in which the user 110 is located.
  • the virtual reality display apparatus 100 may display the image 120 according to movement of the user 110.
  • the user 110 may move his or her entire body or just his or her head.
  • the virtual reality display apparatus 100 may display another image according to the movement of the user 110.
  • the virtual reality display apparatus 100 may be called a head-mounted display, a headset, virtual reality glasses, or the like.
  • FIG. 2A is a block diagram showing an internal configuration of a virtual reality display apparatus according to an exemplary embodiment.
  • a virtual reality display apparatus 200 may include an object information acquisition unit 210, a display 220, and a controller 230.
  • the object information acquisition unit 210 and the controller 230 may be implemented by one or more processors.
  • the object information acquisition unit 210 acquires object information regarding a real-world object on the basis of a binocular view of a user.
  • the object information acquisition unit 210 may include at least one of a sensor 211, a communication interface 212, and an imaging apparatus 213.
  • the sensor 211 may include various kinds of sensors capable of sensing external information, such as a motion sensor, a proximity sensor, a location sensor, an acoustic sensor, or the like, and may acquire object information through a sensing operation.
  • the communication interface 212 may be connected with a network via wired or wireless communication to receive data through communication with an external apparatus and acquire object information.
  • the communication interface may include a communication module, a mobile communication module, a wired/wireless Internet module, etc.
  • the communication interface 212 may also include one or more elements.
  • the imaging apparatus 213 may capture an image to acquire the object information.
  • the imaging apparatus 213 may include a camera, a video camera, a depth camera, or the like, and may include a plurality of cameras.
  • the display 220 displays virtual reality and the acquired object information.
  • the display 220 may display only the virtual reality or display the virtual reality and the acquired object information together according to control of the controller 230.
  • the controller 230 may acquire the object information and display the acquired object information together with the virtual reality by controlling an overall operation of the virtual reality display apparatus 200.
  • the controller 230 may control the display 220 to display object information at a location corresponding to an actual location of the object.
  • the controller 230 may include a random access memory (RAM) that stores signals or data received from an outside of the virtual reality display apparatus 200 or that is used as a storage area corresponding to various tasks performed by an electronic apparatus, a read-only memory (ROM) that stores a control program for controlling peripheral devices, and a processor.
  • the processor may be implemented as a system on chip (SoC) that integrates a core and a graphics processing unit (GPU).
  • the processor may include a plurality of processors.
  • the processor may also include a GPU.
  • the controller 230 may acquire object information by controlling the object information acquisition unit 210 to collect data regarding a real-world object. Also, the controller 230 may control the display 220 to process data associated with virtual reality and object information to generate an image and display the generated image.
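As an illustration of the control flow described above, the following Python sketch shows how a controller of this kind might composite object information into each virtual reality frame. It is a minimal sketch, not the patent's implementation; every class, method, and field name is a hypothetical stand-in.

```python
# Hypothetical sketch of the controller-230 control flow: acquire object
# information, decide whether to show it, and composite it into the frame.
from dataclasses import dataclass

import numpy as np


@dataclass
class ObjectInfo:
    image: np.ndarray     # binocular-view image of the real-world object
    location: np.ndarray  # 3D location of the object relative to the user
    label: str            # e.g., "keyboard", "food"


class Controller:
    def __init__(self, acquisition_unit, display):
        self.acquisition_unit = acquisition_unit  # sensor/camera/comm wrapper
        self.display = display

    def tick(self, vr_frame: np.ndarray) -> None:
        """Render one frame: VR only, or VR composited with object info."""
        info = self.acquisition_unit.acquire()    # may return None
        if info is not None and self.should_show(info):
            # Place the object at the screen position corresponding to its
            # actual location, so the user keeps a correct spatial sense.
            self.display.composite(vr_frame, info.image, info.location)
        else:
            self.display.show(vr_frame)

    def should_show(self, info: ObjectInfo) -> bool:
        # Placeholder policy; the patent enumerates concrete triggers
        # (user input, proximity, application needs, scheduled times).
        return True
```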
  • the virtual reality display apparatus 200 may include a sensor 211, a communication interface 212, a camera 213, a display 220, and a processor 230, as shown in FIG. 2B.
  • the processor 230 may include all of the features of the controller 230 illustrated in FIG. 2A.
  • the camera 213 may include all of the features of the imaging apparatus 213 illustrated in FIG. 2A.
  • the camera 213 may capture images of real-world objects, and the processor 230 may perform image processing of the real-world objects.
  • the configuration of the virtual reality display apparatus 200 according to an exemplary embodiment has been described thus far.
  • a display method of the virtual reality display apparatus 200 will be described in greater detail below.
  • FIG. 3 is a flowchart showing a display method of a virtual reality display apparatus according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may display virtual reality to a user according to a virtual view.
  • a virtual view refers to a view which the user sees in the virtual reality apparatus.
  • the virtual reality display apparatus 200 provides the user with an image of a virtual space different from a real space in which the user is located as virtual reality.
  • the virtual reality display apparatus 200 acquires object information regarding a real-world object on the basis of a binocular view of the user.
  • a binocular view refers to a view which the two eyes of the user who uses the virtual reality apparatus see. A person perceives a spatial sense through the view of his or her two eyes.
  • the virtual reality display apparatus 200 may acquire object information regarding a real-world object on the basis of a binocular view of the user in order to provide the user with a spatial sense regarding the object.
  • the object information may include an image of the real-world object.
  • the object information may include depth information of the object and information regarding a location and posture of the object in three-dimensional (3D) space.
  • the virtual reality display apparatus 200 may display the object in virtual reality using the acquired object information, and thus may provide the user with the same experience as that of actually showing the object to the user.
  • the object may be an object that is configured in advance according to attributes or an application scenario of the object and may include at least one of: an object in the vicinity of the user, an object with a predetermined label, an object designated by the user, an object that an application running in the virtual reality display apparatus needs to use, and an object required for performing control of the virtual reality display apparatus.
  • the virtual reality display apparatus 200 may capture an image of the object using the imaging apparatus 213, acquire a different-view image of the object on the basis of the captured image, and acquire a binocular-view image of the object on the basis of the captured image and the different-view image of the object.
  • the virtual reality display apparatus 200 may perform viewpoint correction on the captured image and the acquired different-view image of the object on the basis of a location relationship between the imaging apparatus 213 and the eyes of the user.
  • an image of a real-world object may be acquired by a single imaging apparatus, and a binocular-view image for the object may be acquired on the basis of the captured image.
  • the single imaging apparatus may be a general imaging apparatus having a single view. Since an image captured using the single imaging apparatus does not have depth information, a different-view image of the real-world object may be acquired from the captured image.
  • a binocular-view image of the real-world object may be acquired on the basis of the captured image and the different-view image of the real-world object.
  • the image of the real-world object may be an image of an area where the real-world object is located in an entire captured image.
  • Various image recognition methods may be used to detect an image of an actual object from the captured image.
  • a binocular-view image of a real-world object may also be acquired on the basis of a stereo image having depth information.
  • the imaging apparatus 213 may include a depth camera or at least two single-view cameras.
  • the at least two single-view cameras may be configured to have overlapping fields-of-view.
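Where two single-view cameras with overlapping fields-of-view are used, depth can be recovered from their disparity. The following is a minimal sketch assuming calibrated and rectified cameras; the file names, focal length, and baseline are placeholder assumptions, while the OpenCV calls are real.

```python
# Stereo depth from two rectified single-view cameras with overlapping views.
# depth = focal_length_px * baseline_m / disparity_px
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical files
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# StereoBM returns fixed-point disparity scaled by 16.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

FOCAL_PX = 700.0    # focal length in pixels (from calibration; assumed)
BASELINE_M = 0.06   # camera separation in meters (assumed)

valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]   # meters
```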
  • a single imaging apparatus, a depth camera, or a single-view camera may be an internal imaging apparatus of the virtual reality display apparatus 200 or may be an external apparatus connected to the virtual reality display apparatus 200, for example, a camera of another apparatus.
  • the virtual reality display apparatus 200 may widen an imaging angle of view in order to capture an image including the candidate object.
  • the virtual reality display apparatus 200 may direct the user to rotate in a direction toward the candidate object to capture an image including the candidate object.
  • the user may be guided to move in the direction toward the candidate object through images, text, audio, or video.
  • the user may be guided to rotate in the direction toward the candidate object on the basis of a pre-stored 3D space location of the candidate object and a 3D space location of the candidate object acquired by a positioning apparatus.
  • the virtual reality display apparatus 200 may determine whether object information needs to be displayed to a user and acquire the object information when it is determined that the object information needs to be displayed to the user. In particular, the virtual reality display apparatus 200 may determine that the object information needs to be displayed to the user in at least one of the following cases: when a user input to display the object information is received; when it is determined that the object information is set to be displayed to the user; when a control command requiring the object to perform a specific operation is detected on an application interface in virtual reality; when a body part of the user is detected close to the object; when a body part of the user moving in a direction of the object is detected; when it is determined that an application running in the virtual reality display apparatus 200 needs to immediately use the object information; or when it is determined that a time set to interact with the object in the vicinity of the user is reached.
  • a user input to display the object information may be performed by at least one of a touch screen input, a physical button input, a remote control command, voice control, a gesture, a head movement, a body movement, an eye movement, and a holding operation.
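The display triggers enumerated above reduce to a simple any-of policy. The sketch below is illustrative only; the trigger names paraphrase the conditions listed in this section and are not taken from the patent.

```python
# Hypothetical any-of policy over the display triggers listed above.
from enum import Enum, auto


class Trigger(Enum):
    USER_INPUT = auto()           # touch, button, voice, gesture, gaze, ...
    CONFIGURED_TO_DISPLAY = auto()
    APP_CONTROL_COMMAND = auto()  # app interface requires the object
    BODY_NEAR_OBJECT = auto()
    BODY_MOVING_TOWARD = auto()
    APP_NEEDS_OBJECT_NOW = auto()
    SCHEDULED_INTERACTION = auto()


def object_info_needed(active_triggers: set[Trigger]) -> bool:
    # Any single trigger suffices ("in at least one of the following cases").
    return bool(active_triggers)
```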
  • the virtual reality display apparatus 200 may acquire at least one of a notice that an event has occurred and details of the event from an external apparatus.
  • the virtual reality display apparatus 200 may acquire a display item from an Internet of Things (IoT) device and may display the acquired display item.
  • the display item may include at least one of a manipulation interface, a manipulation status, notice information, and instruction information.
  • the notice information may be text, audio, a video, an image, or other information.
  • the notice information may be text information regarding a missed call.
  • when the IoT device is an access control device, the notice information may be a captured monitoring image.
  • the instruction information may be text, audio, a video, or an image used to instruct the user to search for an IoT device.
  • when the instruction information is an arrow sign, the user may acquire a location of an IoT device associated with the user by following the direction indicated by the arrow.
  • the instruction information may be text that indicates a location relationship between the user and the IoT device (e.g., a communication device is 2 meters ahead).
  • the virtual reality display apparatus 200 may acquire a display item of an IoT device in the following processing method.
  • for example, the virtual reality display apparatus 200 may capture an image of the IoT device and detect a display item of the IoT device from the captured image; receive the display item of the IoT device from the IoT device, whether the device is inside or outside a field-of-view of the user; or detect a location of an IoT device outside the field-of-view of the user through its relationship with the virtual reality display apparatus 200 and acquire the detected location as instruction information.
  • the virtual reality display apparatus 200 may remotely control the IoT device to perform a process corresponding to a manipulation of the user.
  • when a user wears the virtual reality display apparatus 200, the user may acquire information regarding nearby IoT devices. Also, the user may use the virtual reality display apparatus 200 to remotely control an IoT device to perform a process corresponding to a manipulation of the user.
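How such remote control is carried out is device-specific; the patent does not fix a protocol. As a hedged sketch, an HTTP-controllable device might be driven as follows, where the host, endpoint path, and action payload are all assumptions.

```python
# Hypothetical sketch: forward a user manipulation made in VR to an IoT
# device over HTTP. Real devices may instead expose MQTT, CoAP, etc.
import requests


def forward_manipulation(device_host: str, action: str) -> bool:
    resp = requests.post(
        f"http://{device_host}/api/actions",   # hypothetical endpoint
        json={"action": action},
        timeout=2.0,
    )
    return resp.ok


# e.g., the user presses a virtual "answer call" button rendered in VR:
# forward_manipulation("192.168.0.42", "answer_call")
```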
  • the virtual reality display apparatus 200 may determine whether to provide the object information to a user on the basis of at least one of importance and urgency of reality information.
  • the virtual reality display apparatus 200 may display the acquired object information to the user together with the virtual reality.
  • the virtual reality display apparatus 200 may display the object information at a location corresponding to an actual location of the object.
  • the user may see object information regarding a real-world object in a virtual reality image.
  • the user may see the real-world object in the virtual reality image.
  • the virtual reality display apparatus 200 may adjust a display method of at least one of the virtual reality image and the object information.
  • the virtual reality and the object information may be displayed to overlap each other. That is, the object information and the virtual reality image displayed to the user may be spatially combined and displayed. In this case, the user may interoperate with a real-world object which requires feedback in a general virtual reality image of the virtual reality display apparatus 200.
  • the virtual reality image displayed by the virtual reality display apparatus 200 may be an image that is displayed to a user according to a virtual view of the user in an application running in the virtual reality display apparatus 200.
  • the virtual reality image displayed to the user may be an image according to a virtual view of the user in the game.
  • the virtual reality image may reflect a virtual film screen scene displayed to the user according to the virtual view of the user.
  • the virtual reality display apparatus 200 may select one of the following methods to display the acquired object information together with the virtual reality image. That is, the virtual reality display apparatus 200 may spatially combine and display the virtual reality image and the object information, display the object information in the virtual reality image through picture-in-picture (PIP), or display the object information over the virtual reality through PIP.
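The two display modes can be sketched as simple image compositing, shown below with NumPy only. The frame shapes, inset scale, and alpha value are assumptions: picture-in-picture places the object image in a screen corner, while the spatial overlay writes it at the screen position corresponding to the object's actual location.

```python
# Hypothetical compositing of object information into a VR frame.
import numpy as np


def composite_pip(vr: np.ndarray, obj: np.ndarray, scale: float = 0.25) -> np.ndarray:
    """Picture-in-picture: shrink the object image into the bottom-right corner."""
    out = vr.copy()
    h, w = int(vr.shape[0] * scale), int(vr.shape[1] * scale)
    # Nearest-neighbor resize via index sampling.
    ys = np.linspace(0, obj.shape[0] - 1, h).astype(int)
    xs = np.linspace(0, obj.shape[1] - 1, w).astype(int)
    out[-h:, -w:] = obj[ys][:, xs]
    return out


def composite_overlay(vr: np.ndarray, obj: np.ndarray,
                      top_left: tuple[int, int], alpha: float = 1.0) -> np.ndarray:
    """Spatial overlay at the position matching the object's real location.

    alpha < 1 gives the translucent display mentioned below, so a virtual
    object occluded by the real one stays visible.
    """
    y, x = top_left
    out = vr.copy().astype(np.float32)
    h, w = obj.shape[:2]
    out[y:y + h, x:x + w] = alpha * obj + (1 - alpha) * out[y:y + h, x:x + w]
    return out.astype(vr.dtype)
```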
  • the object information may be displayed using at least one of translucency, an outline, and a 3D grid line. For example, when a virtual object and the object information in the virtual reality image obscure each other in a 3D space, the user is not hindered from seeing the virtual object in the virtual reality image, because the shading of the virtual object is decreased and the object information is displayed using at least one of translucency, an outline, and a 3D grid line.
  • the virtual object may be enlarged or reduced and/or shifted.
  • the virtual reality display apparatus 200 may determine a situation in which the virtual object and the object information in the virtual reality image obscure each other in a 3D space and may adjust a display method of the virtual object or the object information. Furthermore, it is possible to adjust the display method of the virtual object or the object information according to an input of the user.
  • the display 220 may display the virtual reality image without the object information.
  • the display 220 may display the virtual reality image without the object information when at least one of the following events occurs: a user input for preventing display of the object information is received; the controller 230 determines that the object information is set not to be displayed to the user; the controller 230 does not detect a control command requiring the object information to perform a specific operation on an application interface in the virtual reality; the distance between a body part of the user and the object corresponding to the object information is greater than a predetermined distance; a body part of the user is moving in a direction away from the object corresponding to the object information; the controller 230 determines that an application running in the virtual reality display apparatus 200 does not need to use the object information; or the controller 230 does not receive, for a predetermined time, a user input that requires an operation using the object information.
  • the user input for preventing the display of the object information may be performed by at least one of a touch screen input, a physical button input, a remote control command, voice control, a gesture, a head movement, a body movement, an eye movement, and a holding operation.
  • the virtual reality display apparatus 200 may allow the user to smoothly experience virtual reality by adjusting a display method of a virtual object or the object information or by deleting object information and displaying the virtual reality.
  • when the virtual reality display apparatus 200 acquires at least one of a notice that an event has occurred and details of the event from an external apparatus, the virtual reality display apparatus 200 may display a location of the external apparatus.
  • the virtual reality display apparatus 200 may determine a method of displaying the object information on the basis of at least one of importance and urgency of reality information, and may display the object information to the user according to the determined display method.
  • the virtual reality display apparatus 200 may determine a display priority to determine the display method.
  • a display priority list may be predetermined, and the virtual reality display apparatus 200 may classify display priorities of the virtual object and the real-world object in the virtual reality according to importance and urgency.
  • the display priority list may be automatically set by the virtual reality display apparatus 200 or may be set by the user according to a pattern of use.
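A predetermined display priority list of the kind described above might look like the following sketch, where the object types and the importance/urgency scores are illustrative assumptions rather than values from the patent.

```python
# Hypothetical display-priority list keyed by (importance, urgency).
PRIORITY = {
    "fire_alarm": (10, 10),
    "incoming_person": (7, 8),
    "keyboard": (5, 3),
    "virtual_object": (4, 1),
    "food": (3, 2),
}


def display_order(objects: list[str]) -> list[str]:
    """Sort so the most important and urgent items are drawn most prominently."""
    return sorted(objects, key=lambda o: PRIORITY.get(o, (0, 0)), reverse=True)


# display_order(["food", "keyboard", "fire_alarm"])
# -> ["fire_alarm", "keyboard", "food"]
```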
  • a method of displaying a physical keyboard in the virtual reality display apparatus 200 will be described below with reference to FIGS. 4 to 7 according to an exemplary embodiment.
  • FIG. 4 is a flowchart showing a method of displaying a physical keyboard in the virtual reality display apparatus 200 according to an exemplary embodiment.
  • the virtual reality display apparatus 200 determines whether a physical keyboard in the vicinity of a user needs to be displayed to the user.
  • the virtual reality display apparatus 200 may determine that the physical keyboard in the vicinity of the user needs to be displayed to the user.
  • the virtual reality display apparatus 200 may detect, according to attribute information of a control command of the application interface in the virtual reality, that the corresponding control command needs to use an interactive device to perform a specific operation.
  • the virtual reality display apparatus 200 may determine that an interactive device in the vicinity of the user needs to be displayed.
  • the physical keyboard may be configured as the interactive device to be displayed to the user. This will be described below with reference to FIG. 5.
  • FIG. 5 is a view showing an example of requiring the virtual reality display apparatus 200 to display a physical keyboard to a user.
  • a dialog box 520 is displayed to instruct a user to enter text information into the virtual reality display apparatus 200.
  • the controller 230 may analyze attribute information of a control command of an application interface that instructs the dialog box 520 to be displayed, and may determine that the control command requires the physical keyboard to receive the text information. For example, when the controller 230 receives a control command that enables the display 220 to display an input field (e.g., an input field to enter a user name) and/or a selection of inputs (an "OK" button and a "Cancel" button), the controller 230 may determine that input devices (e.g., a mouse, a keyboard, etc.) or interactive devices (e.g., a touchpad) are candidate real-world objects. Accordingly, when the dialog box 520 is displayed, the virtual reality display apparatus 200 may determine that the physical keyboard needs to be displayed.
  • the physical keyboard has been described as an input device to be displayed to the user.
  • various devices may be determined as the input device to be displayed to the user according to an application.
  • for example, when the application that is currently running in the virtual reality display apparatus 200 is a virtual game application, a joystick or mouse in addition to the physical keyboard may be the input device to be displayed to the user.
  • the input device determined to be displayed to the user may be added to and managed in a list of objects to be displayed for future use.
  • the virtual reality display apparatus 200 may determine that an input device in the vicinity of a user needs to be displayed. Furthermore, when the virtual reality display apparatus 200 receives the user input to prevent the object information from being displayed, the virtual reality display apparatus 200 may display the virtual reality except for the interactive device in the vicinity of the user displayed by the virtual reality display apparatus 200.
  • the user input to display the object information may be at least one of a touch screen input, a physical button input, a remote control command, voice control, a gesture, a head movement, a body movement, an eye movement, and a holding operation.
  • the touch screen input or the physical button input may be an input using a touch screen or a physical button provided in the virtual reality display apparatus 200.
  • the remote control command may be a control command received from a physical button disposed at another device (e.g., such as a handle) that may remotely control the virtual reality display apparatus 200.
  • the virtual reality display apparatus 200 may determine that a physical keyboard in the vicinity of a user needs to be displayed to the user.
  • when the virtual reality display apparatus 200 detects an input event of a physical button B, the virtual reality display apparatus 200 may determine that the physical keyboard in the vicinity of the user does not need to be displayed to the user. Also, it is possible to toggle between displaying and not displaying the physical keyboard through one physical button.
  • the virtual reality display apparatus 200 may detect a user gesture that instructs the controller 230 to display the physical keyboard on the display 220 and may determine whether the physical keyboard needs to be displayed to the user. For example, when the virtual reality display apparatus 200 detects a gesture A used to indicate that the physical keyboard needs to be displayed, the virtual reality display apparatus 200 may determine that the physical keyboard needs to be displayed. When the virtual reality display apparatus 200 detects a gesture B used to indicate that the physical keyboard does not need to be displayed, the virtual reality display apparatus 200 may determine to not display the physical keyboard. In addition, it is possible to switch to display or not display the physical keyboard through the same gesture.
  • the virtual reality display apparatus 200 may detect, through the imaging apparatus 213, a head movement, a body movement, or an eye movement of the user that instructs display of the physical keyboard, and may determine whether the physical keyboard needs to be displayed to the user.
  • the virtual reality display apparatus 200 may detect a head rotation or a line-of-sight of the user and may determine whether the physical keyboard needs to be displayed to the user.
  • when a condition A is met, the virtual reality display apparatus 200 may determine that the physical keyboard needs to be displayed to the user.
  • when a condition B is met (e.g., a case in which the user sees a virtual object or a virtual film screen in virtual reality), the virtual reality display apparatus 200 may determine that the physical keyboard does not need to be displayed to the user.
  • the condition A and the condition B may or may not be complementary to each other.
  • the virtual reality display apparatus 200 may determine that the physical keyboard needs to be displayed to the user. For example, the virtual reality display apparatus 200 detects whether the user's hand is in the vicinity of the user, whether a keyboard is in the vicinity of the user, or whether the user's hand is on the keyboard (e.g., whether a skin color is detected) through the imaging apparatus 213. When all of the above three conditions are met, the virtual reality display apparatus 200 may determine that the physical keyboard needs to be displayed to the user. When any one of the above three conditions is not met, the virtual reality display apparatus 200 may determine that the physical keyboard does not need to be displayed to the user.
  • a condition of whether a user's hand is in the vicinity of the user and a condition of whether a keyboard is in the vicinity of a user may be determined simultaneously or sequentially, and their order is not limited.
  • the virtual reality display apparatus 200 may determine whether the user's hand is on the keyboard.
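The three-condition test above can be expressed compactly. In the sketch below the detector callables stand in for unspecified image-processing routines (e.g., skin-color detection); all names are hypothetical.

```python
# Hypothetical all-of test for showing the physical keyboard.
import numpy as np


def keyboard_should_be_shown(frame: np.ndarray,
                             hand_nearby,        # callable: frame -> bool
                             keyboard_nearby,    # callable: frame -> bool
                             hand_on_keyboard,   # callable: frame -> bool
                             ) -> bool:
    # All three conditions must hold; the first two may be evaluated in
    # either order (the patent notes their order is not limited).
    return (hand_nearby(frame)
            and keyboard_nearby(frame)
            and hand_on_keyboard(frame))
```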
  • In step 410, when the virtual reality display apparatus 200 determines that the physical keyboard needs to be displayed to the user, the virtual reality display apparatus 200 proceeds to step 420 and captures an image of the physical keyboard.
  • the virtual reality display apparatus 200 may capture a user vicinity image using the imaging apparatus 213, detect a physical keyboard image from the captured image, and capture the physical keyboard image.
  • the virtual reality display apparatus 200 may detect a feature point in the captured image, compare the detected feature point with a pre-stored feature point of the keyboard image, and detect the physical keyboard image. For example, coordinates of four corners of the physical keyboard may be determined according to the pre-stored feature point of the physical keyboard image and a coordinate of a feature point in the captured image matching a coordinate of the pre-stored feature point of the physical keyboard image. Subsequently, an outline of the physical keyboard may be determined according to the coordinates of the four corners in the captured image. As a result, the virtual reality display apparatus 200 may determine a keyboard image in the captured image.
  • the feature point may be a scale-invariant feature transform (SIFT) feature point or another kind of feature point.
  • a coordinate of a point of an outline of any object (that is, a point on an outline of an object) in the captured image may be calculated in the same or a similar manner.
  • the keyboard image may also be detected from the captured image by another method.
  • Let P_world denote the coordinate of a feature point of the pre-stored keyboard image, and P_corner the coordinate of the upper-left corner on the outline of the pre-stored keyboard image (both in the local coordinate system of the keyboard).
  • Let P_image denote the coordinate of the feature point in the captured image that matches the feature point in the pre-stored keyboard image.
  • the transform from the local coordinate system of the keyboard to the coordinate system of the imaging apparatus 213 is described by a rotation R and a shift t, and the projection matrix of the imaging apparatus 213 is referred to as K. Equation 1 may then be written as follows:

    P_image = K * (R * P_world + t)    (Equation 1)

  • the coordinates of the feature points in the pre-stored keyboard image and the coordinates of the matching feature points in the captured image are substituted into Equation 1 to obtain R and t.
  • the coordinate of the upper-left corner of the keyboard in the captured image may then be obtained as K * (R * P_corner + t), and the coordinates of the other three corners of the keyboard in the captured image may be obtained in the same way.
  • the outline of the keyboard in the captured image may be acquired by connecting the corners. Accordingly, the virtual reality display apparatus 200 may also calculate the coordinate of an outline point of any object in the captured image in order to acquire the outline onto which the object is projected.
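Equation 1 and the corner projection can be worked through with standard computer vision routines. The following sketch uses OpenCV's PnP solver to recover R and t from matched feature points and then projects four assumed corner coordinates into the captured image; the intrinsics and point data are placeholders, not values from the patent.

```python
# Recover R, t from 3D-2D feature matches, then project the keyboard's
# corners into the captured image as K * (R * P_corner + t).
import cv2
import numpy as np

K = np.array([[700.0, 0, 320],
              [0, 700.0, 240],
              [0, 0, 1]])                       # assumed camera intrinsics

# In practice these come from feature matching (e.g., SIFT); placeholders here.
P_world = np.random.rand(20, 3)                 # stored keyboard points (local frame)
P_image = np.random.rand(20, 2) * [640, 480]    # matched points in the captured image

ok, rvec, t = cv2.solvePnP(P_world, P_image, K, distCoeffs=None)
R, _ = cv2.Rodrigues(rvec)                      # rotation vector -> 3x3 matrix

# Keyboard corners in the local frame (assumed 45 cm x 15 cm plane):
corners = np.array([[0, 0, 0], [0.45, 0, 0], [0.45, 0.15, 0], [0, 0.15, 0]])
cam = (R @ corners.T + t).T                     # corners in the camera frame
proj = (K @ cam.T).T
outline = proj[:, :2] / proj[:, 2:3]            # pixel coordinates of the outline
```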
  • the virtual reality display apparatus 200 may enlarge an imaging angle-of-view and capture a larger image than the previously captured image in order to detect a physical keyboard from the newly captured image (e.g., using a wide-angle imaging apparatus). Also, the virtual reality display apparatus 200 may instruct the user to rotate in a direction of the physical keyboard in order to recapture an image including the physical keyboard. This will be described below with reference to FIG. 6.
  • FIG. 6 is a view showing a screen for inducing a user to rotate in a direction of a keyboard according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may overlay a direction indicating image 620 in a virtual reality image 610 in order to instruct the user to change his/her line-of-sight in a direction of a physical keyboard.
  • the direction indicating image 620 may include images of an arrow, a finger, etc.
  • the direction indicating image 620 is shown using an arrow.
  • the virtual reality display apparatus 200 may also determine a location of the physical keyboard according to location information that is detected from an image previously captured and stored in the memory or that is detected in a wireless positioning method (e.g., Bluetooth transmission, a radio-frequency identification (RFID) label, infrared rays, ultrasonic waves, a magnetic field, etc.).
  • the virtual reality display apparatus 200 acquires a different-view image of the physical keyboard and a binocular-view image on the basis of the captured physical keyboard image.
  • the virtual reality display apparatus 200 may perform viewpoint correction on the captured physical keyboard image and the acquired different-view image of the physical keyboard on the basis of a location relationship between the user's eye and the imaging apparatus 213.
  • the virtual reality display apparatus 200 may perform a homography transform on the detected physical keyboard image according to a rotation and shift relationship between a coordinate system of the user's eye and a coordinate system of the imaging apparatus 213 in order to acquire the binocular-view image of the physical keyboard.
  • the rotation and shift relationship between the coordinate system of the user's eye and the coordinate system of the imaging apparatus 213 may be determined offline or by reading data provided by a manufacturer.
  • the virtual reality display apparatus 200 may acquire the different-view image of the physical keyboard on the basis of the captured physical keyboard image. Subsequently, the virtual reality display apparatus 200 may perform viewpoint correction on the captured physical keyboard image and the different-view image of the physical keyboard on the basis of a location relationship between the user's eye and the single imaging apparatus 213 to acquire the binocular-view image of the physical keyboard.
  • when the imaging apparatus 213 is a single-view imaging apparatus, the captured physical keyboard image has only one view. Accordingly, a method of transforming the physical keyboard image into a stereo image together with depth information is needed.
  • the virtual reality display apparatus 200 may acquire a physical keyboard image from another view by performing a calculation on the basis of the physical keyboard image from the current view to acquire the stereo image.
  • the virtual reality display apparatus 200 may use a planar rectangle to generate a model for the physical keyboard.
  • a location and posture of the physical keyboard in a 3D coordinate system of the single-view imaging apparatus may be acquired on the basis of a homography transformation relationship.
  • the physical keyboard may be projected on a field-of-view of the user's left eye and a field-of-view of the user's right eye.
  • a binocular view of the user displayed in the virtual reality may be formed to have a stereo effect and a visual cue that reflect an actual posture of the physical keyboard.
  • the virtual reality display apparatus 200 may approximate an expression form of an object with a more complicated shape using a partial planar model. Also, a similar method may be used to estimate a location and posture of the object. The virtual reality display apparatus 200 may generate a binocular view of the object through the projection.
  • a physical keyboard image from one view will be used below as an example to describe the calculation of the binocular view of the physical keyboard.
  • the virtual reality display apparatus 200 may measure in advance or acquire a 3D coordinate of a feature point of the physical keyboard (in the local coordinate system of the keyboard) by capturing a plurality of images and performing a 3D restoration using a stereo visual method.
  • the 3D coordinate of the feature point of the physical keyboard in the local coordinate system of the keyboard may be referred to as P_obj.
  • the coordinate of the feature point of the physical keyboard in the coordinate system of the imaging apparatus 213 may be referred to as P_cam.
  • the rotation and shift from the local coordinate system of the physical keyboard to the coordinate system of the imaging apparatus 213 may be referred to as R and t, respectively.
  • the rotations and shifts of the user's left eye and right eye in the coordinate system of the imaging apparatus 213 may be referred to as R_l, t_l, R_r, and t_r.
  • the projection point in a captured image corresponding to the feature point of the physical keyboard may be referred to as P_img.
  • an internal parameter matrix K of the imaging apparatus 213 may be acquired through a previous setting.
  • R and t may be acquired from the observed projection points.
  • a captured physical keyboard image I_cam may be transformed into an image I_left seen by the left eye.
  • An image of the right eye may be acquired in a method similar to the method of acquiring an image of the left eye.
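Because the keyboard is modeled as a plane, the view change from the camera to each eye is a plane-induced homography, so I_left can be obtained by warping I_cam. The sketch below assumes the plane parameters come from a pose estimate like the one above; the intrinsics, plane distance, and eye offset are placeholder assumptions.

```python
# Warp the captured keyboard image into the left-eye view.
# For a plane n^T X = d in the camera frame and a pose X_eye = R_l X + t_l,
# the induced homography is H = K_eye (R_l + t_l n^T / d) inv(K).
import cv2
import numpy as np

K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])  # camera intrinsics
K_eye = K                                                    # assume same intrinsics

n = np.array([[0.0, 0.0, 1.0]]).T   # keyboard plane normal in the camera frame
d = 0.5                             # plane distance in meters (assumed)

R_l = np.eye(3)                     # camera -> left-eye rotation (assumed)
t_l = np.array([[0.032, 0.0, 0.0]]).T   # roughly half the interpupillary distance

H = K_eye @ (R_l + (t_l @ n.T) / d) @ np.linalg.inv(K)

I_cam = cv2.imread("keyboard_view.png")                      # hypothetical input
I_left = cv2.warpPerspective(I_cam, H, (I_cam.shape[1], I_cam.shape[0]))
# The right-eye image I_right follows identically with (R_r, t_r).
```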
  • FIGS. 7A to 7D are views showing a binocular view of a physical keyboard on the basis of a physical keyboard image captured by a virtual reality display apparatus according to an exemplary embodiment.
  • the virtual reality display apparatus 200 captures a user vicinity image 710 using the imaging apparatus 213 and detects a physical keyboard image 720 in the user vicinity image 710.
  • the virtual reality display apparatus 200 may detect a location and posture of the physical keyboard in a 3D space according to a single view.
  • the virtual reality display apparatus 200 captures a nearby image 710a of the user using the imaging apparatus 213 and detects a location and posture of the physical keyboard in a 3D space in a different view 740 from a view 730.
  • the virtual reality display apparatus 200 may perform viewpoint correction on the captured image and the acquired different-view image of the object on the basis of a location relationship between the imaging apparatus 213 and the eyes of the user.
  • FIG. 7C shows a location and posture of a physical keyboard 750 in a 3D space that are detected in the different view 740.
  • the virtual reality display apparatus 200 may display a binocular view 760 of the physical keyboard acquired through the viewpoint correction in virtual reality.
  • the method of displaying a binocular view of a physical keyboard in virtual reality using a single-view imaging apparatus 213 has been described.
  • alternatively, a depth camera or at least two single-view cameras may be used as the imaging apparatus.
  • when the imaging apparatus 213 is a depth camera, a location and posture of a physical keyboard may be acquired from a relationship between a 3D image and the depth camera.
  • when the imaging apparatus 213 includes at least two single-view cameras, a location and posture of a physical keyboard may be acquired through the at least two single-view cameras.
  • the virtual reality display apparatus 200 displays an image of the physical keyboard to the user together with the virtual reality image.
  • the virtual reality display apparatus 200 may overlay the physical keyboard on the virtual reality image, or display the physical keyboard as a picture-in-picture image. This will be described with reference to FIGS. 8A to 8D.
  • FIGS. 8A to 8D illustrate a physical keyboard in virtual reality according to an exemplary embodiment.
  • the virtual reality display apparatus 200 captures a user vicinity image 810 using the imaging apparatus 213. As shown in FIG. 8B, the virtual reality display apparatus 200 acquires a physical keyboard image 820. Also, as shown in FIG. 8C, the virtual reality display apparatus 200 may display virtual reality 830 separately from the physical keyboard. Lastly, the virtual reality display apparatus 200 displays the physical keyboard in the virtual reality, as shown in FIG. 8D. According to an exemplary embodiment, the virtual reality display apparatus 200 may acquire the physical keyboard image 820 first or may display the virtual reality 830 first.
  • the virtual reality display apparatus 200 determines whether the physical keyboard needs to be continuously displayed to the user.
  • the virtual reality display apparatus 200 may determine that the physical keyboard no longer needs to be displayed to the user.
  • the virtual reality display apparatus 200 may continuously detect a keyboard input situation of the user to detect whether the use of the physical keyboard is finished.
  • for example, when a keyboard input of the user is detected, the virtual reality display apparatus 200 may determine that the user is not finished using the physical keyboard. When no keyboard input is detected for more than a predetermined time, the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard.
  • the predetermined time may be automatically set by the virtual reality display apparatus 200 or may be set by the user.
  • the predetermined time may be 5 minutes.
  • the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard. For example, when a distance between the user's hand and the physical keyboard exceeding a first threshold usage distance is detected, the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard. In an exemplary embodiment, one hand of the user may be far from the physical keyboard and the other hand may remain on the physical keyboard. Even in this case, the virtual reality display apparatus 200 may determine that the user is no longer using the physical keyboard. Accordingly, when a distance between the user's hand and the physical keyboard exceeding a second threshold usage distance is detected, the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard.
  • the first threshold usage distance and the second threshold usage distance may be the same or different.
  • the first threshold usage distance and the second threshold usage distance may be automatically set by the virtual reality display apparatus 200 or may be set by the user.
  • a method of measuring the distance between the user's hand and the physical keyboard may be set by the virtual reality display apparatus 200 or may be set by the user.
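The two usage-distance thresholds can be combined into a single test. The sketch below is one hedged reading of the rule, with the first threshold applying when both hands have left the keyboard and the second, larger one when a single hand remains; the numeric values are placeholders.

```python
# Hypothetical two-threshold "finished using keyboard" test.
FIRST_THRESHOLD_M = 0.15    # both hands away from the keyboard
SECOND_THRESHOLD_M = 0.30   # one hand may still rest on the keyboard


def finished_using_keyboard(left_dist_m: float, right_dist_m: float) -> bool:
    near, far = sorted((left_dist_m, right_dist_m))
    if near > FIRST_THRESHOLD_M:       # both hands beyond the first threshold
        return True
    return far > SECOND_THRESHOLD_M    # one hand far beyond the second threshold
```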
  • when a user input to stop displaying the physical keyboard is received, the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard.
  • the user may enter a signal for stopping the display of the physical keyboard into the virtual reality display apparatus 200 in an input method such as by pressing a specific button.
  • the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard. For example, when no control command requiring the use of the physical keyboard to perform an operation of an application interface in the virtual reality is detected, or when an application needing the physical keyboard is detected as having ended, the virtual reality display apparatus 200 may determine that the user has finished using the physical keyboard.
  • when the user switches to another application, the virtual reality display apparatus 200 may determine whether the switched-to application needs to use the physical keyboard.
  • when the virtual reality display apparatus 200 determines in step 450 that the physical keyboard needs to be continuously displayed to the user because, for example, the newly executed application also needs user inputs through the physical keyboard, the virtual reality display apparatus 200 continues to display the physical keyboard to the user.
  • when the virtual reality display apparatus 200 determines in step 450 that the physical keyboard does not need to be continuously displayed to the user, the virtual reality display apparatus 200 proceeds to step 460 and displays the virtual reality without the physical keyboard. For example, when the sensor 211 detects that the user makes a gesture of swiping left or right at a location where the physical keyboard is displayed in the virtual reality image, the controller 230 may control the display 220 to display the virtual reality image without the physical keyboard.
  • the above-described method may also be applied to a handle (e.g., interactive remote controller including various sensors) that is used when a virtual game using the virtual reality display apparatus 200 is played.
  • the virtual reality display apparatus 200 detects whether the user grabs the handle.
  • the virtual reality display apparatus 200 may display only the virtual game to the user when the user grabs the handle.
  • the virtual reality display apparatus may capture a user vicinity image through the imaging apparatus 213 and may display the handle in the captured image when the user does not grab the handle.
  • the virtual reality display apparatus 200 may detect a temperature and/or humidity around the handle and may determine whether the user grabs the handle. Generally, since a temperature around the user is lower than that of the user's body and humidity of the user's hand is higher than that around the user, the virtual reality display apparatus 200 may include a temperature sensor and/or a humidity sensor provided in the handle and may determine whether the user grabs the handle. In greater detail, the virtual reality display apparatus 200 may determine whether the user grabs the handle through a comparison of a predetermined threshold temperature and/or a threshold humidity with a measured ambient temperature and/or humidity.
  • the virtual reality display apparatus 200 may detect a movement of the handle to determine whether the user grabs the handle.
  • the virtual reality display apparatus 200 may include a motion sensor (a gyroscope, an inertia accelerometer, etc.) to determine whether the user grabs the handle through intensity of the movement, a duration, etc.
  • the virtual reality display apparatus 200 may detect electric current and/or inductance to determine whether the user grabs the handle. Since a human body is an electrical conductor containing moisture, the virtual reality display apparatus 200 may include electrodes provided on a surface of the handle and may measure electric current between the electrodes or measure inductance of each of the electrodes to determine whether the electrode is connected to the user's body.
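The grab-detection signals described in the preceding bullets can be fused with simple thresholding. The thresholds below are placeholder assumptions; any one satisfied signal group is taken to indicate that the handle is grabbed.

```python
# Hypothetical sensor-fusion grab test for the handle.
TEMP_C_MIN = 30.0       # skin-contact temperature threshold (deg C)
HUMIDITY_MIN = 60.0     # relative humidity threshold near the grip (%)
MOTION_MIN = 0.5        # sustained motion magnitude threshold (m/s^2)
CURRENT_MIN_UA = 1.0    # inter-electrode current threshold (microamps)


def handle_is_grabbed(temp_c: float, humidity: float,
                      motion: float, current_ua: float) -> bool:
    # Warm and humid grip, OR sustained motion, OR current through the skin.
    return ((temp_c >= TEMP_C_MIN and humidity >= HUMIDITY_MIN)
            or motion >= MOTION_MIN
            or current_ua >= CURRENT_MIN_UA)
```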
  • the virtual reality display apparatus 200 may display a notice that no handle is around the user.
  • the virtual reality display apparatus 200 may display a binocular view of actual objects around the user, according to the user's choice, either to allow the user to find the handle in the vicinity or to switch the state of the application such that the virtual game may be manipulated without the handle.
  • the virtual reality display apparatus 200 may determine whether the handle is located inside the actual field-of-view of the user (that is, the field-of-view of the user not wearing the virtual reality display apparatus 200). When the handle is inside the field-of-view of the user, the virtual reality display apparatus 200 may display a binocular view of the handle along with the virtual reality. When the handle is outside the field-of-view of the user, the virtual reality display apparatus 200 may display a notice that no handle is in the current field-of-view of the user. In this case, the virtual reality display apparatus 200 may instruct the user to rotate in the direction in which the handle is located such that the handle may be included in the field-of-view of the user. In an exemplary embodiment, the user may be guided through images, text, audio, or video.
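The field-of-view test and rotation guidance described above amount to comparing the bearing of the handle with the user's view direction. The following planar sketch makes assumed simplifications (a 2D floor plane, a symmetric field of view) purely for illustration.

```python
import math

def handle_guidance(user_pos, user_forward, handle_pos, fov_deg=90.0):
    """Return 'visible', 'rotate left', or 'rotate right' for the handle.

    Positions are (x, y) coordinates on the floor plane and user_forward
    is a unit vector; both are assumed simplifications of head tracking.
    """
    handle_angle = math.atan2(handle_pos[1] - user_pos[1],
                              handle_pos[0] - user_pos[0])
    forward_angle = math.atan2(user_forward[1], user_forward[0])
    # Signed angle from the view direction to the handle, wrapped to [-pi, pi].
    delta = (handle_angle - forward_angle + math.pi) % (2 * math.pi) - math.pi
    if abs(delta) <= math.radians(fov_deg / 2.0):
        return "visible"  # display a binocular view of the handle
    # Positive delta is counterclockwise, i.e., toward the user's left here.
    return "rotate left" if delta > 0 else "rotate right"
```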
  • the virtual reality display apparatus 200 may display an inducing box in the virtual reality such that the user may find the handle in the vicinity.
  • the inducing box may induce the user to adjust his or her view according to a location relationship between the handle and the user such that the user may find the handle.
  • the virtual reality display apparatus 200 may induce the user through a voice, an arrow, etc.
  • the method of displaying a real-world object using the virtual reality display apparatus 200 has been described in detail using an example thus far.
  • in this way, the virtual reality display apparatus 200 may offer greater convenience and an enhanced sense of immersion.
  • a method of eating food while wearing the virtual reality display apparatus 200 will be described below with reference to FIGS. 9 to 19.
  • FIG. 9 is a flowchart showing a method of displaying food in virtual reality by a virtual reality display apparatus according to an exemplary embodiment.
  • in step 910, the virtual reality display apparatus 200 determines whether food needs to be displayed to a user.
  • the virtual reality display apparatus 200 may determine that the food needs to be displayed to the user.
  • a button according to an exemplary embodiment will be described with reference to FIG. 10.
  • FIG. 10 is a view showing a button according to an exemplary embodiment.
  • the button may be a hardware button 1030 or 1040 included on the virtual reality display apparatus 200 or a virtual button 1020 displayed on a screen 1010 of the virtual reality display apparatus 200.
  • the virtual reality display apparatus 200 may determine that food and/or drink need to be displayed to the user.
  • the predetermined method may be at least one of a short press, a long press, a predetermined number of short presses, alternate short and long presses, etc.
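As a sketch of how such press patterns might be distinguished, the routine below classifies a sequence of press durations; the 0.8 s short/long boundary is an assumed value, not one given in the disclosure.

```python
LONG_PRESS_S = 0.8  # assumed boundary between a short press and a long press

def classify_presses(durations_s):
    """Map press durations (in seconds) to a pattern string.

    Returns e.g. 'short', 'long', 'short,short', or 'short,long',
    covering the patterns listed above (single, repeated, alternating).
    """
    return ",".join("long" if d >= LONG_PRESS_S else "short" for d in durations_s)

# classify_presses([0.2, 1.1]) -> 'short,long' (alternate short and long presses)
```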
  • the virtual reality display apparatus 200 may determine whether the object with the specific label needs to be displayed to the user. In this case, all objects needing to be displayed to the user may have the same specific label. Alternatively, other objects needing to be displayed to the user may have different kinds of labels in order to identify different kinds of objects. For example, a first kind of label may be attached to a table in order to identify the table. A second kind of label may be attached to a chair in order to identify the chair. A third kind of label may be attached to a utensil in order to identify the utensil. When the third kind of label is detected around the user, the virtual reality display apparatus 200 may determine that food needs to be displayed to the user.
  • the specific label may be recognized and sensed in various ways.
  • the virtual reality display apparatus 200 may determine that food needs to be displayed to the user.
  • the predetermined meal time may be automatically set by the virtual reality display apparatus 200 and may also be set by the user.
  • when a meal time is both automatically set by the virtual reality display apparatus 200 and set by the user, whether food needs to be displayed to the user may be determined according to priorities. For example, when the meal time set by the user has a higher priority than the meal time automatically set by the virtual reality display apparatus 200, the virtual reality display apparatus 200 may determine that the user wants to eat food only when the meal time set by the user is reached. Alternatively, it is possible to respond to both the automatically set meal time and the user-set meal time.
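A minimal sketch of this priority rule, assuming the user-set meal time always outranks the automatically set one, might look as follows; times are optional values.

```python
from datetime import time
from typing import Optional

def effective_meal_time(auto_time: Optional[time],
                        user_time: Optional[time]) -> Optional[time]:
    """Return the meal time to act on; a user-set time wins when present.

    This encodes only the 'user time has higher priority' policy above;
    responding to both times is the alternative the text also allows.
    """
    return user_time if user_time is not None else auto_time

# effective_meal_time(time(12, 0), time(13, 30)) -> 13:30 (the user-set time wins)
```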
  • the virtual reality display apparatus 200 may recognize a nearby object in order to determine the type of an actual object. When at least one of food, drink, and a utensil is detected, the virtual reality display apparatus 200 may determine that food needs to be displayed to the user. The virtual reality display apparatus 200 may use an image recognition method to detect food, drink, and a utensil. Furthermore, the virtual reality display apparatus 200 may use other methods to detect food, drink, and a utensil.
  • the virtual reality display apparatus 200 may also determine that the user wants to eat food. That is, the virtual reality display apparatus 200 may make the determination in consideration of two or more conditions.
  • the virtual reality display apparatus 200 may determine that food needs to be displayed to the user.
  • the predetermined gesture may be made by one or two hands.
  • the predetermined gesture may be at least one of waving a hand, drawing a circle, drawing a quadrangle, drawing a triangle, a framing gesture, etc.
  • the virtual reality display apparatus 200 may determine that food needs to be displayed to the user.
  • the predetermined posture may be at least one of rotating a head, leaning a body to the left, leaning a body to the right, etc.
  • FIG. 11 is a view showing a framing operation according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may determine objects included in a framing area 1120, which is displayed as a quadrangle by a framing gesture of a user 1110, as objects to be displayed to a user.
  • a gesture or posture may be detected through a gesture detection device or a posture detection device.
  • the virtual reality display apparatus 200 may determine that food needs to be displayed to the user.
  • the virtual reality display apparatus 200 may detect a remote control command that the user enters into another device and determine that food needs to be displayed to the user.
  • the other device may include at least one of a mobile terminal, a personal computer (PC), a tablet PC, an external keyboard, a wearable device, a handle, etc.
  • the wearable device may include at least one of a smart bracelet, a smart watch, etc.
  • the other device may be connected with the virtual reality display apparatus 200 in a wired or wireless manner.
  • a wireless connection may include Bluetooth, Ultra Wide Band, Zigbee, WiFi, a macro network, etc.
  • the virtual reality display apparatus 200 may determine that food needs to be displayed to the user.
  • a voice or other sound signals of the user may be collected through a microphone.
  • the virtual reality display apparatus 200 may recognize a voice command or a sound control command of the user using voice recognition technology. For example, when the user makes a voice command "Start eating," the virtual reality display apparatus 200 may receive and recognize the voice command.
  • a correspondence relationship between the voice command and a command to display food to the user may be pre-stored in the virtual reality display apparatus 200 in the form of a table.
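Such a pre-stored correspondence table can be as simple as a dictionary from recognized phrases to internal commands. The phrases and command names below are invented for illustration only.

```python
# Hypothetical command table mapping recognized utterances to actions.
VOICE_COMMANDS = {
    "start eating": "display_food",
    "stop eating": "hide_food",
}

def handle_voice_command(transcript: str):
    """Return the action mapped to a recognized utterance, or None."""
    return VOICE_COMMANDS.get(transcript.strip().lower())
```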
  • the virtual reality display apparatus 200 is not bound to a particular language, and the voice command is also not limited to the above example, but may be applied in various ways.
  • the voice command may be set by the virtual reality display apparatus 200 and may also be set by the user.
  • in step 910, when the virtual reality display apparatus 200 determines that food does not need to be displayed to the user, the virtual reality display apparatus 200 determines whether the food needs to be continuously displayed to the user.
  • in step 910, when the virtual reality display apparatus 200 determines that the food needs to be displayed to the user, the virtual reality display apparatus 200 proceeds to step 920 and determines the food to be displayed to the user.
  • the virtual reality display apparatus 200 pre-stores images of various kinds of objects (such as food) and compares a detected image of an actual object with the pre-stored images of food. When the detected image of the actual object matches a pre-stored image of food, the virtual reality display apparatus 200 determines that the actual object detected from the captured image includes the food and determines that the food detected from the captured image is an object to be displayed to the user.
  • the user may want as few as possible of the actual objects detected from the captured image to be displayed. Accordingly, when the actual objects detected from the captured image include food, the virtual reality display apparatus 200 may separate the food from the other actual objects included in the captured image, determine that only the food is the object to be displayed to the user, and not display the other actual objects to the user. Furthermore, since the relative location between the user's hand and the food may be important for accurately grabbing the food, the virtual reality display apparatus 200 may detect an image of the user's hand from the captured image according to various algorithms. When the user's hand is detected, the virtual reality display apparatus 200 may determine that the user's hand is also an object to be displayed to the user.
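One simple stand-in for the image comparison described above is normalized template matching against the pre-stored food images; the disclosure does not prescribe a specific algorithm, and the threshold below is an assumed value.

```python
import cv2

MATCH_THRESHOLD = 0.8  # assumed normalized-correlation threshold

def find_food(captured_bgr, food_templates):
    """Locate pre-stored food images inside a captured frame.

    `food_templates` maps a label to a smaller BGR template image.
    Returns a list of (label, (x, y, w, h)) matches.
    """
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    matches = []
    for label, template_bgr in food_templates.items():
        template = cv2.cvtColor(template_bgr, cv2.COLOR_BGR2GRAY)
        scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(scores)
        if max_val >= MATCH_THRESHOLD:
            h, w = template.shape
            matches.append((label, (max_loc[0], max_loc[1], w, h)))
    return matches
```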
  • the virtual reality display apparatus 200 may use at least one of a label, a gesture, a voice command, and a remote control command to determine the object to be displayed to the user. Also, as shown in FIG. 12, the virtual reality display apparatus 200 may select the object to be displayed to the user.
  • FIG. 12 is a view showing a screen for selecting an object to be displayed to a user according to an exemplary embodiment.
  • FIG. 12 shows the check box 1220 as a unit for selecting an object, but is not limited thereto. Accordingly, various units for selecting an object to be displayed to the user may be provided.
  • the virtual reality display apparatus 200 may receive a user input through a mouse.
  • the mouse may be a physical mouse and may also be a virtual mouse.
  • the user may manipulate the virtual mouse to select several objects using the check box 1220 in the screen 1210 displayed in the virtual reality display apparatus 200.
  • the virtual reality display apparatus 200 may detect the manipulation and select an object displayed to the user.
  • the virtual reality display apparatus 200 acquires a binocular-view image to be displayed to the user.
  • a user vicinity image may be captured using the imaging apparatus 213.
  • An image of food to be displayed to the user may be detected from the captured image.
  • a binocular view of food to be displayed to the user may be acquired from the detected image of food to be displayed to the user.
  • the virtual reality display apparatus 200 may display the food to the user together with the virtual reality and may delete a displayed actual object according to the user's input.
  • the virtual reality display apparatus 200 may display the food in the virtual reality such that the food may be superimposed on the virtual reality.
  • the virtual reality and the food may occlude each other in 3D space, and may be displayed in various ways in order to decrease shading and interference between them.
  • the virtual reality display apparatus 200 may decrease shading and interference between the virtual reality and the food by displaying the food by PIP (that is, displaying a binocular view of a zoomed-out actual object at a specific location of a virtual scene image), displaying only the food without displaying the virtual reality (that is, displaying only a binocular view of an actual object, as if the user were seeing the actual object through glasses), displaying the virtual reality by PIP (that is, displaying a zoomed-out virtual scene image at a specific location of a binocular view of the food), or spatially combining and displaying the binocular view of the food and the virtual reality (that is, translucently displaying a binocular view of an actual object over a virtual scene image).
  • the virtual reality display apparatus 200 may display the food in a translucent manner.
  • the virtual reality display apparatus 200 may determine whether to display the food in a translucent manner depending on a content type of an application interface displayed in the virtual reality and/or an interaction situation between the application interface and the user. For example, when the user plays a virtual game using the virtual reality display apparatus 200 or when a large amount of user interaction input and frequent shifts in the interface of the virtual game are required, the virtual reality display apparatus 200 may display the food in a translucent manner. Also, when a control frequency of a virtual movie theater or a user's input decreases in an application interface displayed in the virtual reality, the virtual reality display apparatus 200 may finish displaying the food in a translucent manner. In a similar way, the virtual reality display apparatus 200 may also display the food as an outline or a 3D grid line.
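The translucent display described above reduces, in effect, to alpha-blending the binocular view of the food over the virtual scene, with the alpha driven by how interaction-heavy the running application is. The blend factors below are assumed values.

```python
import numpy as np

def compose_views(virtual_rgb: np.ndarray, food_rgb: np.ndarray,
                  interaction_heavy: bool) -> np.ndarray:
    """Blend the food view over the virtual scene (same-shaped uint8 arrays).

    An interaction-heavy application gets a translucent food overlay so the
    virtual content stays legible; otherwise the food is shown opaquely.
    """
    alpha = 0.35 if interaction_heavy else 1.0  # assumed opacity levels
    blended = (alpha * food_rgb.astype(np.float32)
               + (1.0 - alpha) * virtual_rgb.astype(np.float32))
    return blended.astype(np.uint8)
```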
  • At least one of the virtual object and the food may be enlarged, reduced, and/or shifted to effectively avoid shading between the displayed food and the virtual object in the virtual reality.
  • a virtual screen displayed in the virtual reality may be zoomed out or shifted in order to avoid obscuring the food. This will be described with reference to FIG. 13.
  • FIGS. 13A and 13B are views showing a method of avoiding interference between virtual reality and an actual object according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may maintain the size of the actual object 1321 and may zoom out the virtual reality image 1311 to be placed at a corner of the screen.
  • this is merely one exemplary embodiment, and thus it is possible to avoid interference between the virtual reality and the actual object in various ways.
  • the actual object 1321 may be zoomed out or shifted.
  • the virtual reality display apparatus 200 may determine a display priority to determine a display method.
  • a display priority list may be predetermined, and the virtual reality display apparatus 200 may classify display priorities of a virtual object and a real-world object in the virtual reality according to importance and urgency.
  • the display priority list may be automatically set by the virtual reality display apparatus 200 or may be set by the user according to a pattern of use.
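For illustration, such a priority list can be ordered on an (importance, urgency) pair; the numeric scales below are assumptions, since the text only requires that both factors be reflected.

```python
def display_order(items):
    """Sort display items so the most important, most urgent comes first.

    `items` is an iterable of dicts with 'name', 'importance', and
    'urgency' keys (an assumed representation), higher meaning greater.
    """
    return sorted(items, key=lambda it: (-it["importance"], -it["urgency"]))

# display_order([{"name": "game HUD", "importance": 1, "urgency": 0},
#                {"name": "food", "importance": 2, "urgency": 1}])
# -> the food is listed first, then the game HUD
```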
  • the virtual reality display apparatus 200 may receive a user input to select which object in the virtual reality will be displayed or deleted. This will be described below with reference to FIG. 14.
  • FIG. 14 is a view showing a method of deleting an actual object displayed in virtual reality according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may receive a user input through a gesture 1410 of sweeping an object to be deleted and may delete an actual object being displayed.
  • the virtual reality display apparatus 200 may determine whether food needs to be continuously displayed to a user. When it is determined that an actual object no longer needs to be displayed to the user, the virtual reality display apparatus 200 may delete and no longer display a corresponding food. In an exemplary embodiment, the virtual reality display apparatus 200 may detect that the user has finished eating the food and may determine that the food no longer needs to be displayed to the user. In this case, the virtual reality display apparatus 200 may receive a user input through at least one of a button, a gesture, a label, a remote control command, and a voice command and may determine whether the food needs to be continuously displayed to the user.
  • the method of eating the food while wearing the virtual reality display apparatus 200 has been described in detail using an example thus far.
  • the virtual reality display apparatus 200 is not limited thereto and thus may display the virtual reality depending on various situations.
  • the virtual reality display apparatus 200 may display the direction in which the user moves, or a real-world object which a part of the user's body is approaching, together with the virtual reality in order to prevent such a collision.
  • the virtual reality display apparatus 200 may determine whether there is an object that the user may collide with around the user.
  • the virtual reality display apparatus 200 may acquire information about an object near the user, the location of the user, and an operation or movement of the user using at least one of the imaging apparatus 213 and the sensor 211.
  • the virtual reality display apparatus 200 may determine that the user is too close to a nearby object (e.g., when a distance is smaller than a dangerous distance threshold).
  • the virtual reality display apparatus 200 may determine that the object near the user needs to be displayed.
  • the virtual reality display apparatus 200 may capture an image of the object that the user may collide with, perform viewpoint correction on that image on the basis of the location relationship between the imaging apparatus 213 and the user's eyes, generate a binocular view of the object, and display the generated binocular view together with the virtual reality.
  • the object that the user may collide with may be displayed using at least one of translucency, an outline, and a 3D grid line.
  • the virtual reality display apparatus 200 may display only an edge of the object that the user may collide with.
  • the virtual reality display apparatus 200 may remind the user of the object that the user may collide with through text, an image, audio, and a video.
  • the virtual reality display apparatus 200 may display the distance between the user and the object that the user may collide with as guiding information (e.g., in the form of text and/or graphics).
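A minimal sketch of the proximity check and guiding text described in the bullets above; the 0.5 m dangerous-distance threshold and the message format are assumed values.

```python
import math

DANGER_DISTANCE_M = 0.5  # assumed dangerous-distance threshold

def collision_notice(user_pos, object_pos, object_name="object"):
    """Return warning text when the user is too close to a nearby object.

    Positions are (x, y, z) coordinates in metres; returns None while
    the user remains at a safe distance.
    """
    distance = math.dist(user_pos, object_pos)
    if distance < DANGER_DISTANCE_M:
        return f"Warning: {object_name} {distance:.2f} m away"
    return None
```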
  • a method of displaying a display item in the virtual reality display apparatus 200 will be described below with reference to FIGS. 15 and 16.
  • a method of displaying a display item of an external apparatus in the virtual reality display apparatus 200 will be described.
  • the user may be aware of information regarding the external apparatus, a task status of the external apparatus, etc.
  • FIG. 15 is a view showing a method of displaying a display item in a virtual reality display apparatus according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may receive display items from nearby external apparatuses, such as those shown in FIG. 15, and display the received display items in the virtual reality 1560.
  • the display item may be an item indicating a manipulation interface, a manipulation state, notice information, indication information, etc.
  • the external apparatus may be an apparatus capable of communicating with the virtual reality display apparatus 200, for example, an IoT apparatus.
  • the virtual reality display apparatus 200 may monitor an actual field-of-view of the user in real time.
  • the virtual reality display apparatus 200 may acquire a corresponding display item according to the type of the external apparatus.
  • the virtual reality display apparatus 200 may use information measured through various kinds of sensors and information such as a facility map of a room in which the user is located in order to monitor the field-of-view of the user in real time.
  • the virtual reality display apparatus 200 may analyze a view of the imaging apparatus 213 installed in the virtual reality display apparatus 200 to acquire the field-of-view of the user.
  • the virtual reality display apparatus 200 may acquire and display information corresponding to the external apparatus, for example, a cooking completion notice 1511, a screen 1521 captured by the security camera 1520, a temperature 1531 of the air conditioner 1530, a time 1541, a mobile terminal interface 1551, etc.
  • the virtual reality display apparatus 200 may receive a display item from an external apparatus outside the actual field-of-view of the user and may display the received display item. For example, when a guest arrives at a door, an intelligent doorbell installed in the door may transmit a notice and an image of an outside of the door to the virtual reality display apparatus 200. Also, the virtual reality display apparatus 200 may communicate with a mobile terminal of the user to adjust an interface of the mobile terminal. This will be described with reference to FIG. 16.
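Gathering display items from connected apparatuses might be organized as below. The `ExternalApparatus` interface (name, in_field_of_view, display_item) is hypothetical; real devices would expose equivalent data over a wired or wireless link as described above.

```python
def collect_display_items(apparatuses, include_out_of_view=True):
    """Collect (source, item, visible) tuples from external apparatuses.

    `apparatuses` is an iterable of objects with a `name` attribute and
    `in_field_of_view()` / `display_item()` methods (assumed interface).
    Items from out-of-view sources, like a doorbell notice, may still be
    included so they can be shown with a direction indicator.
    """
    items = []
    for apparatus in apparatuses:
        item = apparatus.display_item()  # e.g., a notice or a temperature
        if item is None:
            continue
        visible = apparatus.in_field_of_view()
        if visible or include_out_of_view:
            items.append((apparatus.name, item, visible))
    return items
```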
  • FIG. 16 is a view showing a method of displaying a screen of an external apparatus in a virtual reality display apparatus according to an exemplary embodiment.
  • the virtual reality display apparatus 200 may manipulate a display item to remotely control a mobile terminal 1640.
  • the mobile terminal 1640 and the virtual reality display apparatus 200 communicate with each other.
  • the virtual reality display apparatus 200 may display an interface 1620 of the mobile terminal 1640, and the user may manipulate the interface 1620 of the mobile terminal 1640 displayed in the virtual reality display apparatus 200 to receive a call.
  • the virtual reality display apparatus 200 may receive a user input to disconnect the call directly or may disconnect the call by remotely controlling the mobile terminal 1640. Furthermore, the user may not perform any operation.
  • the virtual reality display apparatus 200 may be set to call again or may remotely control the mobile terminal 1640 to set a reminder to call again.
  • the interface 1620 of the mobile terminal 1640 may be displayed in the virtual reality display apparatus 200.
  • the user may manipulate the interface 1620 displayed in the virtual reality display apparatus 200 to respond to the message.
  • the virtual reality display apparatus 200 may set reply task information or may remotely control the mobile terminal 1640 to set a reply reminder.
  • the virtual reality display apparatus 200 may call the message sender using the virtual reality display apparatus 200 according to the user's manipulation (e.g., when a head-mounted display is used as a Bluetooth earphone).
  • the virtual reality display apparatus 200 may be convenient and enhance a sense of immersion because the user may manipulate the mobile terminal 1640 using the virtual reality display apparatus 200 while the user wears the virtual reality display apparatus 200 and experiences the virtual reality 1610.
  • the virtual reality display apparatus 200 may display an indicator 1630 such as an arrow, an indication signal, and text to inform the user of the location of the mobile terminal 1640. Furthermore, when the user finishes using the mobile terminal 1640, the virtual reality display apparatus 200 may also remove and no longer display the display item.
  • the virtual reality display apparatus 200 may display an acquired display item in various ways.
  • the display item may be displayed and superimposed on the virtual reality.
  • the display item may be displayed according to an appropriate layout such that the user may better interact with the external device. It may be considered that the interaction between the user and the virtual reality and the interaction between the user and the external apparatus are performed at the same time.
  • the virtual reality display apparatus 200 may also select a kind of a display item to be displayed.
  • external apparatuses may be listed and managed as a list.
  • the virtual reality display apparatus 200 may display only a display item acquired from an external apparatus selected from the list according to the user's input.
  • detailed settings for the external apparatus are possible. For example, types of messages that may be received from the external apparatus may be listed and managed as a list.
  • the virtual reality display apparatus 200 may display only a message selected according to the user's input.
  • the virtual reality display apparatus 200 may set a blocking level that controls which received information is displayed, according to whether an application running in the virtual reality display apparatus 200 would be hindered, and may display the display item according to the set level. For example, so that an application is not hindered during its execution (e.g., during an intense fight in a real-time virtual network game), the virtual reality display apparatus 200 may set the blocking level to be high and may display the display item in a manner that has as little influence as possible. When the blocking level is low, the display item may be displayed freely. It is also possible to set a plurality of blocking levels according to the situation of a single application.
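The blocking-level filter above can be sketched as a single comparison; the numeric levels and the per-item priority field are assumptions, since the text only requires that a high blocking level suppresses low-impact items.

```python
BLOCK_NONE, BLOCK_LOW, BLOCK_HIGH = 0, 1, 2  # assumed blocking levels

def passes_blocking_level(item_priority: int, blocking_level: int) -> bool:
    """Show a display item only if its priority clears the blocking level.

    `item_priority` uses an assumed 0 (trivial) .. 2 (critical) scale,
    so during an intense game (BLOCK_HIGH) only critical items appear.
    """
    return item_priority >= blocking_level
```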
  • the operations or steps of the methods or algorithms according to the above exemplary embodiments may be embodied as computer-readable codes on a computer-readable recording medium.
  • the computer-readable recording medium may be any recording apparatus capable of storing data that is read by a computer system. Examples of the computer-readable recording medium include read-only memories (ROMs), random-access memories (RAMs), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices.
  • the computer-readable recording medium may be a carrier wave that transmits data via the Internet, for example.
  • the computer-readable medium may be distributed among computer systems that are interconnected through a network so that the computer-readable code is stored and executed in a distributed fashion.
  • the operations or steps of the methods or algorithms according to the above exemplary embodiments may be written as a computer program transmitted over a computer-readable transmission medium, such as a carrier wave, and received and implemented in general-use or special-purpose digital computers that execute the programs.
  • one or more units of the above-described apparatuses and devices can include or be implemented by circuitry, a processor, a microprocessor, etc., and may execute a computer program stored in a computer-readable medium.

Abstract

The invention relates to a virtual reality display apparatus and a display method thereof. The display method includes displaying a virtual reality image; acquiring object information about a real-world object on the basis of a binocular view of the user; and displaying the acquired object information together with the virtual reality image.
EP16842274.9A 2015-08-31 2016-08-31 Appareil d'affichage de réalité virtuelle et procédé d'affichage associé Ceased EP3281058A4 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201510549225.7A CN106484085B (zh) 2015-08-31 2015-08-31 在头戴式显示器中显示真实物体的方法及其头戴式显示器
KR1020160106177A KR20170026164A (ko) 2015-08-31 2016-08-22 가상 현실 디스플레이 장치 및 그 장치의 표시 방법
PCT/KR2016/009711 WO2017039308A1 (fr) 2015-08-31 2016-08-31 Appareil d'affichage de réalité virtuelle et procédé d'affichage associé

Publications (2)

Publication Number Publication Date
EP3281058A1 true EP3281058A1 (fr) 2018-02-14
EP3281058A4 EP3281058A4 (fr) 2018-04-11

Family

ID=58236359

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16842274.9A Ceased EP3281058A4 (fr) 2015-08-31 2016-08-31 Appareil d'affichage de réalité virtuelle et procédé d'affichage associé

Country Status (3)

Country Link
EP (1) EP3281058A4 (fr)
KR (1) KR20170026164A (fr)
CN (2) CN110275619B (fr)

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11308670B2 (en) 2017-03-22 2022-04-19 Sony Corporation Image processing apparatus and method
CN107168515A (zh) * 2017-03-31 2017-09-15 北京奇艺世纪科技有限公司 一种vr一体机中手柄的定位方法和装置
CN106896925A (zh) * 2017-04-14 2017-06-27 陈柳华 一种虚拟现实与真实场景融合的装置
CN107222689B (zh) * 2017-05-18 2020-07-03 歌尔科技有限公司 基于vr镜头的实景切换方法及装置
CN108960008B (zh) * 2017-05-22 2021-12-14 华为技术有限公司 Vr显示的方法和装置、vr设备
CN107229342A (zh) * 2017-06-30 2017-10-03 宇龙计算机通信科技(深圳)有限公司 文件处理方法及用户设备
CN107577337A (zh) * 2017-07-25 2018-01-12 北京小鸟看看科技有限公司 一种头戴显示设备的键盘显示方法、装置及头戴显示设备
US10627635B2 (en) 2017-08-02 2020-04-21 Microsoft Technology Licensing, Llc Transitioning into a VR environment and warning HMD users of real-world physical obstacles
CN107422942A (zh) * 2017-08-15 2017-12-01 吴金河 一种沉浸式体验的控制系统及方法
WO2019067642A1 (fr) * 2017-09-29 2019-04-04 Zermatt Technologies Llc Présentation d'application basée sur un environnement
DE102017218215B4 (de) * 2017-10-12 2024-08-01 Audi Ag Verfahren zum Betreiben einer am Kopf tragbaren elektronischen Anzeigeeinrichtung und Anzeigesystem zum Anzeigen eines virtuellen Inhalts
KR102389185B1 (ko) * 2017-10-17 2022-04-21 삼성전자주식회사 컨텐츠의 적어도 일부를 통해 표시된 입력 인터페이스를 이용하여 기능을 실행하기 위한 전자 장치 및 방법
CN108169901A (zh) * 2017-12-27 2018-06-15 北京传嘉科技有限公司 Vr眼镜
CN108040247A (zh) * 2017-12-29 2018-05-15 湖南航天捷诚电子装备有限责任公司 一种头戴式增强现实显示设备及方法
CN108572723B (zh) * 2018-02-02 2021-01-29 陈尚语 一种防晕车方法及设备
KR102076647B1 (ko) * 2018-03-30 2020-02-12 데이터얼라이언스 주식회사 가상 현실과 증강 현실을 이용한 IoT 디바이스 제어 시스템 및 방법
CN108519676B (zh) * 2018-04-09 2020-04-28 杭州瑞杰珑科技有限公司 一种头戴式助视装置
CN108764152B (zh) * 2018-05-29 2020-12-04 北京物灵智能科技有限公司 基于图片匹配实现互动提示的方法、装置及存储设备
CN108922115B (zh) * 2018-06-26 2020-12-18 联想(北京)有限公司 一种信息处理方法及电子设备
JP6739847B2 (ja) * 2018-09-12 2020-08-12 株式会社アルファコード 画像表示制御装置および画像表示制御用プログラム
EP3671410B1 (fr) * 2018-12-19 2022-08-24 Siemens Healthcare GmbH Procédé et dispositif pour commander une unité d'affichage de réalité virtuelle
US10992926B2 (en) * 2019-04-15 2021-04-27 XRSpace CO., LTD. Head mounted display system capable of displaying a virtual scene and a real scene in a picture-in-picture mode, related method and related non-transitory computer readable storage medium
US11137908B2 (en) * 2019-04-15 2021-10-05 Apple Inc. Keyboard operation with head-mounted device
US20200327867A1 (en) * 2019-04-15 2020-10-15 XRSpace CO., LTD. Head mounted display system capable of displaying a virtual scene and a map of a real environment in a picture-in-picture mode, related method and related non-transitory computer readable storage medium
US11265487B2 (en) 2019-06-05 2022-03-01 Mediatek Inc. Camera view synthesis on head-mounted display for virtual reality and augmented reality
CN110475103A (zh) * 2019-09-05 2019-11-19 上海临奇智能科技有限公司 一种头戴式可视设备
CN111124112A (zh) * 2019-12-10 2020-05-08 北京一数科技有限公司 一种虚拟界面与实体物体的交互显示方法及装置
JP6754908B1 (ja) * 2020-02-07 2020-09-16 株式会社ドワンゴ 視聴端末、視聴方法、視聴システム及びプログラム
CN111427447B (zh) * 2020-03-04 2023-08-29 青岛小鸟看看科技有限公司 虚拟键盘的显示方法、头戴显示设备及系统
CN112445341B (zh) 2020-11-23 2022-11-08 青岛小鸟看看科技有限公司 虚拟现实设备的键盘透视方法、装置及虚拟现实设备
CN112462937B (zh) 2020-11-23 2022-11-08 青岛小鸟看看科技有限公司 虚拟现实设备的局部透视方法、装置及虚拟现实设备
CN112581054B (zh) * 2020-12-09 2023-08-29 珠海格力电器股份有限公司 一种物料管理方法及物料管理装置
CN114827338A (zh) * 2021-01-29 2022-07-29 北京外号信息技术有限公司 用于在设备的显示媒介上呈现虚拟对象的方法和电子装置
CN114035732A (zh) * 2021-11-04 2022-02-11 海南诺亦腾海洋科技研究院有限公司 一键控制vr头显设备虚拟体验内容的方法及装置
WO2023130435A1 (fr) * 2022-01-10 2023-07-13 深圳市闪至科技有限公司 Procédé d'interaction, dispositif de visiocasque, système, et support de stockage
CN114419292A (zh) * 2022-01-21 2022-04-29 北京字跳网络技术有限公司 图像处理方法、装置、设备及存储介质
CN114972692B (zh) * 2022-05-12 2023-04-18 北京领为军融科技有限公司 基于ai识别和混合现实的目标定位方法
WO2024190187A1 (fr) * 2023-03-14 2024-09-19 パナソニックIpマネジメント株式会社 Dispositif de commande d'affichage, procédé de commande d'affichage et programme
CN116744195B (zh) * 2023-08-10 2023-10-31 苏州清听声学科技有限公司 一种参量阵扬声器及其指向性偏转方法

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6037882A (en) * 1997-09-30 2000-03-14 Levy; David H. Method and apparatus for inputting data to an electronic system
GB2376397A (en) * 2001-06-04 2002-12-11 Hewlett Packard Co Virtual or augmented reality
JP2005044102A (ja) * 2003-07-28 2005-02-17 Canon Inc 画像再生方法及び画像再生装置
JP2009025918A (ja) * 2007-07-17 2009-02-05 Canon Inc 画像処理装置、画像処理方法
CN101893935B (zh) * 2010-07-14 2012-01-11 北京航空航天大学 基于真实球拍的协同式增强现实乒乓球系统构建方法
US8884984B2 (en) * 2010-10-15 2014-11-11 Microsoft Corporation Fusing virtual content into real content
JP2012173772A (ja) * 2011-02-17 2012-09-10 Panasonic Corp ユーザインタラクション装置、ユーザインタラクション方法、ユーザインタラクションプログラム、及び集積回路
KR101591579B1 (ko) * 2011-03-29 2016-02-18 퀄컴 인코포레이티드 증강 현실 시스템들에서 실세계 표면들에의 가상 이미지들의 앵커링
US20120249587A1 (en) * 2011-04-04 2012-10-04 Anderson Glen J Keyboard avatar for heads up display (hud)
US9547438B2 (en) * 2011-06-21 2017-01-17 Empire Technology Development Llc Gesture based user interface for augmented reality
JP5765133B2 (ja) * 2011-08-16 2015-08-19 富士通株式会社 入力装置、入力制御方法及び入力制御プログラム
US8941560B2 (en) * 2011-09-21 2015-01-27 Google Inc. Wearable computer with superimposed controls and instructions for external device
CN103018905A (zh) * 2011-09-23 2013-04-03 奇想创造事业股份有限公司 头戴式体感操控显示系统及其方法
US9081177B2 (en) * 2011-10-07 2015-07-14 Google Inc. Wearable computer with nearby object response
JP6079614B2 (ja) * 2013-12-19 2017-02-15 ソニー株式会社 画像表示装置及び画像表示方法
EP3098689B1 (fr) * 2014-01-23 2019-04-03 Sony Corporation Dispositif d'affichage d'images et procédé d'affichage d'images

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020051490A1 (fr) * 2018-09-07 2020-03-12 Ocelot Laboratories Llc Insertion d'imagerie d'un environnement réel dans un environnement virtuel
CN112771473A (zh) * 2018-09-07 2021-05-07 苹果公司 将来自真实环境的影像插入虚拟环境中
US11790569B2 (en) 2018-09-07 2023-10-17 Apple Inc. Inserting imagery from a real environment into a virtual environment
US11880911B2 (en) 2018-09-07 2024-01-23 Apple Inc. Transitioning between imagery and sounds of a virtual environment and a real environment
US12094069B2 (en) 2018-09-07 2024-09-17 Apple Inc. Inserting imagery from a real environment into a virtual environment
CN114327044A (zh) * 2021-11-30 2022-04-12 歌尔光学科技有限公司 头戴显示设备的控制方法、装置、头戴显示设备及介质

Also Published As

Publication number Publication date
CN106484085B (zh) 2019-07-23
CN106484085A (zh) 2017-03-08
EP3281058A4 (fr) 2018-04-11
CN110275619A (zh) 2019-09-24
CN110275619B (zh) 2024-08-23
KR20170026164A (ko) 2017-03-08

Similar Documents

Publication Publication Date Title
WO2017039308A1 (fr) Appareil d'affichage de réalité virtuelle et procédé d'affichage associé
EP3281058A1 (fr) Appareil d'affichage de réalité virtuelle et procédé d'affichage associé
WO2018155892A1 (fr) Procédé d'affichage d'une image, support de stockage et dispositif électronique associé
WO2020045947A1 (fr) Dispositif électronique de commande de propriété d'écran sur la base de la distance entre un dispositif d'entrée de stylet et le dispositif électronique et son procédé de commande
WO2016175412A1 (fr) Terminal mobile et son procédé de commande
WO2017086508A1 (fr) Terminal mobile et procédé de commande associé
WO2018038439A1 (fr) Appareil d'affichage d'image et son procédé de fonctionnement
WO2018070624A2 (fr) Terminal mobile et son procédé de commande
WO2017119664A1 (fr) Appareil d'affichage et ses procédés de commande
WO2017126741A1 (fr) Visiocasque et procédé de commande de celui-ci
WO2020159302A1 (fr) Dispositif électronique permettant d'assurer diverses fonctions dans un environnement de réalité augmentée et procédé de fonctionnement associé
WO2015108234A1 (fr) Dispositif de visiocasque amovible et son procédé de commande
WO2019156480A1 (fr) Procédé de détection d'une région d'intérêt sur la base de la direction du regard et dispositif électronique associé
WO2019164092A1 (fr) Dispositif électronique de fourniture d'un second contenu pour un premier contenu affiché sur un dispositif d'affichage selon le mouvement d'un objet externe, et son procédé de fonctionnement
WO2014069722A1 (fr) Dispositif d'affichage en trois dimensions, et procédé correspondant pour la mise en œuvre d'une interface utilisateur
WO2018048092A1 (fr) Visiocasque et son procédé de commande
WO2018030567A1 (fr) Hmd et son procédé de commande
WO2015046899A1 (fr) Appareil d'affichage et procédé de commande d'appareil d'affichage
WO2016192438A1 (fr) Procédé d'activation de système d'interaction de détection de mouvement, et procédé et système d'interaction de détection de mouvement
WO2020138602A1 (fr) Procédé d'identification de main réelle d'utilisateur et dispositif vestimentaire pour cela
WO2021133053A1 (fr) Dispositif électronique et son procédé de commande
WO2022098204A1 (fr) Dispositif électronique et procédé de fourniture de service de réalité virtuelle
WO2016080662A1 (fr) Procédé et dispositif de saisie de caractères coréens sur la base du mouvement des doigts d'un utilisateur
WO2018034377A1 (fr) Procédé, dispositif et système de recherche d'évènement
WO2017039061A1 (fr) Dispositif portable et procédé de commande s'y rapportant

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20171106

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

A4 Supplementary search report drawn up and despatched

Effective date: 20180308

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/70 20170101ALI20180303BHEP

Ipc: G02B 27/01 20060101ALI20180303BHEP

Ipc: G02B 27/22 20180101AFI20180303BHEP

Ipc: G06T 19/00 20110101ALI20180303BHEP

Ipc: G06F 3/01 20060101ALI20180303BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20181128

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20210320