KR20170094574A - Head-mounted display device - Google Patents

Head-mounted display device

Info

Publication number
KR20170094574A
Authority
KR
South Korea
Prior art keywords
head
screen
gesture
input
user
Prior art date
Application number
KR1020160015441A
Other languages
Korean (ko)
Inventor
조미진
이지은
김희수
노은영
유지혜
김진욱
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020160015441A priority Critical patent/KR20170094574A/en
Publication of KR20170094574A publication Critical patent/KR20170094574A/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Abstract

In the head-mounted display device, the control unit controls the screen to be moved on the display unit when, with the head-mounted display device worn, a gesture in which the user's head is turned in a first lateral direction and then held fixed is input.

Description

HEAD-MOUNTED DISPLAY DEVICE

The present invention relates to a head-mounted display device in which user convenience is further taken into account so that the device can be used in a wider variety of ways.

Terminals can be divided into mobile (portable) terminals and stationary terminals according to whether they can be moved. Mobile terminals can further be divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

A mobile terminal may be extended to a wearable device that can be worn on the body. Such wearable devices include watch-type mobile terminals, glass-type mobile terminals, and head-mounted displays (HMDs).

Among them, the head-mounted display device is worn on the head. Combined with augmented reality technology and N-screen technology, it goes beyond a simple display function and provides various conveniences to the user.

A conventional head-mounted display device cannot use the touch function of the display unit and performs image control through external buttons. Because the button keys are provided on the outside of the device, it is inconvenient for the user to locate them while the device is worn on the head. Furthermore, pressing a button key other than the intended one can cause erroneous image control.

The present invention is directed to solving the above-mentioned problems and other problems.

Another object of the present invention is to provide a head-mounted display device capable of improving user convenience by enabling varied and accurate screen control without using an external button key.

According to an aspect of the present invention, there is provided a head-mounted display device including: a display unit for displaying a screen; an input unit for receiving at least one gesture; and a controller for controlling the screen in response to the gesture.

While the head-mounted display device is worn, the controller moves the screen on the display unit when a gesture in which the user's head is turned in a first lateral direction and then held fixed is input to the input unit.

The effects of the head-mounted display device according to the present invention are as follows.

According to at least one of the embodiments of the present invention, it is possible to easily control the screen without using the external button key, thereby improving the convenience of the user.

According to at least one of the embodiments of the present invention, it is possible to control a desired screen using a minimum of gestures, thereby improving the convenience of the user.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 shows a view of an image using a head-mounted display device related to the present invention.
FIG. 2 is a block diagram illustrating a head-mounted display device related to the present invention.
FIG. 3 is a first view illustrating a method of moving a screen using a gesture in a head-mounted display device according to an embodiment of the present invention.
FIG. 4 is a second view illustrating a method of moving a screen using a gesture in a head-mounted display device according to an embodiment of the present invention.
FIG. 5 is a view illustrating a method of displaying an image around a specific object using a gesture in a head-mounted display device according to an embodiment of the present invention.
FIG. 6 is a view illustrating a method of moving an object from a first area to a second area using a gesture in a head-mounted display device according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. The suffixes "module" and "unit" used for components in the following description are given or used interchangeably only for ease of preparing the specification and do not by themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, a detailed description of related known technology will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed, and that the invention covers all modifications, equivalents, and alternatives falling within its spirit and scope.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are used to specify the presence of a feature, number, step, operation, element, component, or combination thereof described in the specification, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

FIG. 1 shows a view of an image using a head-mounted display device related to the present invention.

As shown in Fig. 1, the head-mounted display device 100 may be mounted on the head of the user.

The head mount display device 100 can be divided into a mobile device mounted type and a mobile device connected type.

The mobile device may be, but is not limited to, a smartphone.

In the mobile-device-mounted type, a smartphone can be mounted on the head-mounted display device 100. In the mobile-device-connected type, a smartphone can be connected to the head-mounted display device 100 by wire or wirelessly.

Accordingly, an image to be implemented on the smartphone can be provided in combination with the virtual reality provided by the head-mounted display device 100.

Examples of images provided by the head mount display device 100 include a 2D image, a 3D image, a 180 ° wide image, and a 360 ° panorama image.

A screen may be displayed on the display unit 151 of the head-mounted display device 100.

An entire image of a single frame can be displayed on the display unit 151 at once. A 360° panoramic image, however, cannot be displayed on the display unit 151 in its entirety; it is therefore displayed partially on the display unit 151 through screen movement (or rotation). For example, an image positioned on the left side of the 360° panoramic image may be displayed first on the display unit 151, an image positioned in the middle of the 360° panoramic image may then be displayed by one screen movement, and an image positioned on the right side of the 360° panoramic image may be displayed by another screen movement. When all regions of an image cannot be displayed on the display unit 151 at once, a gesture that can be easily manipulated is required so that a desired region can be viewed quickly.
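
As an editorial illustration only (not part of the patent disclosure), the following minimal Python sketch shows one way such partial display can be modeled: a yaw offset selects a horizontal slice of an equirectangular 360° panorama, so that each screen movement simply shifts the slice shown on the display unit. The function and variable names are assumptions.

    # Minimal sketch: selecting the visible slice of a 360-degree panorama.
    # Assumes an equirectangular panorama stored as a 2-D list of rows (W columns
    # spanning 0..360 degrees) and a display that shows `fov_deg` degrees at once.

    def visible_slice(panorama, yaw_deg, fov_deg=90):
        """Return the columns of `panorama` centered on `yaw_deg`."""
        width = len(panorama[0])
        cols_per_deg = width / 360.0
        half = int(fov_deg * cols_per_deg / 2)
        center = int((yaw_deg % 360) * cols_per_deg)
        # Wrap around the seam of the panorama.
        cols = [(center + off) % width for off in range(-half, half)]
        return [[row[c] for c in cols] for row in panorama]

    if __name__ == "__main__":
        # Toy 4 x 360 panorama whose pixel value encodes its column (degree).
        pano = [[c for c in range(360)] for _ in range(4)]
        left = visible_slice(pano, yaw_deg=45)      # region on the left side
        middle = visible_slice(pano, yaw_deg=180)   # region in the middle
        print(left[0][0], middle[0][0])             # first visible degree of each view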

Although the display unit 151 is shown at the front of the head-mounted display device 100 for convenience of illustration in FIG. 1, the display unit 151 may actually be mounted inside the head-mounted display device 100.

According to the present invention, screens on which images are provided can be controlled in various ways by various gestures.

The head-mounted display device 100 according to the present invention can control the screen to be moved in a specific direction in response to a gesture in which the user's head is turned. For example, when a gesture in which the user's head is turned in the horizontal direction, that is, from left to right, is input, the screen may be shifted to the left and displayed on the display unit 151 in response to the gesture. That is, the moving direction of the screen may be opposite to the direction in which the user's head turns, but the present invention is not limited thereto.

The head-mounted display device 100 according to the present invention is capable of a flicking movement in which the screen is moved faster in a specific direction in response to a gesture in which the user's head is turned at a speed equal to or higher than a reference value. For example, the reference value may be set to three times the speed at which the head is normally turned, but the present invention is not limited thereto.

The head-mounted display device 100 according to the present invention can control the moving screen to be stopped in response to a gesture in which the user's head is nodded. For example, the number of times the user's head is nodded to prevent malfunction may be two or more times, but this is not limitative.

The head-mounted display device 100 according to the present invention can control an object corresponding to a specific area of the screen to be selected in response to a gesture in which that area of the screen is gazed at for a predetermined time.

The head-mounted display device 100 according to the present invention can control a specific object to be moved from a first area of the screen to a second area of the screen in response to a gesture in which the user's head is turned while a gesture of taking off the device is maintained. When the gesture of taking off the device is released, the specific object moved to that area may be fixed to that area.

Further screen controls are possible through combinations of the gestures described above. In addition, new screen controls may be possible through gestures not described above, and such embodiments may also be construed as falling within the present invention without departing from its technical spirit.
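
As an editorial illustration of the gesture-to-control mapping summarized above (again not part of the patent disclosure; the gesture names, the threshold factor, and the dispatch structure are assumptions), a minimal sketch might look like this:

    # Hedged sketch of the gesture-to-screen-control mapping described above.
    # Gesture names and actions are illustrative only.

    FAST_TURN_FACTOR = 3.0  # flick when turn speed exceeds ~3x the normal speed

    def screen_action(gesture, normal_speed=5.0):
        """Map a recognized gesture dict to a screen-control action string."""
        kind = gesture["kind"]
        if kind == "head_turn_hold":
            # Screen moves opposite to the direction the head turned.
            opposite = "left" if gesture["direction"] == "right" else "right"
            return f"move screen {opposite}"
        if kind == "head_turn" and gesture["speed"] >= FAST_TURN_FACTOR * normal_speed:
            opposite = "left" if gesture["direction"] == "right" else "right"
            return f"flick screen {opposite}"
        if kind == "head_nod":
            return "stop moving screen"
        if kind == "gaze_dwell":
            return f"select object at {gesture['target']}"
        if kind == "take_off_hold_turn":
            return "move selected object to the turned-toward area"
        return "no action"

    if __name__ == "__main__":
        print(screen_action({"kind": "head_turn_hold", "direction": "right"}))
        print(screen_action({"kind": "head_turn", "direction": "right", "speed": 20.0}))
        print(screen_action({"kind": "head_nod"}))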

FIG. 2 is a block diagram illustrating a head-mounted display device related to the present invention.

The head-mounted display device 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 2 are not essential to implementing the head-mounted display device 100, so the head-mounted display device 100 described herein may have more or fewer components than those listed above.

More specifically, among the above components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the head-mounted display device 100 and a wireless communication system, between the head-mounted display device 100 and another mobile terminal, or between the head-mounted display device 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules that connect the head-mounted display device 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. Voice data or image data collected by the input unit 120 may be analyzed and processed as a control command of the user.

The broadcast receiving module 111 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).

The wireless Internet module 113 refers to a module for wireless Internet access and may be built into or externally attached to the head-mounted display device 100. The wireless Internet module 113 is configured to transmit and receive wireless signals over a communication network according to wireless Internet technologies.

The wireless Internet module 113 may include a private internet module for configuring a private network. Accordingly, the head-mounted display device and other mobile terminals to be connected to the head-mounted display device through a private network may each include a private Internet module.

The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Wireless LAN (WLAN), Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.

The position information module 115 is a module for obtaining the position (or current position) of the head mounted display device, and representative examples thereof include a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.

The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user.

The user input unit 123 receives information from the user. When information is input through the user input unit 123, the control unit 180 can control the operation of the head-mounted display device 100 to correspond to the input information. The user input unit 123 may include mechanical input means (for example, mechanical keys such as buttons located on the front, rear, or side of the head-mounted display device 100, a dome switch, a jog wheel, a jog switch, and the like) and touch-type input means.

The sensing unit 140 may include one or more sensors for sensing at least one of information in the head-mounted display device, information on the surrounding environment of the head-mounted display device, and user information. For example, the sensing unit 140 may include at least one of a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, the camera 121), a microphone (see 122), a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, and the like), and a chemical sensor (for example, an electronic nose, a healthcare sensor, a biometric sensor, and the like). Meanwhile, the head-mounted display device 100 disclosed in this specification may combine and utilize information sensed by at least two of these sensors.

The gestures described in the present invention can be detected by a gyroscope sensor, an acceleration sensor, a motion sensor, an optical sensor, an illuminance sensor, and / or an infrared sensor.

For example, a gesture in which the user's head is turned, a gesture in which the user's head is nodded, and a gesture in which the user's head is turned at a speed above a reference value may be sensed by the gyroscope sensor, the acceleration sensor, and/or the motion sensor, but the present invention is not limited thereto.

For example, a gesture in which a specific area of the screen is gazed at for a certain period of time can be sensed by the optical sensor, but the present invention is not limited thereto.

For example, a gesture in which the head-mounted display device is taken off may be sensed by the illuminance sensor and/or the infrared sensor, but the present invention is not limited thereto.
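
Purely as an illustration of how raw sensor readings might be classified into the gestures above (the thresholds and the sensor model here are invented for the sketch and are not taken from the disclosure):

    # Illustrative sensor-to-gesture classification (thresholds are assumptions).

    TURN_RATE_DPS = 30.0        # yaw rate above which a head turn is assumed
    FAST_TURN_RATE_DPS = 90.0   # yaw rate above which a fast turn (flick) is assumed
    NOD_PITCH_RATE_DPS = 60.0   # pitch rate swing treated as a nod
    TAKE_OFF_LUX = 100.0        # illuminance jump suggesting the device was lifted

    def classify(sample):
        """Classify one fused sensor sample (gyro rates in deg/s, lux, gaze flag)."""
        yaw, pitch = sample["yaw_rate"], sample["pitch_rate"]
        if sample["lux"] > TAKE_OFF_LUX:
            return "take_off"
        if abs(pitch) > NOD_PITCH_RATE_DPS:
            return "head_nod"
        if abs(yaw) > FAST_TURN_RATE_DPS:
            return "fast_head_turn_right" if yaw > 0 else "fast_head_turn_left"
        if abs(yaw) > TURN_RATE_DPS:
            return "head_turn_right" if yaw > 0 else "head_turn_left"
        if sample["gazing"]:
            return "gaze"
        return "idle"

    if __name__ == "__main__":
        print(classify({"yaw_rate": 45.0, "pitch_rate": 0.0, "lux": 5.0, "gazing": False}))
        print(classify({"yaw_rate": 120.0, "pitch_rate": 0.0, "lux": 5.0, "gazing": False}))
        print(classify({"yaw_rate": 0.0, "pitch_rate": 80.0, "lux": 5.0, "gazing": False}))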

The output unit 150 generates output related to the visual, auditory, or tactile senses and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and a light output unit 154. The display unit 151 may form a mutual layer structure with a touch sensor or be formed integrally with it to realize a touch screen. Such a touch screen may function as a user input unit 123 that provides an input interface between the head-mounted display device 100 and the user, and may also provide an output interface between the head-mounted display device 100 and the user.

The display unit 151 displays (outputs) information processed by the head-mounted display device 100. For example, the display unit 151 may display execution screen information of an application program running on the head-mounted display device 100, or User Interface (UI) and Graphic User Interface (GUI) information according to such execution screen information.

The audio output unit 152 can output audio data received from the wireless communication unit 110 or stored in the memory 170 in call signal reception, call mode, recording mode, voice recognition mode, broadcast reception mode, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the head mount display device 100. Examples of events that occur on the head mounted display device 100 may include message reception, call signal reception, missed call, notification, calendar notification, email reception, receiving information via an application, and the like.

The signal output by the light output unit 154 is implemented as the head-mounted display device 100 emits light of a single color or a plurality of colors to the front or rear surface. The signal output may be terminated by the head-mounted display device 100 detecting the user's event confirmation.

The interface unit 160 serves as a path to various kinds of external devices connected to the head-mounted display device 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port. In the head-mounted display device 100, appropriate control relating to a connected external device can be performed in response to the connection of the external device to the interface unit 160.

In addition, the memory 170 stores data that supports various functions of the head-mounted display device 100. The memory 170 may store a plurality of application programs (or applications) driven by the head-mounted display device 100, and data and instructions for the operation of the head-mounted display device 100. At least some of these application programs may be downloaded from an external server via wireless communication. At least some of these application programs may also exist on the head-mounted display device 100 from the time of shipment for basic functions of the head-mounted display device 100 (for example, receiving and placing calls, and receiving and sending messages). Meanwhile, an application program may be stored in the memory 170, installed on the head-mounted display device 100, and driven by the control unit 180 to perform an operation (or function) of the head-mounted display device.

In addition to the operations associated with the application programs, the control unit 180 typically controls the overall operation of the head-mounted display device 100. The control unit 180 may process signals, data, information, and the like that are input or output through the above-mentioned components, or may drive an application program stored in the memory 170, thereby providing or processing appropriate information or functions for the user.

In addition, the controller 180 may control at least some of the components discussed with reference to FIG. 2 in order to drive an application program stored in the memory 170. Further, the controller 180 may operate at least two of the components included in the head-mounted display device 100 in combination with each other in order to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the head mounted display device 100. The power supply unit 190 may include a battery, and the battery may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with one another to implement a method of operating or controlling a head-mounted display device according to the various embodiments described below. In addition, the operation or control method of the head-mounted display device may be implemented on the head-mounted display device by driving at least one application program stored in the memory 170.

Hereinafter, embodiments related to a control method that can be implemented in a head mount display device configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIG. 3 is a first view illustrating a method of moving a screen using a gesture in a head-mounted display device according to an embodiment of the present invention.

First, the head mount display device 100 may be mounted on the user's head.

Then, an image can be displayed on the display unit 151 by operating the head-mounted display device 100. That is, a screen including an image may be displayed on the display unit 151.

The screen currently displayed on the display unit 151 can be replaced with another screen in response to a specific gesture.

As shown in FIG. 3A, for example, a gesture in which the user's head is turned from left to right and then held fixed for at least n seconds can be input. Here, n seconds may be, for example, one second or more, but the present invention is not limited thereto.

These gestures may be sensed by, but not limited to, gyroscope sensors, acceleration sensors, and / or motion sensors.

In response to the gesture, the first screen 1 displayed on the display unit 151 may be shifted to the left and disappear. Subsequently, the second screen 2 may be moved from the right side of the display unit 151 to the left side and displayed on the display unit 151 in place of the first screen 1. In other words, the second screen 2, which was not previously shown on the display unit 151, comes into view on the display unit 151 as it moves from the right side to the left side.

While the user's head remains turned and fixed, subsequent screens 3 and 4 may be sequentially displayed on the display unit 151. For example, the second screen 2 is moved to the left and the third screen 3 is displayed on the display unit 151 in its place, and then the third screen 3 is moved and the fourth screen 4 is displayed on the display unit 151 in its place.

When a gesture in which the fixed head of the user is returned to its original position, for example, from right to left, is input, the screen currently displayed on the display unit 151 may be fixed. Because the screen is fixed, the screen movement no longer proceeds.

To summarize, when the user's head, initially facing forward, is turned to the right and held fixed, the screen is moved; when the user's head is returned from the right to its original forward-facing position, the screen can be fixed.
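
The turn-and-hold behavior described with reference to FIG. 3 can be summarized as a small state machine. The sketch below is an editorial illustration under assumed names and an assumed n of one second, not the disclosed control logic:

    # Illustrative state machine for the FIG. 3 behavior: turn right and hold
    # for n seconds to start moving the screen, keep holding to keep moving,
    # and turn back toward the front to fix the current screen.

    HOLD_SECONDS = 1.0  # the "n seconds" of the description (assumed value)

    class ScreenMover:
        def __init__(self):
            self.hold_time = 0.0
            self.moving = False
            self.screen_index = 1  # screen 1 is shown first

        def update(self, head_state, dt):
            """head_state: 'turned_right' or 'front'. dt: seconds since last update."""
            if head_state == "turned_right":
                self.hold_time += dt
                if self.hold_time >= HOLD_SECONDS:
                    self.moving = True
                    self.screen_index += 1      # next screen slides into view
            else:  # head returned to the front: fix the current screen
                self.hold_time = 0.0
                self.moving = False
            return self.screen_index

    if __name__ == "__main__":
        mover = ScreenMover()
        for _ in range(3):                      # head held turned for 3 x 0.5 s
            mover.update("turned_right", 0.5)
        print(mover.screen_index)               # screens have advanced
        mover.update("front", 0.5)              # screen is now fixed
        print(mover.moving)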

Although the above description is limited to the movement of the screens, the same can be applied to each area included in the 360 ° panoramic image.

It is assumed that the left area of the panoramic image is displayed on the display unit 151 at present.

For example, the panoramic image may be displayed on the display unit 151 while being shifted from the right side to the left side in response to a gesture in which the user's head is rotated from the left direction to the right direction and fixed for n seconds or more.

The movement of the panoramic image can be continued while the fixed gesture is maintained.

A specific area of the panoramic image displayed on the display unit 151 may be fixed in response to a gesture in which the user's head is turned from the right direction to the left direction. For example, the middle area of the panoramic image can be fixed and displayed on the display unit 151.

The particular area of the screen or panoramic image may be moved in a direction opposite to the direction in which the user's head is rotated, but this is not limiting.

Although not shown, when a gesture in which the user's head is turned to the right again and then held fixed for n seconds or more is input, the specific area of the screen or panoramic image fixed on the display unit 151 is moved to the left again, and another area of the screen or panoramic image can be displayed on the display unit 151.

Although not shown, a specific area of the screen or panoramic image may also be shifted to the right. That is, in response to a gesture in which the user's head is turned from the right to the left and then held fixed, the specific area of the screen or panoramic image is moved to the right, and another area of the screen or panoramic image can be displayed on the display unit 151. Here, the holding time for this gesture may or may not be the same n seconds used when a specific area of the screen or panoramic image is shifted to the left.

Therefore, according to the present invention, a specific area of a screen or panoramic image can be moved by a simple gesture so that another area of the screen or panoramic image is displayed on the display unit 151.

FIG. 4 is a second view illustrating a method of moving a screen using a gesture in a head-mounted display device according to an embodiment of the present invention.

With the gesture described with reference to FIG. 3, it may be difficult to move the 360° panoramic image to a desired area in a single operation. In that case, it is necessary to move to the desired area as quickly as possible. One function for this is the flicking movement.

As shown in FIG. 4A, a part of the panoramic image may be displayed on the display unit 151 of the head-mounted display device 100. As described above, all areas of the panoramic image cannot be displayed on the display unit 151 at once. Therefore, a flicking movement is required so that an unseen area of the panoramic image can be displayed on the display unit 151.

For example, when a gesture in which the user's head is rotated from the left side to the right side at a speed higher than the reference value is input, the panorama image displayed on the display unit 151 can be flicked in the left direction.

The flicking movement can be faster than the screen movement described with reference to FIG. 3.

For example, the reference value may be set to three times the speed at which the head is normally turned, but the present invention is not limited thereto.

For example, if the user's head is normally turned at a speed of 5 km/h, the reference value may be set to 15 km/h. Accordingly, when a gesture in which the user's head is turned from left to right at a speed of at least 15 km/h is input, the panoramic image displayed on the display unit 151 can be flicked from the right side to the left side.

To stop the screen during a flicking movement, an additional gesture must be input. That is, when a gesture in which the user's head is nodded is input while the screen is being flicked in the left direction, the screen currently being flicked is stopped on the display unit 151. The screen currently displayed on the display unit 151 is therefore fixed and is not moved any further.

Although not shown, when a gesture in which the user's head is turned from the right to the left at a speed equal to or higher than the reference value is input, the panoramic image displayed on the display unit 151 can be flicked in the right direction.

Although not shown, the flicking movement may not proceed continuously and may stop after a certain period of time. For example, the flicking movement may stop one second after it starts, but the present invention is not limited thereto. In such a case, the flicking movement can be continued by repeating the same gesture.

Although not shown, it is possible to distinguish the speed at which the user's head is rotated, thereby varying the speed of flicking movement.

For example, when the reference value is 5 km/h, the range of speeds at which the user's head is turned may be divided into a first speed section (5 km/h to 10 km/h), a second speed section (10 km/h to 15 km/h), a third speed section (15 km/h to 20 km/h), a fourth speed section (20 km/h to 25 km/h), and a fifth speed section (25 km/h to 30 km/h).

For example, the screen displayed on the display unit 151 may be flicked at a first to fifth flicking speed according to whichever speed section contains the speed at which the user's head is turned.

The greater the speed at which the user's head is turned, the greater the flicking speed of the screen.

The flicking movement speed of the screen may correspond to, or be smaller than, the speed section that contains the speed at which the user's head is turned. For example, when the speed at which the user's head is turned falls within the third speed section (15 km/h to 20 km/h), the flicking movement speed of the screen may be 15 km/h or less, but the present invention is not limited thereto.
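
The section-based flicking speed can be expressed as a simple lookup. The sketch below is an editorial illustration using the 5 km/h reference value and the five speed sections from the example; the exact flicking speed returned for each section is an assumption:

    # Illustrative lookup from head-turn speed (km/h) to a flicking speed, using
    # the five 5 km/h-wide sections from the example. Returning the lower bound
    # of each section ("15 km/h or less" for the third section) is an assumption.

    SPEED_SECTIONS = [
        (5.0, 10.0),   # first speed section
        (10.0, 15.0),  # second speed section
        (15.0, 20.0),  # third speed section
        (20.0, 25.0),  # fourth speed section
        (25.0, 30.0),  # fifth speed section
    ]

    def flick_speed(head_speed_kmh):
        """Return a flicking speed no greater than the section containing the
        input, or None if the head turns slower than the reference value."""
        for low, high in SPEED_SECTIONS:
            if low <= head_speed_kmh < high:
                return low
        if head_speed_kmh >= SPEED_SECTIONS[-1][1]:
            return SPEED_SECTIONS[-1][0]
        return None  # below the reference value: no flicking

    if __name__ == "__main__":
        print(flick_speed(17.0))  # third section -> 15.0
        print(flick_speed(4.0))   # below reference -> None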

Although not shown, when a gesture in which the user's head is turned from the left to the right at the highest speed is input, the panoramic image is flicked all the way to the left so that the rightmost region of the panoramic image is displayed on the display unit 151.

Although the above description is limited to the panoramic image, the same can be applied to the virtual reality (VR) content and the virtual reality home (VR home). Virtual reality content includes album views, photos, videos, and games. The virtual reality home may be equipped with various applications over several screens. Such a virtual reality content or a virtual reality home can also be quickly moved so that an image distant from the image displayed on the display unit 151 is displayed on the display unit 151 using the flicking function.

FIG. 5 is a view illustrating a method of displaying an image around a specific object using a gesture in a head-mounted display device according to an embodiment of the present invention.

The gesture can be used to control the display of an image around a specific object.

As shown in FIG. 5A, a moving picture can be displayed on the display unit 151 of the head-mounted display device 100.

It is possible to display an image around a specific object having motion in the moving image.

For example, when a first input is received from an external button key while the moving picture is being displayed, a specific scene (the scene or frame of the moving picture displayed on the display unit 151 at the time the button key input is received) can be stopped. That is, the current scene does not pass to the next scene. The first input may mean that the external button key is pressed, but the present invention is not limited thereto.

The specific scene may include not only objects to be selected by the user but also other objects. Each object may be, for example, a person with a motion, an animal, etc., but is not limited thereto.

When a gesture of gazing at the stopped specific scene for p seconds or more is input, each object included in the specific scene can be set as a selectable object, as shown in FIG. 5B. To inform the user that these objects have been set, each object may be displayed with a dotted outline. In addition, the outline of each object can be displayed in, for example, a gray color, but the present invention is not limited thereto. The gazing gesture may be sensed by the optical sensor, but this is not limiting.

Then, when a gesture of gazing at a specific object in the specific scene, for example, an elephant object 303, for s seconds or more is input, the elephant object 303 can be selected, as shown in FIG. 5B.

The gesture in which the specific object 303 is gazed at may be sensed by the optical sensor, but this is not limiting.

The values of s seconds and p seconds may or may not be the same.

Unlike the above method, the specific object 303 in the specific scene may be gazed at for p seconds or more so that each object included in the specific scene is set as a selectable object and the specific object is selected at the same time.

When a second input is then received from the external button key, the moving picture can be played back from the stopped specific scene, as shown in FIG. 5C. At this time, control can be performed so that the selected specific object 303 is always included in the moving picture displayed on the display unit 151.

For example, a scene in which the specific object 303 does not appear may be replaced with a scene in which the specific object 303 appears and displayed on the display unit 151. That is, if a given scene shows object A and object B interacting with each other while the specific object 303 is located apart from objects A and B, a scene centered on the specific object 303 and its surrounding background may be generated, and the generated scene may be displayed on the display unit 151 instead of that scene.
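
The FIG. 5 flow (pause on a first button input, gaze for p seconds to make objects selectable, gaze at one object for s seconds to select it, then resume with the selection kept in view) can be illustrated with the following editorial sketch; the class, timings, and names are assumptions, not the disclosed implementation:

    # Illustrative walk-through of the FIG. 5 flow. Timings and names are assumed.

    P_SECONDS = 2.0   # gaze time to make objects selectable
    S_SECONDS = 1.0   # gaze time on one object to select it

    class SceneSelector:
        def __init__(self):
            self.paused = False
            self.selectable = False
            self.selected = None

        def button(self, press):
            if press == "first":
                self.paused = True              # stop the current scene
            elif press == "second" and self.paused:
                self.paused = False             # resume, keeping the selection in view
            return self.selected

        def gaze(self, target, seconds):
            if not self.paused:
                return
            if target == "scene" and seconds >= P_SECONDS:
                self.selectable = True          # outline every object in the scene
            elif self.selectable and seconds >= S_SECONDS:
                self.selected = target          # e.g. the elephant object 303

    if __name__ == "__main__":
        sel = SceneSelector()
        sel.button("first")                     # pause the moving picture
        sel.gaze("scene", 2.5)                  # objects become selectable
        sel.gaze("object_303", 1.2)             # the elephant object is selected
        print(sel.button("second"))             # playback resumes -> object_303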

FIG. 6 is a view illustrating a method of moving an object from a first area to a second area using a gesture in a head-mounted display device according to an embodiment of the present invention.

The gesture can be used to control the movement of a specific object from the first area to the second area.

As shown in FIG. 6A, after the head-mounted display device 100 is mounted, a screen may be displayed on the display unit 151 of the device. The screen may be displayed on the entire area or a partial area of the display unit 151.

The screen may contain still images or photographs. A plurality of objects may be included in a still image or a photograph, but the present invention is not limited thereto. Each object may be a region on the screen, a person or an object, or a photograph.

When a gesture of taking off the device is input, the displayed screen can be switched to an edit mode.

When switched to the edit mode, each of the plurality of objects is individually selectable.

When a gesture in which a specific object 381 among the plurality of objects is gazed at for t seconds or more is input while the gesture of taking off the device is maintained, the specific object 381 can be selected, as shown in FIG. 6B.

t seconds may be equal to at least one of n seconds, p seconds, and s seconds described above, or they may be different from n seconds, p seconds, and s seconds, respectively.

The selected specific object 381 can be moved from the first area to the second area when a gesture in which the user's head is turned in a specific direction is input while the gesture of taking off the device is maintained. The first area may be the area where the specific object 381 was originally located, and the second area may be the area to which the specific object 381 is newly moved.

The specific object 381 may be moved from the first area in the screen to the second area or from the first area of the first screen to the second area of the second screen. For example, the first screen is currently displayed on the display unit 151, and the second screen is not currently displayed, but can be displayed by screen movement.

For example, as shown in FIG. 6C, although the specific object 381 is initially located on the first screen 375, the gesture described above may cause it to be moved from the first screen 375 to a specific area of the second screen 377.

Then, when the gesture of taking off the device is released, the specific object 381 moved to the corresponding screen or area may be fixed to that screen or area, as shown in FIG. 6D.
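
The FIG. 6 flow (hold the take-off gesture to enter edit mode, gaze at an object for t seconds to select it, turn the head to move it, release the gesture to fix it) can be illustrated with the following editorial sketch; all names and the value of t are assumptions:

    # Illustrative sketch of the FIG. 6 flow. Names and the t value are assumed.

    T_SECONDS = 1.0

    class EditMode:
        def __init__(self):
            self.active = False
            self.selected = None
            self.location = {}   # object -> area or screen name

        def take_off(self, held):
            if held:
                self.active = True           # switch to edit mode
            else:
                self.active = False          # releasing fixes the object in place
                self.selected = None
            return self.location

        def gaze(self, obj, seconds, start_area):
            if self.active and seconds >= T_SECONDS:
                self.selected = obj
                self.location.setdefault(obj, start_area)

        def head_turn(self, target_area):
            if self.active and self.selected:
                self.location[self.selected] = target_area

    if __name__ == "__main__":
        edit = EditMode()
        edit.take_off(held=True)                       # enter edit mode
        edit.gaze("object_381", 1.5, "first_screen")   # select the object
        edit.head_turn("second_screen")                # move it to another screen
        print(edit.take_off(held=False))               # fixed: {'object_381': 'second_screen'}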

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the control unit 180 of the head-mounted display device 100. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

100: Head mount display device
110: wireless communication unit 120: input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit

Claims (8)

A head-mounted display device comprising:
a display unit for displaying a screen;
an input unit for receiving at least one gesture; and
a control unit for controlling the screen in response to the gesture,
wherein the control unit controls the screen to be moved on the display unit when a gesture in which the user's head is turned in a first lateral direction and then held fixed is input to the input unit while the head-mounted display device is worn.
The head-mounted display device according to claim 1, wherein the moving direction of the screen is opposite to the direction in which the user's head is turned.
The head-mounted display device according to claim 2, wherein the control unit controls the screen movement to be continued while the fixed gesture is maintained.
The head-mounted display device according to claim 3, wherein the control unit controls the screen movement to be stopped when a gesture in which the user's head is turned in the direction opposite to the first lateral direction is input.
The head-mounted display device according to claim 1, wherein the control unit controls the screen to be flicked when a gesture in which the user's head is turned at a speed equal to or higher than a reference value is input.
The head-mounted display device according to claim 5, wherein the speed at which the user's head is turned is divided into a plurality of speed sections, and the control unit controls the screen to be flicked according to the speed section that includes the turning speed when the speed at which the user's head is turned falls within one of the plurality of speed sections.
The head-mounted display device according to claim 1, wherein a moving picture is reproduced on the screen, and the control unit:
controls a specific scene of the moving picture being reproduced to be stopped when a first input is received from a specific button key;
controls at least one object included in the stopped specific scene to be set as a selectable object, and controls a specific object to be selected when a gesture in which the specific object among the at least one object is gazed at is input; and
controls the moving picture to be reproduced so as to include scenes containing the selected specific object when a second input is received from the specific button key.
The head-mounted display device according to claim 1, wherein the control unit:
switches to an edit mode for editing the screen when a gesture of taking off the mounted head-mounted display device is input;
controls a specific object to be selected when a gesture in which the specific object among at least one object included in the screen is gazed at is input while the gesture of taking off the device is maintained; and
controls the selected specific object to be moved from a first area to a second area when a gesture in which the user's head is turned in a specific direction is input while the gesture of taking off the device is maintained.
KR1020160015441A 2016-02-11 2016-02-11 Head-mounted display device KR20170094574A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160015441A KR20170094574A (en) 2016-02-11 2016-02-11 Head-mounted display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160015441A KR20170094574A (en) 2016-02-11 2016-02-11 Head-mounted display device

Publications (1)

Publication Number Publication Date
KR20170094574A true KR20170094574A (en) 2017-08-21

Family

ID=59757497

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160015441A KR20170094574A (en) 2016-02-11 2016-02-11 Head-mounted display device

Country Status (1)

Country Link
KR (1) KR20170094574A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2559133A (en) * 2017-01-25 2018-08-01 Avantis Systems Ltd A method of navigating viewable content within a virtual environment generated by a virtual reality system
CN111580652A (en) * 2020-05-06 2020-08-25 Oppo广东移动通信有限公司 Control method and device for video playing, augmented reality equipment and storage medium
CN111580652B (en) * 2020-05-06 2024-01-16 Oppo广东移动通信有限公司 Video playing control method and device, augmented reality equipment and storage medium
WO2022231224A1 (en) * 2021-04-28 2022-11-03 주식회사 피앤씨솔루션 Augmented reality glasses providing panoramic multi-screens and panoramic multi-screen provision method for augmented reality glasses

Similar Documents

Publication Publication Date Title
KR20170112491A (en) Mobile terminal and method for controlling the same
KR20170037466A (en) Mobile terminal and method of controlling the same
KR20170006559A (en) Mobile terminal and method for controlling the same
KR20160131720A (en) Mobile terminal and method for controlling the same
KR20160029536A (en) Mobile terminal and control method for the mobile terminal
KR20150109756A (en) Mobile terminal and method for controlling the same
KR20170021159A (en) Mobile terminal and method for controlling the same
KR20160039453A (en) Mobile terminal and control method for the mobile terminal
KR20170014609A (en) Mobile terminal and method of controlling the same
KR20170112493A (en) Mobile terminal and method for controlling the same
KR20180017638A (en) Mobile terminal and method for controlling the same
KR20170115863A (en) Mobile terminal and method for controlling the same
KR20170094574A (en) Head-mounted display device
KR20170037123A (en) Mobile terminal and method for controlling the same
KR101510704B1 (en) Mobile terminal and control method for the mobile terminal
KR20170058756A (en) Tethering type head mounted display and method for controlling the same
KR20170075579A (en) Mobile terminal and method for controlling the same
KR20160086167A (en) Mobile terminal and method for controlling the same
KR20160029348A (en) Glass type mobile terminal
KR101750872B1 (en) Mobile terminal and method for controlling the same
KR20150093519A (en) Mobile terminal and method for controlling the same
KR20170119955A (en) Mobile terminal
KR20170029330A (en) Mobile terminal and method for controlling the same
KR101504238B1 (en) Mobile terminal and control method for the mobile terminal
KR20160009416A (en) Mobile terminal and control method for the mobile terminal