KR20170055296A - Tethering type head mounted display and method for controlling the same - Google Patents

Tethering type head mounted display and method for controlling the same Download PDF

Info

Publication number
KR20170055296A
Authority
KR
South Korea
Prior art keywords
hmd
mobile terminal
image information
display unit
controller
Prior art date
Application number
KR1020150158307A
Other languages
Korean (ko)
Inventor
김수미
변준원
오현주
남경덕
권용진
김상호
정우찬
박성준
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150158307A priority Critical patent/KR20170055296A/en
Priority to US15/773,230 priority patent/US20180321493A1/en
Priority to PCT/KR2015/013413 priority patent/WO2017082457A1/en
Publication of KR20170055296A publication Critical patent/KR20170055296A/en

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Abstract

The present invention relates to a tethering type head mounted display (HMD) connected to a mobile terminal, and to a control method thereof. The HMD comprises: a communication unit performing wired or wireless communication with the mobile terminal; a display unit outputting image information; a sensing unit sensing movement of the HMD; and a control unit controlling the display unit to output image information that is controlled according to the sensed movement of the HMD. When any one of a set of predetermined situations occurs, the control unit controls the display unit to output image information controlled according to motion sensed by the mobile terminal. When that situation ends, the control unit controls the display unit so that image information controlled according to the motion of the HMD is output again. Therefore, the HMD can be controlled through a device selected by the user, or through the device more suitable for the sensed situation.

Description

TECHNICAL FIELD [0001] The present invention relates to a head mounted display (HMD) and a control method thereof.

The present invention relates to a tethering type HMD connected to a mobile terminal and a control method of the HMD.

Description of the Related Art: Wearable glass-type terminals that can be mounted on a part of the human body have recently been developed. A glass-type terminal mounted on a user's head may correspond to a head mounted display (HMD).

The head mounted display (HMD) is a display device worn on the user's head that can present an image directly in front of the user's eyes. The HMD allows a user to enjoy image content at a larger size than a TV or a screen. Alternatively, a virtual space screen may be displayed so that the user can have a virtual space experience.

Meanwhile, the functions of mobile terminals are becoming diversified due to the development of technology. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally provide an electronic game function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, or television programs. As technology develops, such a terminal is implemented in the form of a multimedia device combining functions such as still and moving image capture, playback of music or video files, gaming, and broadcast reception.

Accordingly, methods of using these extensive mobile terminal functions in cooperation with the HMD are emerging. As part of such a scheme, a tethering type HMD has been proposed, in which the HMD and the mobile terminal are connected to each other so that the mobile terminal can share the HMD's workload.

The tethering type HMD connected to the mobile terminal can reduce its workload by interworking with the connected mobile terminal. Accordingly, the tethering type HMD does not require as high a performance as a stand-alone HMD, in which the HMD carries out all tasks itself, and can thus be produced at lower cost. Various functions using the connected mobile terminal can also be provided.

Meanwhile, such a mobile terminal may be used as an input device for inputting signals to the tethering type HMD. Accordingly, methods for controlling a function executed in the HMD, using a signal sensed by or input from the HMD or the mobile terminal, are being actively researched.

An object of the present invention is to provide an HMD, and a control method thereof, capable of preventing the deadlock of control signals that occurs when control signals are input simultaneously from a plurality of devices, in a case where a plurality of devices capable of inputting control signals to the HMD are provided.

Another object of the present invention is to provide an HMD and a control method thereof that allow the HMD to be controlled through a device selected by the user, or through a device chosen according to a detected specific situation.

According to one aspect of the present invention, there is provided a head mounted display (HMD) connected to a mobile terminal, the HMD comprising: a communication unit performing wired or wireless communication with the mobile terminal; a display unit outputting image information; a sensing unit sensing motion of the HMD; and a control unit controlling the display unit to output image information controlled according to a result of sensing the motion of the HMD. When any one of a set of predetermined situations occurs, the control unit controls the display unit to output image information controlled according to motion sensed by the mobile terminal; when the situation that occurred ends, the control unit controls the display unit to output image information controlled according to the motion of the HMD.
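The device-arbitration behavior described in this aspect can be sketched as follows. This is a minimal illustration only; the class name, the condition names, and the rule that control reverts once all active conditions have ended are assumptions made for the sketch, not details stated in the disclosure.

```python
class InputArbiter:
    """Tracks which device's motion currently controls the displayed image:
    the HMD by default, the mobile terminal while any predefined situation
    is active. (Names are hypothetical, not from the patent.)"""

    def __init__(self):
        self.active_source = "hmd"   # default: HMD motion controls the view
        self._conditions = set()     # predefined situations currently active

    def condition_started(self, name):
        # e.g. a specific touch gesture was sensed on the terminal
        self._conditions.add(name)
        self.active_source = "terminal"

    def condition_ended(self, name):
        self._conditions.discard(name)
        if not self._conditions:     # revert only when every situation ends
            self.active_source = "hmd"
```

A brief usage trace: starting a condition hands control to the terminal, and control returns to the HMD only after all started conditions have ended.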

In one embodiment, the controller detects that one of the predetermined situations has occurred when a predetermined user touch input is sensed on the touch screen of the mobile terminal, or when a specific touch input gesture is sensed through the mobile terminal.

In one embodiment, the controller detects that the situation has ended when the predetermined function executed in response to the predetermined touch input or the specific touch input gesture is terminated, or when the predetermined touch input or the specific touch input gesture is detected again.

In one embodiment, the controller further detects that a predetermined situation has occurred when a predetermined head movement of the user is detected through the HMD, and further detects that the situation has ended when the predetermined head movement is detected again.

In one embodiment, the mobile terminal operates in a doze mode when connected to the HMD. The doze mode is a mode in which the light emitting elements of the touch screen of the mobile terminal are off, while a touch input on the touch screen of the mobile terminal and a movement of the mobile terminal can still be sensed.
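A minimal model of the doze mode described above, under assumed class and attribute names: the display's light-emitting elements are switched off, while touch and motion sensing remain active so the terminal can still act as a controller.

```python
class DozedTerminal:
    """Illustrative model of the doze mode: screen off, sensing on."""

    def __init__(self):
        self.screen_lit = True
        self.events = []             # sensed inputs delivered to the HMD side

    def enter_doze(self):
        self.screen_lit = False      # light-emitting elements off to save power
        # note: no sensor is disabled here; touch and motion stay active

    def on_touch(self, x, y):
        # touch sensing keeps working even while the screen is dark
        self.events.append(("touch", x, y))

    def on_motion(self, dx, dy, dz):
        # motion of the terminal is likewise still sensed
        self.events.append(("motion", dx, dy, dz))
```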

In one embodiment, the control unit further detects that one of the predetermined situations has occurred when specific image information is displayed on the display unit, or when the remaining power of the HMD falls below a predetermined level; it detects that the situation has ended when the display of the specific image information is terminated, or when the remaining power of the HMD returns to or above the predetermined level.

In one embodiment, the specific image information corresponds to a specific graphic object, and when the user gazes at an area on the display unit on which the specific graphic object is displayed for a predetermined time or longer, the specific image information corresponding to that graphic object is displayed on the display unit.

In one embodiment, the control unit terminates the display of the specific image information when the user gazes at an area other than the area on the display unit on which the specific graphic object is displayed for a predetermined time or longer.
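The gaze behavior of the two embodiments above amounts to a dwell-time test: an action triggers once the gaze has stayed inside (or outside) a region long enough. The function name and the consecutive-sample model are illustrative assumptions.

```python
def dwell_selects(gaze_samples, in_region, required_samples):
    """Return True once the gaze has remained inside the region for
    `required_samples` consecutive samples; leaving the region resets
    the count. A simple sketch of the dwell test described above."""
    run = 0
    for point in gaze_samples:
        run = run + 1 if in_region(point) else 0
        if run >= required_samples:
            return True
    return False
```

The same test with the region inverted gives the termination condition (gazing elsewhere for the predetermined time).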

In one embodiment, at least some of the predetermined situations are set in advance in correspondence with a specific device, and when a situation set to correspond to that specific device occurs, the control unit sets the device as the controller for controlling the image information output on the display unit.

In one embodiment, when changing the device controlling the image information output on the display unit to another device in accordance with the occurrence or end of a predetermined situation, the control unit determines whether or not to make the change based on the amount of power remaining in the other device.
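The power-based handover decision can be sketched as a simple threshold check. The 20% figure is an illustrative assumption; the disclosure only states that the remaining power of the other device is considered.

```python
def may_hand_over(target_battery_pct, min_pct=20):
    """Decide whether control of the displayed image may be handed to
    another device, based on that device's remaining battery percentage.
    (The threshold value is hypothetical.)"""
    return target_battery_pct >= min_pct
```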

In one embodiment, the image information is generated through image processing and rendering performed by either the HMD or the mobile terminal, according to the result of detecting the motion of the device that controls the image information output on the display unit.

In one embodiment, when the image processing and rendering are performed in the mobile terminal, the control unit receives from the mobile terminal the image information generated by the mobile terminal according to either the detection result of the sensing unit or the motion detection result of the mobile terminal, and outputs the received image information on the display unit.

In one embodiment, when the image processing and rendering are performed in the HMD, the control unit generates the image information based on either the detection result of the sensing unit or the motion detection result of the mobile terminal, and outputs the generated image information to the display unit.
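The two rendering arrangements above (rendering in the mobile terminal versus in the HMD) can be sketched together. Here `renderer` stands in for the image processing and rendering step, and all names and return values are illustrative assumptions.

```python
def produce_frame(render_on, active_source, hmd_motion, terminal_motion, renderer):
    """Generate one frame on whichever device performs image processing
    and rendering ('hmd' or 'terminal'), driven by the motion of the
    device currently acting as controller ('hmd' or 'terminal')."""
    motion = hmd_motion if active_source == "hmd" else terminal_motion
    frame = renderer(motion)  # runs on the device named by render_on
    if render_on == "terminal":
        # the terminal would transmit the generated frame to the HMD
        return ("received_from_terminal", frame)
    return ("generated_locally", frame)
```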

In one embodiment, when the device controlling the image information displayed on the display unit is changed to another device, the control unit displays notification information on the display unit to inform the user of the change.

According to another aspect of the present invention, there is provided a method of controlling a head mounted display (HMD) connected to a mobile terminal, the method comprising: outputting image information related to selected content to a display unit provided in the HMD; sensing a head movement of the user through a sensor provided in the HMD; detecting the occurrence of a predetermined situation; detecting a motion of the mobile terminal based on the specific situation that occurred; controlling the image information displayed on the display unit according to the detected motion of the mobile terminal; and, when the end of the predetermined situation is detected, controlling the image information displayed on the display unit based on the motion detected through the HMD.

Effects of the HMD according to the present invention and the control method of the HMD will be described as follows.

According to at least one of the embodiments of the present invention, the device for inputting control signals to the HMD is determined, between the HMD and the controller device, by the user's selection or by a sensed situation, so that the problem arising when control signals are input simultaneously from the HMD and the controller device can be prevented.

According to at least one of the embodiments of the present invention, the HMD is controlled through either the HMD itself or the controller connected to the HMD, based on the user's selection or a sensed situation, so that the HMD can be controlled through a device selected by the user or a device more suitable for the detected situation.

FIG. 1 is a block diagram illustrating a tethering type HMD related to the present invention.
FIGS. 2A and 2B are block diagrams illustrating an HMD and a mobile terminal serving as a controller, related to the present invention.
FIG. 3 is a flowchart illustrating the operation of determining a device for controlling image information displayed in the HMD, in the HMD related to the present invention.
FIG. 4 is a flowchart illustrating the operation of changing the device controlling image information displayed in the HMD according to the amounts of power of the HMD and the mobile terminal connected to the HMD, in the HMD related to the present invention.
FIG. 5 is a flowchart illustrating the operation of changing the device controlling image information displayed in the HMD according to a graphic object displayed on the display unit, in the HMD related to the present invention.
FIG. 6 is an exemplary diagram illustrating an example in which image information displayed in the HMD is controlled according to the motion of the HMD or the controller device, in the HMD related to the present invention.
FIGS. 7A and 7B are diagrams illustrating examples in which a user input for changing the device controlling image information displayed in the HMD is detected, in the HMD related to the present invention.
FIGS. 8A and 8B show examples of screens displayed differently according to the device controlling image information displayed in the HMD, in the HMD related to the present invention.
FIG. 9 is a diagram illustrating an example in which the device controlling image information displayed in the HMD is changed according to a graphic object displayed on the display unit, in the HMD related to the present invention.
FIGS. 10A and 10B are diagrams illustrating examples in which the device controlling image information displayed in the HMD is changed according to the remaining amounts of power of the devices, in the HMD related to the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like or similar elements are denoted by the same or similar reference numerals and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments of the present invention, a detailed description of related known art is omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by the accompanying drawings, and should be understood to include all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device (e.g., a smart watch, smart glasses, a head mounted display (HMD), etc.).

First, FIG. 1 shows an example of a tethering type HMD 100 according to the present invention connected to a mobile terminal 200.

As shown in FIG. 1, the HMD 100 according to the embodiment of the present invention can be connected to the mobile terminal 200. Here, the mobile terminal 200 may be various devices. For example, the mobile terminal 200 may be a smart phone, a tablet PC, or the like. The HMD 100 may receive information or sensed signals input through the mobile terminal 200 or may share various information and data stored in the mobile terminal 200.

An image of a virtual space displayed on the display unit of the HMD 100 may be generated in the HMD 100, or may be generated in the mobile terminal 200 connected to the HMD 100. For example, when the HMD 100 generates the image of the virtual space, the HMD 100 performs the image processing and rendering for processing the image of the virtual space, and outputs the image information generated as a result through the display unit. Meanwhile, when the mobile terminal 200 generates the image of the virtual space, the mobile terminal 200 performs the image processing and rendering, and transmits the resulting image information to the HMD 100. The HMD 100 can then output the image information received from the mobile terminal 200.

Meanwhile, the HMD 100 according to the embodiment of the present invention may receive a control signal for controlling a function of the HMD 100 from the mobile terminal 200. For example, the HMD 100 may receive from the mobile terminal 200 a result of sensing the movement of the mobile terminal 200, and may control the function of the HMD 100 based on the detection result. Alternatively, such a control signal may be sensed by the HMD 100 itself. That is, the HMD 100 may sense the head movement of the wearer using the sensors provided in the HMD 100, and control the function of the HMD 100 based on the detection result.

Here, controlling a function of the HMD 100 according to the movement of the mobile terminal 200 or the movement of the HMD 100 may mean that image information is displayed according to that movement. In other words, the HMD 100 according to the embodiment of the present invention may display, on the display unit 151, an image of the virtual space in a direction corresponding to the movement of the mobile terminal 200 or the movement of the HMD 100. The HMD 100 may thereby simulate the user's movement within the virtual space, or display the virtual space image of another direction according to the movement of the user's head.
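The mapping from sensed movement to the displayed view direction can be sketched as a yaw/pitch update of a virtual camera. Wrapping yaw and clamping pitch are common conventions assumed for this sketch, not details stated in the disclosure.

```python
def update_view_direction(yaw, pitch, d_yaw, d_pitch):
    """Apply a sensed rotation delta (from the HMD's or the terminal's
    motion sensors) to the virtual-camera direction, in degrees.
    Yaw wraps around; pitch is clamped so the view cannot flip over."""
    yaw = (yaw + d_yaw) % 360.0
    pitch = max(-90.0, min(90.0, pitch + d_pitch))
    return yaw, pitch
```

The same update function can be fed deltas from either device, which is what allows the controlling device to be swapped without changing the display path.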

Meanwhile, the mobile terminal 200 may share various information with the HMD 100. Accordingly, various information related to the mobile terminal 200 can be displayed on the display unit of the HMD 100, and the user can confirm events detected at the mobile terminal 200 while viewing content through the HMD 100.

In addition, various information related to the controller device 200 may be provided to the HMD 100 according to the functions provided by the mobile terminal 200. For example, as shown in FIG. 1, when the mobile terminal 200 is connected to the HMD 100 and functions as a controller, information related to the functions that can be provided through the mobile terminal 200 (for example, an e-mail function, a call function, a social network service (SNS) function, message functions such as short messaging service (SMS) or multimedia messaging service (MMS), and the various applications installed in the mobile terminal 200) can be displayed through the HMD 100.

Accordingly, the user can confirm through the HMD 100 the events occurring at the mobile terminal 200, that is, the reception of a call or a message, news from an SNS community, or various status information related to the mobile terminal 200.

2A is a block diagram for explaining the HMD 100 related to the present invention.

As shown in FIG. 2A, the HMD 100 according to the embodiment of the present invention may include a wireless communication unit 110, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, a power supply unit 190, and the like. The components shown in FIG. 2A are not essential for implementing the HMD 100 according to the embodiment of the present invention, so the HMD 100 described in this specification may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the HMD 100 and various peripheral devices such as the mobile terminal 200, or between the HMD 100 and an external server. It may also include one or more modules that connect the HMD 100 to one or more networks.

Meanwhile, the sensing unit 140 may include at least one sensor for sensing a head movement of a user wearing the HMD 100. For example, the sensing unit 140 may include an acceleration sensor 141 and a gyro sensor 143. Here, the acceleration sensor 141 and the gyro sensor 143 can sense the acceleration and the angular velocity according to the movement of the head of the user.

The sensing unit 140 may further include an eye tracking sensor 142 for tracking the user's eyes and detecting where the user's gaze rests. For example, the eye tracking sensor 142 can detect the direction of the user's gaze toward a specific area on the display unit 151 by detecting the area on the display unit 151 corresponding to the position of the user's eyes.
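One simple way to decide which area of the display unit 151 a gaze point corresponds to is to map it onto a coarse grid of regions. The grid size and the function name are illustrative assumptions; the disclosure does not specify how gaze coordinates are partitioned.

```python
def gaze_area(x, y, width, height, cols=3, rows=3):
    """Map a gaze point (in display pixel coordinates) to a (row, col)
    grid cell, clamping points that fall outside the display bounds."""
    col = min(cols - 1, max(0, int(x * cols / width)))
    row = min(rows - 1, max(0, int(y * rows / height)))
    return row, col
```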

The output unit 150 may include at least one of a display unit 151, an audio output unit 152, and a haptic module 153 to generate visual, auditory, or tactile output. The display unit 151 may be installed at positions corresponding to both of the user's eyes when the user wears the HMD 100, so as to provide a larger image to the user. The sound output unit 152 may be formed as headphones that come into close contact with both of the user's ears when the HMD 100 is worn, so that sound signals related to the content being reproduced can be transmitted. In addition, the haptic module 153 may generate vibrations related to the currently playing content when necessary, so that the user can experience the content more realistically.

Meanwhile, the interface unit 160 serves as a channel to the various external devices connected to the HMD 100. The interface unit 160 may include at least one of various ports such as a wired/wireless headset port, an external charger port, and a wired/wireless data port. For example, when the HMD 100 and the mobile terminal 200 are connected by wire, the interface unit 160 can serve as the channel through which various data and information are exchanged between the HMD 100 and the mobile terminal 200.

In addition, the memory 170 stores data supporting various functions of the HMD 100. The memory 170 may store application programs (applications) driven by the HMD 100, and data and commands for the operation of the HMD 100. At least some of these application programs may be downloaded from an external server via wireless communication. At least some of them may also exist on the HMD 100 from the time of shipment for the basic functions of the HMD 100 (for example, reproduction of content, and output of the video and audio signals of the content being reproduced). An application program may be stored in the memory 170, installed on the HMD 100, and driven by the controller 180 to perform an operation (or function) of the HMD 100.

The control unit 180 of the HMD 100 typically controls the overall operation of the HMD 100, in addition to the operations associated with the application program. The control unit 180 may process or process signals, data, information, and the like input or output through the above-mentioned components, or may drive an application program stored in the memory 170 to provide or process appropriate information or functions to the user.

In addition, the controller 180 may control at least some of the components illustrated in FIG. 2A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 can operate at least two of the components included in the HMD 100 in combination with each other to drive the application program.

Under the control of the control unit 180, the power supply unit 190 receives external power and internal power, and supplies power to the components included in the HMD 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

2B is a block diagram illustrating a mobile terminal 200 connected to the HMD 100 related to the present invention.

The mobile terminal 200 may include a wireless communication unit 210, an input unit 220, a sensing unit 240, an output unit 250, an interface unit 260, a memory 270, a control unit 280, a power supply unit 290, and the like. The components shown in FIG. 2B are not essential for implementing the mobile terminal 200, so the mobile terminal 200 described herein may have more or fewer components than those listed above.

The wireless communication unit 210 may include one or more modules that enable wireless communication between the mobile terminal 200 and a wireless communication system, between the mobile terminal 200 and another mobile terminal, or between the mobile terminal 200 and an external server. In addition, the wireless communication unit 210 may include one or more modules that connect the mobile terminal 200 to one or more networks.

The wireless communication unit 210 may include at least one of a broadcast receiving module 211, a mobile communication module 212, a wireless Internet module 213, a short distance communication module 214, and a location information module 215.

The input unit 220 may include a camera 221 or an image input unit for inputting an image signal, a microphone 222 or an audio input unit for inputting an audio signal, and a user input unit 223 (e.g., a touch key, a mechanical key, and the like) for receiving information from the user. The voice data or image data collected by the input unit 220 may be analyzed and processed into a user's control command.

The sensing unit 240 may include at least one sensor for sensing at least one of information in the mobile terminal 200, surrounding environment information of the mobile terminal 200, and user information. For example, the sensing unit 240 may include at least one of a proximity sensor 241, an illumination sensor 242, a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 221), the microphone 222, a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the mobile terminal disclosed in this specification can combine and utilize information sensed by at least two of these sensors.

The output unit 250 may include at least one of a display unit 251, an audio output unit 252, a haptic module 253, and a light output unit 254 for generating visual, auditory, or tactile output. The display unit 251 may form a mutual layer structure with a touch sensor, or may be integrally formed with it, to realize a touch screen. The touch screen may function as the user input unit 223 providing an input interface between the mobile terminal 200 and the user, and may also provide an output interface between the mobile terminal 200 and the user.

The interface unit 260 serves as a channel to the various external devices connected to the mobile terminal 200. The interface unit 260 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port. In the mobile terminal 200, appropriate control related to a connected external device can be performed in response to the connection of the external device to the interface unit 260.

In addition, the memory 270 stores data supporting various functions of the mobile terminal 200. The memory 270 may store application programs (applications) driven by the mobile terminal 200, and data and commands for the operation of the mobile terminal 200. At least some of these application programs may be downloaded from an external server via wireless communication. Also, at least some of them may exist on the mobile terminal 200 from the time of shipment for the basic functions of the mobile terminal 200 (e.g., call receiving and placing functions, and message receiving and sending functions). An application program may be stored in the memory 270, installed on the mobile terminal 200, and driven by the control unit 280 to perform an operation (or function) of the mobile terminal 200.

In addition to the operations related to the application program, the control unit 280 typically controls the overall operation of the mobile terminal 200. The control unit 280 may process or process signals, data, information or the like inputted or outputted through the above-mentioned components or may drive an application program stored in the memory 270 to provide or process appropriate information or functions to the user.

In addition, the controller 280 may control at least some of the components illustrated in FIG. 2B in order to drive an application program stored in the memory 270. Further, the control unit 280 may operate at least two of the components included in the mobile terminal 200 in combination with each other to drive the application program.

The power supply unit 290 receives external power and internal power under the control of the controller 280 and supplies power to the respective components included in the mobile terminal 200. The power supply unit 290 includes a battery, which may be an internal battery or a replaceable battery.

At least some of the above-described components may operate in cooperation with one another to implement the operation, control, or control method of the mobile terminal 200 according to the various embodiments described below. Also, the operation, control, or control method of the mobile terminal 200 may be implemented on the mobile terminal by driving at least one application program stored in the memory 270.

Hereinafter, the various components of the mobile terminal 200 will be described in detail with reference to FIG. 2B.

First, referring to the wireless communication unit 210, the broadcast receiving module 211 of the wireless communication unit 210 receives broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided in the mobile terminal 200 to enable simultaneous reception of at least two broadcast channels or switching between broadcast channels.

The mobile communication module 212 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).

The wireless signal may include various types of data according to the transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.

The wireless Internet module 213 is a module for wireless Internet access, and may be embedded in the mobile terminal 200 or externally. The wireless Internet module 213 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 213 transmits and receives data according to at least one wireless Internet technology in a range that also includes Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, and LTE-A is performed through a mobile communication network, the wireless Internet module 213 performing wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 212.

The short-range communication module 214 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies. The short-range communication module 214 may support, through wireless area networks, wireless communication between the mobile terminal 200 and a wireless communication system, between the mobile terminal 200 and another mobile terminal, or between the mobile terminal 200 and a network where another mobile terminal (or an external server) is located. The short-range wireless communication network may be a short-range wireless personal area network.

Here, the other mobile terminal may be a wearable device (e.g., a smart watch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with the mobile terminal 200 according to the present invention. The short-range communication module 214 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 200 in the vicinity of the mobile terminal 200. Further, if the detected wearable device is a device authenticated to communicate with the mobile terminal 200 according to the present invention, the control unit 280 may transmit at least a part of the data processed by the mobile terminal 200 to the wearable device through the short-range communication module 214. Therefore, the user of the wearable device can use the data processed by the mobile terminal 200 through the wearable device. For example, when a call is received at the mobile terminal 200, the user can answer the call through the wearable device, and when a message is received at the mobile terminal 200, the user can check the received message through the wearable device.

The position information module 215 is a module for obtaining the position (or current position) of the mobile terminal 200; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, using the GPS module, the mobile terminal 200 can acquire its position from a signal transmitted by a GPS satellite. As another example, using the Wi-Fi module, the mobile terminal 200 can acquire its position based on information of a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module. If necessary, the position information module 215 may, as a substitute or in addition, perform a function of another module of the wireless communication unit 210 to obtain data regarding the position of the mobile terminal 200. The position information module 215 is a module used for obtaining the position (or current position) of the mobile terminal 200 and is not limited to a module that directly calculates or acquires the position of the mobile terminal 200.

Next, the input unit 220 is for inputting image information (or an image signal), audio information (or an audio signal), data, or information input from a user. For the input of image information, one or a plurality of cameras 221 may be provided. The camera 221 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 251 or stored in the memory 270. The plurality of cameras 221 provided in the mobile terminal 200 may be arranged in a matrix structure, and through the cameras 221 having the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 200. In addition, the plurality of cameras 221 may be arranged in a stereo structure to acquire a left image and a right image for realizing a stereoscopic image.

The microphone 222 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 200. Meanwhile, the microphone 222 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 223 is for receiving information from a user; when information is input through the user input unit 223, the control unit 280 can control the operation of the mobile terminal 200 to correspond to the input information. The user input unit 223 may include a mechanical input means (or a mechanical key, for example, a button located on the rear or side of the mobile terminal 200, a dome switch, a jog wheel, a jog switch, and the like) and a touch-type input means. For example, the touch-type input means may comprise a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or the visual key can be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.

Meanwhile, the sensing unit 240 senses at least one of internal information of the mobile terminal 200, surrounding environment information of the mobile terminal 200, and user information, and generates a corresponding sensing signal. Based on the sensing signal, the control unit 280 may control the driving or operation of the mobile terminal 200, or may perform data processing, functions, or operations related to application programs installed in the mobile terminal 200. Representative sensors among the various sensors that may be included in the sensing unit 240 will now be described in more detail.

First, the proximity sensor 241 refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface, or of an object in the vicinity, by using electromagnetic force or infrared rays. The proximity sensor 241 may be disposed in an inner area of the mobile terminal 200 covered by the touch screen, or near the touch screen.

Examples of the proximity sensor 241 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is of the capacitive type, the proximity sensor 241 can be configured to detect the proximity of a conductive object based on the change in the electric field caused by the object's approach. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

For convenience of explanation, the act of bringing an object close to the touch screen without contact so that the object is recognized as being located on the touch screen is referred to as a "proximity touch," and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object is perpendicular to the touch screen when the proximity touch is made. The proximity sensor 241 can sense a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Meanwhile, the control unit 280 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 241 as described above, and may further output visual information corresponding to the processed data on the touch screen. In addition, the control unit 280 can control the mobile terminal 200 so that different operations or different data (or information) are processed depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.

The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 251) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen touches the touch sensor, the pressure at the time of the touch, the capacitance at the time of the touch, and the like. Here, the touch object, as an object applying a touch to the touch sensor, may be, for example, a finger, a touch pen, a stylus pen, or a pointer.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the control unit 280. In this way, the control unit 280 can know which area of the display unit 251 has been touched. Here, the touch controller may be a component separate from the control unit 280, or may be the control unit 280 itself.

On the other hand, the control unit 280 may perform different controls or the same control according to the type of the touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different controls or the same control according to the type of the touch object may be determined according to the current operating state of the mobile terminal 200 or an application program being executed.

Meanwhile, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches on the touch screen, such as a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of a sensing object by using ultrasonic waves. The controller 280 can calculate the position of a wave generating source from information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave generating source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave generating source can be calculated using the difference between the arrival time of the ultrasonic wave and that of the light, which serves as a reference signal.
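The time-difference calculation described above can be illustrated with a short numerical sketch. This sketch is for illustration only and is not part of the disclosure; the speed-of-sound constant and the function name are assumptions.

```python
# Illustrative sketch: estimating the distance to a wave generating source from
# the arrival-time difference between light and ultrasound. Light reaches the
# optical sensor almost instantly, so it serves as the t = 0 reference signal;
# the ultrasonic delay then encodes the distance.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C (assumed constant)

def distance_from_delay(t_light: float, t_ultrasound: float) -> float:
    """Distance (m) to the source, treating the light arrival as the reference."""
    delay = t_ultrasound - t_light
    if delay < 0:
        raise ValueError("ultrasound cannot arrive before the light reference")
    return SPEED_OF_SOUND * delay

# With two or more ultrasonic sensors at known positions, each delay yields a
# range, and the source position follows by intersecting the range circles.
d = distance_from_delay(0.0, 0.01)  # a 10 ms ultrasonic delay
print(round(d, 2))
```

With several sensors, the same per-sensor ranges feed a standard trilateration step to recover the full position rather than just a distance.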

The camera 221 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 221 and the laser sensor can be combined with each other to sense a touch of a sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns, and scans the content placed on the photo sensor using an electrical signal that changes according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change of the light, and position information of the sensing object can be obtained through this calculation.

The display unit 251 displays (outputs) information processed by the mobile terminal 200. For example, the display unit 251 may display execution screen information of an application program driven by the mobile terminal 200, or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

Also, the display unit 251 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The audio output unit 252 may output audio data received from the wireless communication unit 210 or stored in the memory 270 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output unit 252 also outputs sound signals related to functions performed by the mobile terminal 200 (e.g., a call signal reception sound, a message reception sound, and the like). The audio output unit 252 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 253 generates various tactile effects that the user can feel. A typical example of a tactile effect generated by the haptic module 253 is vibration. The intensity and pattern of the vibration generated by the haptic module 253 can be controlled by the user's selection or by the settings of the control unit. For example, the haptic module 253 may output different vibrations in combination or sequentially.

In addition to vibration, the haptic module 253 can generate various other tactile effects, such as a pin arrangement moving vertically against the touched skin surface, a jet or suction force of air through a jet port or a suction port, brushing against the skin surface, contact of an electrode, an electrostatic force, and the reproduction of a cold or warm sensation using an element capable of absorbing or generating heat.

The haptic module 253 can be implemented not only to transmit a tactile effect through direct contact but also to allow the user to feel a tactile effect through a muscular sense such as that of a finger or an arm. Two or more haptic modules 253 may be provided according to the configuration of the mobile terminal 200.

The light output unit 254 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 200. Examples of events that occur in the mobile terminal 200 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output by the light output unit 254 is implemented by the mobile terminal 200 emitting light of a single color or a plurality of colors toward the front or rear surface. The signal output may be terminated when the mobile terminal 200 senses that the user has checked the event.

The interface unit 260 serves as a passage to all external devices connected to the mobile terminal 200. The interface unit 260 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 200, or allows data inside the mobile terminal 200 to be transmitted to an external device. For example, the interface unit 260 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output port, a video input/output port, an earphone port, and the like.

The identification module is a chip storing various types of information for authenticating the usage right of the mobile terminal 200, and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the mobile terminal 200 through the interface unit 260.

When the mobile terminal 200 is connected to an external cradle, the interface unit 260 may serve as a passage through which power from the cradle is supplied to the mobile terminal 200, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 200. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 200 has been correctly mounted on the cradle.

The memory 270 may store a program for the operation of the control unit 280 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 270 may store data on vibrations and sounds of various patterns that are output upon touch input on the touch screen.

The memory 270 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 200 may also operate in association with a web storage that performs the storage function of the memory 270 over the Internet.

Meanwhile, as described above, the control unit 280 controls operations related to application programs and, typically, the overall operation of the mobile terminal 200. For example, when the state of the mobile terminal satisfies a set condition, the control unit 280 can execute or release a lock state that restricts the input of a user's control commands to applications.

In addition, the control unit 280 performs control and processing related to voice calls, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the control unit 280 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 200 according to the present invention.

The power supply unit 290 receives external power and internal power under the control of the controller 280 and supplies the power required for the operation of each component. The power supply unit 290 includes a battery, which may be a rechargeable internal battery, and may be detachably coupled to the terminal body for charging or the like.

In addition, the power supply unit 290 may include a connection port, and the connection port may be configured as an example of the interface unit 260 to which an external charger supplying power for charging the battery is electrically connected.

As another example, the power supply unit 290 may be configured to charge the battery wirelessly, without using the connection port. In this case, the power supply unit 290 may receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.

The various embodiments described below may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.

Meanwhile, in the mobile terminal 200 according to the present invention, in order to sense a tap through the acceleration sensor or the touch sensor, the mobile terminal 200 may operate in a specific mode when the display unit 251 of the mobile terminal 200 is in a deactivated state. This specific mode may be referred to as a 'doze mode'.

For example, in the doze mode, in a touch screen structure in which the touch sensor and the display unit 251 form a mutual layer structure, only the light emitting elements for outputting the screen may be turned off in the display unit 251 while the touch sensor remains on. Alternatively, the doze mode may be a mode in which the display unit 251 is turned off while the acceleration sensor is kept on. Alternatively, the doze mode may be a mode in which the display unit 251 is turned off while both the touch sensor and the acceleration sensor are kept on.

Therefore, in the doze mode state, that is, when the display unit 251 is turned off or is inactive, if the user applies a tap to at least one point on the touch screen or to a specific point of the main body of the mobile terminal 200, the applied tap can be detected through at least one of the touch sensor and the acceleration sensor that remain on.
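The doze-mode variants described above amount to a small state table: the display is off in every case, and tap detection relies on whichever sensors are kept on. The following sketch is for illustration only; the mode names are invented here, not taken from the disclosure.

```python
# Illustrative sketch of the doze-mode variants described above. In every
# variant the display's light emitting elements are off, and a tap is
# detectable only through the sensors that remain on.

DOZE_MODES = {
    "touch_only":      {"display": False, "touch_sensor": True,  "accel_sensor": False},
    "accel_only":      {"display": False, "touch_sensor": False, "accel_sensor": True},
    "touch_and_accel": {"display": False, "touch_sensor": True,  "accel_sensor": True},
}

def can_detect_tap(mode: str) -> bool:
    """A tap is detectable in doze mode if at least one sensor is kept on."""
    state = DOZE_MODES[mode]
    return state["touch_sensor"] or state["accel_sensor"]

# Every doze-mode variant keeps at least one sensor on, so taps stay detectable.
print(all(can_detect_tap(m) for m in DOZE_MODES))
```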

Hereinafter, embodiments related to a control method that can be implemented in the HMD 100 according to an embodiment of the present invention will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIG. 3 is a flowchart for explaining the operation procedure by which a device controls the image information displayed in the HMD, in relation to the present invention.

Referring to FIG. 3, the controller 180 of the HMD 100 according to the embodiment of the present invention first displays image information corresponding to selected content on the display unit 151 (S300). The image information may be generated in the HMD 100 or in the mobile terminal 200 connected to the HMD 100. If the image information is generated in the mobile terminal 200, the controller 180 may control the mobile terminal 200 to generate the image information, receive the generated image information from the mobile terminal 200, and output the received image information on the display unit 151 of the HMD 100.

Meanwhile, the generated image information may relate to a virtual space experience or to specific video data, depending on the selected content. In the following description, however, it is assumed for convenience of explanation that the image information relates to a virtual space experience.

Meanwhile, when the image information is output as described above, the controller 180 of the HMD 100 can detect the motion of the device currently set as the motion detection target, and can control the image information displayed on the display unit 151 according to the motion detection result (S302).

As mentioned above, in the HMD 100 according to the embodiment of the present invention, the image information displayed on the display unit 151 of the HMD 100 can be controlled according to an input signal detected from the HMD 100 or the mobile terminal 200. The input signal sensed by the HMD 100 or the mobile terminal 200 may be the result of sensing the movement of the HMD 100 or the mobile terminal 200.

That is, when the HMD 100 is set as the device controlling the image information currently displayed in the HMD 100, the HMD 100 may detect its own movement using the sensors (for example, the acceleration sensor 141 and the gyro sensor 143) included in the sensing unit 140. Then, by displaying the virtual space image in the direction corresponding to the detected motion on the display unit 151, the image information controlled according to the motion detection result of the HMD 100 can be output on the display unit 151.

Here, if the image information is generated in the HMD 100, the controller 180 of the HMD 100 can generate and output image information of a virtual space image in a specific direction based on the result of detecting the motion of the HMD 100. On the other hand, if the image information is generated in the mobile terminal 200, the controller 180 of the HMD 100 can transmit the result of detecting the motion of the HMD 100 to the mobile terminal 200, and can control the mobile terminal 200 to generate image information of a virtual space image in a specific direction based on that motion detection result. The image information generated by the mobile terminal 200 can then be received and displayed on the display unit 151.

On the other hand, when the mobile terminal 200 is set as the device controlling the image information currently displayed in the HMD 100, the HMD 100 may output, on the display unit 151, image information controlled according to the result of detecting the motion of the mobile terminal 200. In this case, the HMD 100 may control the control unit 280 of the mobile terminal 200 to detect the movement of the mobile terminal 200, and the control unit 280 of the mobile terminal 200 may sense the movement of the mobile terminal 200 using at least one of the sensors included in the sensing unit 240 of the mobile terminal 200.

Here, if the image information is generated in the HMD 100, the controller 180 of the HMD 100 may receive the motion detection result from the mobile terminal 200, and may generate and output image information of a virtual space image in the specific direction corresponding to the motion detection result of the mobile terminal 200. On the other hand, if the image information is generated in the mobile terminal 200, the controller 180 of the HMD 100 may control the mobile terminal 200 to generate image information of a virtual space image in the specific direction corresponding to the motion detection result of the mobile terminal 200, and the image information generated by the mobile terminal 200 can be received and displayed on the display unit 151.
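The preceding paragraphs combine two independent choices: which device senses motion (the HMD or the mobile terminal) and which device generates the image information. The four resulting data flows can be summarized in a short illustrative sketch; the function and step names below are assumptions for illustration, not part of the disclosure.

```python
# Illustrative sketch of the four sensing/rendering combinations described
# above. Motion is sensed on the device set as the controller, the image
# information is generated on the device acting as the renderer, and the
# result is always output on the HMD's display unit 151.

def route(controller: str, renderer: str) -> list:
    """Return the data-flow steps for one motion update."""
    assert controller in ("HMD", "mobile") and renderer in ("HMD", "mobile")
    steps = ["sense motion on " + controller]
    if controller != renderer:
        steps.append("send motion result from %s to %s" % (controller, renderer))
    steps.append("generate virtual-space image on " + renderer)
    if renderer != "HMD":
        steps.append("transmit image information to HMD")
    steps.append("output on display unit 151")
    return steps

for c in ("HMD", "mobile"):
    for r in ("HMD", "mobile"):
        print(c, "->", r, ":", len(route(c, r)), "steps")
```

The sketch makes explicit that an extra transfer step appears whenever sensing and rendering happen on different devices, and a final image-transfer step appears whenever rendering happens on the mobile terminal.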

Meanwhile, while displaying the image information according to the result of detecting the movement of one of the devices in step S302, the controller 180 may detect whether a preset specific situation has occurred (S304). When the preset situation occurs in step S304, the controller 180 may change the device (hereinafter referred to as the 'controller') that controls the image information displayed in the HMD 100 (S306).

Here, the preset specific situation may be various.

For example, the preset specific situation may be a predetermined gesture of the user, a touch input, or the like. That is, when the user makes a specific gesture while wearing the HMD 100 or while holding the mobile terminal 200, the control unit 180 of the HMD 100 can detect that the preset specific situation has occurred. The control unit 180 can also detect that the preset specific situation has occurred when a preset touch input is detected on the touch screen of the mobile terminal 200. The gesture may be sensed through the sensors provided in the HMD 100 and the sensors provided in the mobile terminal 200. When such a gesture or touch input is sensed, the control unit 180 can interpret it as the user's input for changing the device whose motion is to be detected, that is, the controller.

According to the user's input, the control unit 180 can change the currently set controller to the other device. Then, the control unit 180 senses the motion of the changed controller and outputs, on the display unit 151, the image information controlled according to the sensed motion (S308).

Therefore, if the controller is set to the HMD 100 in step S302, the controller 180 can change the controller to the mobile terminal 200 in step S306. In step S308, the controller 180 detects the motion of the mobile terminal 200 and outputs, on the display unit 151, the image information controlled based on the detected motion. On the other hand, if the controller is set to the mobile terminal 200 in step S302, the controller 180 can change the controller to the HMD 100 in step S306. In step S308, the controller 180 detects the motion of the HMD 100 and outputs, on the display unit 151, the image information controlled according to the detected motion.

In step S304, the controller 180 may detect whether the preset situation has occurred. If it is determined in step S304 that the preset situation has not occurred, the controller 180 returns to step S302, detects the motion of the currently set controller, and controls the image information displayed on the display unit 151 accordingly. On the other hand, if it is determined in step S304 that the preset situation has occurred, the controller 180 may change the currently set controller to another controller in step S306. In step S308, the controller 180 senses the motion of the changed controller and outputs the image information to the display unit 151 according to the detected motion.
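The loop formed by steps S300 to S308 can be summarized as a small control loop: display image information, watch for the preset specific situation, and swap the motion-sensing controller between the HMD and the mobile terminal when the situation occurs (or later ends). The sketch below is illustrative only; the function names and the event representation are assumptions, not part of the disclosure.

```python
# Illustrative sketch of the S300-S308 flow: each pass controls the displayed
# image by the current controller's motion (S302/S308), then checks whether the
# preset specific situation occurred or ended (S304) and, if so, changes the
# controller (S306).

def switched(device: str) -> str:
    """S306: change the currently set controller to the other device."""
    return "mobile" if device == "HMD" else "HMD"

def run(events: list, controller: str = "HMD") -> list:
    """events[i] is True when the preset situation occurs or ends at pass i."""
    trace = []
    for situation_changed in events:
        trace.append("control image by %s motion" % controller)  # S302/S308
        if situation_changed:                                    # S304 -> S306
            controller = switched(controller)
    return trace

# The situation occurs at the 2nd pass and ends at the 4th: control moves to
# the mobile terminal and then returns to the HMD.
print(run([False, True, False, True, False]))
```

This mirrors the text: while the specific situation persists, image information is controlled through the other device, and when the situation ends the originally set controller is restored.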

In the above description, it has been assumed that the predetermined situation occurs according to the user's selection, that is, when the user makes a specific gesture with the head wearing the HMD 100 or while gripping the mobile terminal 200, or applies a preset touch input. However, it goes without saying that a situation occurring irrespective of the user's selection may also be the predetermined situation.

For example, the preset specific situation may be a situation in which the remaining power of the currently set controller drops below a predetermined level. In this case, even if no specific gesture or touch input is detected, the controller 180 may proceed to step S306 and change the controller to the other device.

Alternatively, the control unit 180 may detect that the preset specific situation has occurred according to a function currently being executed in the HMD 100. For example, when the control unit 180 of the HMD 100 displays image information related to a specific function on the display unit 151 based on the motion of the HMD 100 or the mobile terminal 200 and the direction of the user's gaze, it can be detected that the preset specific situation has occurred.

In the above description, only the case where the currently set controller is changed to another device when the preset specific situation occurs in step S304 has been described. However, it is needless to say that in step S304 it may also be detected whether the preset specific situation has ended.

For example, when image information related to a specific function is displayed on the display unit 151 as described above, the controller 180 can detect that a preset situation has occurred and change the currently set controller. In this state, if the display of the image information related to the specific function is ended on the display unit 151, the controller 180 can detect that the preset situation has ended. In this case, the controller 180 may change the currently set controller back again.

Accordingly, in the present invention, when a specific situation occurs, the image information output on the display unit 151 may be controlled through a different device from the time when the occurrence of the specific situation is detected until the specific situation ends. That is, when the currently set controller is the HMD 100, the controller 180 can control the image information displayed on the display unit 151 based on the motion sensed through the mobile terminal 200 until the specific situation ends.

Accordingly, when image information related to a specific function, that is, specific image information, is displayed on the display unit 151 as described above, the control unit 180 can control the display unit 151 so that image information controlled according to the motion detected through the mobile terminal 200 is output on the display unit 151.

In addition, as described above, when the predetermined situation occurs based on the remaining amount of power of the HMD 100, the controller 180 may change the controller to the mobile terminal 200 until the amount of power of the HMD 100 reaches a predetermined level. Then, image information controlled according to the motion sensed by the mobile terminal 200 may be output on the display unit 151. When, in the state where the mobile terminal 200 is set as the controller, the power of the HMD 100 becomes equal to or more than the predetermined level due to charging of the HMD 100 or the like, the controller 180 may change the controller back to the HMD 100.

On the other hand, when the currently set controller is the mobile terminal 200, the control unit 180 can control the image information displayed on the display unit 151 based on the motion sensed through the HMD 100. In this case, the control unit 180 can display, on the display unit 151, the image information controlled according to the motion detected by the HMD 100 in a state where the specific image information is displayed on the display unit 151. Further, when the remaining power of the mobile terminal 200 is less than a predetermined level, the controller can be changed to the HMD 100.

Meanwhile, in the above description, it is mentioned that the preset specific situation may be a predetermined gesture or touch input of the user. In this case, the end of the preset specific situation may be the case where the specific function executed by the predetermined gesture or touch input is terminated. That is, for example, when a specific function is executed according to the user's gesture or touch input, the device controlling the image information displayed on the display unit 151 can be changed to another device while the specific function is being executed, and when the specific function is terminated, the image information can again be controlled according to the motion detected by the original device.

That is, if the currently set controller is the HMD 100 and the specific function corresponding to the gesture or the touch input is a function of browsing an image or browsing specific information, the controller 180 can control the image information displayed on the display unit 151 according to the motion sensed through the mobile terminal 200 while the image browsing function or the information browsing function is executed. When the image browsing function or the information browsing function is terminated, the image information displayed on the display unit 151 can again be controlled according to the motion sensed through the HMD 100.

Meanwhile, the preset specific situation may be the situation itself in which the user's gesture or touch input is detected. Accordingly, when the gesture or touch input of the user is detected, the control unit 180 can change the currently set controller to another device. In this case, when the user's gesture or touch input is detected again, the controller 180 may determine that the situation in which the gesture or touch input was sensed has ended.

That is, when the currently set controller is the HMD 100 and the gesture or the touch input is detected, the control unit 180 can control the image information displayed on the display unit 151 according to the motion sensed through the mobile terminal 200. When the gesture or the touch input is detected again, the control unit 180 may again control the image information displayed on the display unit 151 according to the motion sensed through the HMD 100.

In the above description, it is assumed that the controller is changed to a controller different from the currently set controller when the predetermined situation occurs. However, it is needless to say that, alternatively, a specific controller corresponding to a specific situation may be set in advance.

For example, when the specific gesture is sensed from the HMD 100 worn on the user's head, the HMD 100 may be preset as the device for controlling the image information displayed in the HMD 100. Likewise, when the specific gesture is sensed by the mobile terminal 200, the mobile terminal 200 may be preset as the device for controlling the image information displayed in the HMD 100.

Alternatively, when the touch input of the user is sensed through the touch screen of the mobile terminal 200, the control unit 180 may preset the mobile terminal 200 as the device to control the image information displayed in the HMD 100. However, when the touch input forms a predetermined pattern, it is needless to say that the controller 180 may set a specific device corresponding to the touch input pattern as the device to control the image information displayed in the HMD 100.
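The mapping described above, in which a plain touch input presets the mobile terminal 200 while a touch input forming a predetermined pattern presets a specific device, may be sketched as follows. The pattern names, the default choice, and the mapping table are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical mapping of predetermined touch patterns to preset controllers.
PATTERN_TO_CONTROLLER = {
    "circle": "HMD_100",        # an assumed pattern that presets the HMD
    "double_tap": "MOBILE_200", # an assumed pattern that presets the mobile terminal
}

def controller_for_touch(pattern):
    # A touch forming no known pattern falls back to the mobile terminal,
    # matching the plain-touch behavior described in the text.
    return PATTERN_TO_CONTROLLER.get(pattern, "MOBILE_200")
```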

Alternatively, when image information related to a specific function is displayed on the display unit 151 based on the movement of the HMD 100 or the mobile terminal 200 and the direction of the user's gaze, either the HMD 100 or the mobile terminal 200 may be selected in advance as the device for controlling the image information displayed in the HMD 100 according to the type of the displayed image information. For example, when the displayed image information requires finer control by the user (for example, selection of a specific graphic object or selection of specific content), the control unit 180 may set the mobile terminal 200 as the device to control the image information displayed in the HMD 100 when such image information is displayed on the display unit 151.

If a specific controller corresponding to a specific situation is set in advance, the controller 180 can check whether the predetermined controller corresponding to the currently generated situation is currently set as the device to control the image information displayed in the HMD 100. As a result, the currently set controller may be changed in step S306 to another controller only when the predetermined controller set in advance to correspond to the current situation is different from the device currently set to control the image information.

Any one of the HMD 100 and the mobile terminal 200 may be preset as a basic controller for controlling the image information displayed on the display unit 151. In this case, the controller 180 can detect the motion of the device set as the basic controller and output the image information according to the detection result on the display unit 151 without the user's selection or the occurrence of the specific situation. Here, the basic controller may be set by the user and may be changed at any time according to the user's selection.

Meanwhile, when the controller 180 of the HMD 100 outputs image information controlled according to the motion detected by any one of the HMD 100 and the mobile terminal 200 in step S302, the other device may not detect motion. For example, when the image information is controlled according to the motion of the HMD 100 in step S302, the controller 180 of the HMD 100 may turn off the motion detection function of the mobile terminal 200. In this case, the control unit 280 of the mobile terminal 200 may turn off an acceleration sensor or a gyroscope sensor for detecting the movement of the mobile terminal 200.

On the other hand, when the image information is controlled according to the motion of the mobile terminal 200, the controller 180 of the HMD 100 may turn off the motion detection function of the HMD 100 in step S302. In this case, the HMD 100 may turn off the acceleration sensor, the gyro sensor, or the like for detecting the motion of the HMD 100.
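The power-saving behavior described in the two preceding paragraphs, in which the motion sensors of whichever device is not acting as the controller are turned off, may be sketched as follows. The `Device` class and the sensor names are assumptions of this sketch.

```python
# Hypothetical sketch: only the device driving the display keeps motion sensing on.
class Device:
    def __init__(self, name):
        self.name = name
        # Assumed motion sensors; both on by default.
        self.sensors = {"accelerometer": True, "gyroscope": True}

    def set_motion_sensing(self, enabled):
        for sensor in self.sensors:
            self.sensors[sensor] = enabled

def select_controller(active, idle):
    """Enable motion sensing on the controller, disable it on the other device."""
    active.set_motion_sensing(True)
    idle.set_motion_sensing(False)

hmd, mobile = Device("HMD_100"), Device("MOBILE_200")
select_controller(active=hmd, idle=mobile)  # HMD controls; mobile sensors go off
```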

FIG. 4 is a flowchart illustrating an operation of changing the device controlling the HMD 100 according to the amount of power of the HMD 100 and the mobile terminal 200 connected to the HMD 100, in the HMD 100 related to the present invention.

Referring to FIG. 4, when the controller 180 of the HMD 100 according to the embodiment of the present invention detects that a preset specific situation has occurred according to the detection result of step S304 of FIG. 3, the controller 180 may check the amount of power of a 'target device' (S400). Here, the 'target device' may be a device other than the device currently set as the controller of the image information displayed in the HMD 100, or a device corresponding to the detected specific situation.

In step S402, the control unit 180 may check whether the amount of power of the 'target device' checked in step S400 is equal to or greater than a predetermined level. If it is determined in step S402 that the checked amount of power is equal to or greater than the predetermined level, the control unit 180 may change the currently set controller to the 'target device' (S404). Then, as described in FIG. 3, the control unit 180 may detect the motion of the 'target device' and output the image information controlled according to the detected motion on the display unit 151.

On the other hand, if it is determined in step S402 that the checked power of the 'target device' is less than the preset level, the controller 180 may not change the currently set controller. In this case, the control unit 180 may display notification information on the display unit 151 to inform the user that the power of the 'target device' is insufficient.
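The check of steps S400 through S404, in which the controller is changed only if the 'target device' has sufficient power and a notification is shown otherwise, may be sketched as follows. The threshold value and the notification text are assumed for illustration only.

```python
# Hypothetical sketch of the FIG. 4 flow (S400-S404).
POWER_THRESHOLD = 0.15  # assumed minimum battery level (15%)

def try_change_controller(current, target, target_power):
    """Return (new_controller, notification) following steps S400-S404."""
    if target_power >= POWER_THRESHOLD:      # S402 'yes' branch: switch (S404)
        return target, None
    # S402 'no' branch: keep the current controller and notify the user.
    return current, f"{target}: power insufficient"

ctrl, note = try_change_controller("HMD_100", "MOBILE_200", target_power=0.05)
```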

Meanwhile, in FIG. 4, it is described that whether or not the currently set controller is changed is determined according to the amount of power of the 'target device' when a preset specific situation occurs. However, it is of course also possible to change the controller when the remaining power of the device currently set as the controller is insufficient. In this case, the control unit 180 may display notification information on the display unit 151 to notify the user that the battery power is insufficient and that the controller is changed accordingly.

In addition, even if the remaining power of the battery is insufficient, the image information displayed on the display unit 151 may be controlled according to the motion sensed by the device currently set as the controller, according to the user's selection. Alternatively, if both the HMD 100 and the mobile terminal 200 have an insufficient amount of power, it is needless to say that the image information displayed in the HMD 100 can be controlled based on the motion detected by either one of the devices according to the selection of the user.

According to the above description, it has been mentioned that the controller 180 of the HMD 100 according to the embodiment of the present invention may change the currently set controller based on the specific image information displayed on the display unit 151. FIG. 5 is a flowchart illustrating an operation of changing the device for controlling the image information displayed in the HMD according to a graphic object displayed on the display unit, in the HMD related to the present invention. In the following description, it is assumed that the HMD 100 is set as the controller for convenience of explanation. In this case, the controller 180 detects the motion of the HMD 100, for example, a rotational motion state or a linear motion state, and outputs a virtual space image of a specific direction corresponding to the sensed result on the display unit 151.

Referring to FIG. 5, the control unit 180 of the HMD 100 according to the embodiment of the present invention may output image information controlled according to the motion detection result of the HMD 100 (S500). Accordingly, the control unit 180 can output the virtual space image in the direction corresponding to the movement of the HMD 100 on the display unit 151.

In this state, the controller 180 can sense the direction in which the user's gaze is directed. For example, the control unit 180 tracks the position of the user's pupil using the sensed value of the eye tracking sensor 142, and can recognize one area on the display unit 151 that the user is looking at according to the tracked pupil position. For example, if the user views the one area on the display unit 151 for more than a preset time, the control unit 180 can determine that the user is gazing at the one area.
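The dwell-time criterion described above, in which the user is judged to be looking at an area only after gazing at it for more than a preset time, may be sketched as follows. The sample format, region names, and the one-second threshold are assumptions of the sketch.

```python
# Hypothetical dwell-time check for the eye-tracking step.
DWELL_SECONDS = 1.0  # assumed preset time

def gazed_region(samples, dwell=DWELL_SECONDS):
    """samples: list of (timestamp, region) pairs ordered by time.

    Returns the region once the gaze has stayed on it for `dwell` seconds,
    or None if no region was watched long enough.
    """
    start, current = None, None
    for t, region in samples:
        if region != current:
            start, current = t, region   # gaze moved; restart the dwell timer
        elif t - start >= dwell:
            return current               # dwelt long enough on this region
    return None

samples = [(0.0, "tv"), (0.4, "tv"), (1.1, "tv")]
```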

In this case, the control unit 180 may detect whether a specific graphic object is displayed in the area on the display unit 151 that the user is looking at (S504). If it is detected in step S504 that the user is gazing at the area where the specific graphic object is displayed, information related to the predetermined specific graphic object may be displayed on the display unit 151 in step S506.

Here, if the information related to the predetermined graphic object is displayed, the controller 180 may determine that the preset status of step S304 has occurred. Accordingly, the currently set controller can be changed from the HMD 100 to the mobile terminal 200.

Accordingly, the control unit 180 may cause the display unit 151 to output image information controlled according to the result of detecting the motion of the mobile terminal 200 (S506). For example, if the image information is generated and output by the HMD 100, the controller 180 receives the result of detecting the movement of the mobile terminal 200 from the mobile terminal 200 in step S506, generates image information according to the received motion detection result, and outputs the generated image information. On the other hand, if the image information is generated in the mobile terminal 200, the controller 180 controls the mobile terminal 200 to generate image information according to the result of detecting the motion of the mobile terminal 200 in step S506, and can receive the generated image information from the mobile terminal 200 and output it.

In this state, the controller 180 can check whether the display of the information related to the specific graphic object is finished (S510). For example, when the user gazes at a display area other than the area where the information is displayed, or when a predetermined touch command or touch gesture of the user is applied to the touch screen of the mobile terminal 200, the display of the information related to the specific graphic object can be terminated. When the display of the information related to the specific graphic object is terminated, the controller 180 may change the currently set controller from the mobile terminal 200 back to the HMD 100. In this case, image information according to the movement of the head of the user, which is sensed through the HMD 100, may be output on the display unit 151.

In the above description, it is assumed that the currently set controller is changed when a touch gesture or touch input according to the user's selection is detected, when specific information is displayed, or when the remaining amount of power is insufficient. However, the present invention is not limited thereto.

For example, the predetermined state may be a state in which a specific event occurs in the mobile terminal 200. In this case, the controller 280 of the mobile terminal 200 may transmit information related to the event occurring in the mobile terminal 200 to the HMD 100. In this case, the controller 180 of the HMD 100 may display notification information for notifying the event generated in the mobile terminal 200 on the display unit 151 of the HMD 100. In this case, the notification information may include information related to the generated event, and the state in which the notification information is displayed may be a predetermined state in which the controller is changed.

Accordingly, the mobile terminal 200 can be set as the device for controlling the image information displayed on the display unit 151. In this state, the control unit 180 of the HMD 100 may receive, from the mobile terminal 200, information on a specific event according to the selection of the user. Here, the selection of the user may be applied through the touch screen of the mobile terminal 200.

The control unit 180 of the HMD 100 may display the information received from the mobile terminal 200 on the display unit 151. Accordingly, information on the event that has occurred in the mobile terminal 200 can be displayed on the display unit 151 of the HMD 100. In this case, the user's touch input sensed through the touch screen of the mobile terminal 200 may be applied to a corresponding area of the image information displayed on the display unit 151 of the HMD 100, that is, the event-related information received from the mobile terminal 200.

In the above description, the process in which the controller for controlling the image information displayed in the HMD 100 is changed to the HMD 100 or the mobile terminal 200 according to the user's selection or a predetermined specific situation, in the HMD 100 according to the embodiment of the present invention, has been described in detail with reference to a plurality of flowcharts.

In the following description, examples in which the controller for controlling the image information displayed in the HMD 100 is changed in the HMD 100 according to the embodiment of the present invention will be described in more detail with reference to the exemplary drawings.

FIG. 6 is an exemplary diagram illustrating an example in which the HMD is controlled according to the motion of the HMD or the controller device, in the HMD related to the present invention.

Referring to FIG. 6, the first drawing of FIG. 6 shows an example in which the controller for controlling the image information displayed in the HMD 100 is set as the HMD 100. In this case, as shown in the first drawing of FIG. 6, roll, yaw, and pitch values are sensed according to the movement of the head of the wearer wearing the HMD 100, and the image information displayed on the display unit 151 of the HMD 100 can be controlled according to the sensed roll, yaw, and pitch values.

For example, the vertical line of sight of the image 600 in the virtual space displayed on the display unit 151 of the HMD 100 may be changed according to the change of the pitch value. That is, as the pitch value increases, the viewing angle of the user looking at the image 600 of the virtual space may become higher, and accordingly, the image information can be changed so that the ceiling portion of the virtual space image 600 becomes larger, as if the user were looking at the ceiling.

In addition, the left and right viewing angle of the image 600 in the virtual space displayed on the display unit 151 of the HMD 100 may be changed according to the change of the yaw value. That is, as the yaw value changes, the viewing angle of the user looking at the image 600 of the virtual space may be shifted to the left or right side. Accordingly, the image information can be changed so that the left wall or the right wall becomes larger, as if the user were looking at the left or right wall portion of the virtual space image 600.
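The mapping described above, in which changes in the sensed pitch and yaw values shift the vertical and horizontal viewing angles of the virtual space image 600, may be sketched as follows. The clamping range for pitch and the wrap-around convention for yaw are assumptions of the sketch.

```python
# Hypothetical mapping of sensed pitch/yaw changes to the viewing direction.
def update_view(view, d_pitch, d_yaw):
    """view: dict with 'pitch' and 'yaw' in degrees; returns the new view."""
    # Pitch is clamped so the user cannot look past straight up or straight down.
    pitch = max(-90.0, min(90.0, view["pitch"] + d_pitch))
    # Yaw wraps around so turning left or right is continuous.
    yaw = (view["yaw"] + d_yaw) % 360.0
    return {"pitch": pitch, "yaw": yaw}

view = update_view({"pitch": 0.0, "yaw": 0.0}, d_pitch=30.0, d_yaw=-45.0)
```

The same function can serve whichever device is the controller: the deltas come either from the head movement of the HMD wearer or from tilting the mobile terminal.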

Meanwhile, in this state, when a preset situation occurs, the controller for controlling the image information displayed in the HMD 100 may be changed to the mobile terminal 200. In this case, the control unit 180 of the HMD 100 may change the image information displayed on the display unit 151 based on the movement of the mobile terminal 200. As shown in the second drawing of FIG. 6, when the user rotates the mobile terminal 200 forward or backward in the longitudinal direction 650, the vertical line of sight of the image 600 of the virtual space displayed on the display unit 151 of the HMD 100 can be changed. That is, as the user tilts the mobile terminal 200 forward or backward, the gaze angle of the user looking at the image 600 in the virtual space may become higher or lower, and the image information may be changed so that the ceiling or the floor becomes larger.

When the user rotates the mobile terminal 200 left or right in the lateral direction 660, the left and right viewing angles of the image 600 in the virtual space displayed on the display unit 151 of the HMD 100 can be changed. That is, as the angle at which the user rotates the mobile terminal to the left or right becomes greater, the viewing angle of the user looking at the image 600 of the virtual space may be shifted further to the left or right side, and the image information can be changed so that the left wall or the right wall becomes larger, as if the user were looking at the left or right wall portion of the image 600.

FIGS. 7A and 7B are diagrams illustrating examples in which a user input for changing the device controlling the HMD 100 is detected, in the HMD 100 related to the present invention.

FIG. 7A shows examples in which a user applies an input for changing the controller for controlling the image information displayed in the HMD 100 through the mobile terminal 200. For example, as shown in FIG. 7A, there may be a case where the user applies a predetermined touch input to the touch screen of the mobile terminal 200, or takes a predetermined gesture while holding the mobile terminal 200.

(a) of FIG. 7A shows an example in which, in a state where the user views contents through the HMD 100, the user applies a preset touch input on the touch screen 251 of the mobile terminal 200 connected to the HMD 100 wirelessly or by wire.

For example, when the user is watching the content through the HMD 100, the mobile terminal 200 may be in the doze mode state described above. In this case, as shown in (a) of FIG. 7A, only the light emitting element for outputting a screen on the touch screen 251 is turned off, while the touch sensor, the acceleration sensor, the gyroscope sensor, and the like may maintain the on state. Accordingly, although no image information is displayed, the mobile terminal 200 may be in a state capable of sensing an applied touch input or detecting the movement of the mobile terminal 200.

As shown in (a) of FIG. 7A, when the touch input is applied, the control unit 280 of the mobile terminal 200 can sense the touch input and notify the control unit 180 of the HMD 100 of the touch input. The controller 180 of the HMD 100 may sense the touch input as a user input for changing the controller for controlling the currently displayed image information. Accordingly, the control unit 180 may set a device other than the currently set controller as the device that controls the image information displayed in the HMD 100. Alternatively, when the touch input as shown in (a) of FIG. 7A is applied, the controller 180 of the HMD 100 may detect it as an input of the user for setting the controller to the mobile terminal 200.

On the other hand, the touch input shown in (a) of FIG. 7A may be a plurality of touch inputs forming a predetermined touch pattern. The touch pattern may be set to correspond to a specific device. Accordingly, when a plurality of touch inputs sensed through the touch screen 251 form a predetermined pattern, the controller 180 of the HMD 100 may also set the corresponding device as the device to control the image information displayed in the HMD 100.

Meanwhile, when the mobile terminal 200 is in the doze mode, the controller 280 of the mobile terminal 200 can sense the movement of the mobile terminal 200. Accordingly, as shown in (b) and (c) of FIG. 7A, when the user rotates the mobile terminal 200 or shakes the mobile terminal 200 up and down while holding it, the controller 280 can detect the position shift caused by such a gesture. The control unit 280 of the mobile terminal 200 may then notify the controller 180 of the HMD 100 of the sensed gesture.

Then, the control unit 180 of the HMD 100 may sense the user's gesture as an input of the user for changing the controller for controlling the currently displayed image information. Accordingly, the control unit 180 may set a device other than the currently set controller as the device that controls the image information displayed in the HMD 100.

Meanwhile, FIG. 7B shows examples of detecting the head gesture of the user from the HMD 100, rather than from the mobile terminal 200. For example, the gesture of the user may be a gesture in which the user shakes his or her head left and right or back and forth, as shown in (a), (b), or (c) of FIG. 7B. If the user's gesture is detected more than a preset number of times or for more than a preset time, the controller 180 can detect that the user's gesture is an input for changing the currently set controller.

Therefore, if a gesture similar to that shown in FIG. 7B is repeated for more than a preset time or more than a preset number of times, the controller 180 may set the device other than the currently set controller as the device for controlling the image information displayed in the HMD 100. Alternatively, the controller 180 of the HMD 100 may detect the gesture as an input of the user for setting the controller to the HMD 100.
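The criterion described above, in which a head-shake gesture changes the controller only when it is repeated more than a preset number of times within a preset time, may be sketched as follows. The required count and the time window are assumed values for the sketch.

```python
# Hypothetical detection of a repeated head-shake as a controller-change input.
REQUIRED_SHAKES = 3   # assumed preset number of times
WINDOW_SECONDS = 2.0  # assumed preset time window

def is_controller_change_gesture(shake_times):
    """shake_times: sorted timestamps of detected left/right head shakes.

    True when REQUIRED_SHAKES shakes fall within WINDOW_SECONDS of each other.
    """
    for i in range(len(shake_times) - REQUIRED_SHAKES + 1):
        if shake_times[i + REQUIRED_SHAKES - 1] - shake_times[i] <= WINDOW_SECONDS:
            return True
    return False
```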

Meanwhile, when the HMD 100 moves to a state in which the front surface of the HMD 100 faces a specific surface (e.g., the front surface or the rear surface) of the mobile terminal 200, the controller 180 of the HMD 100 can also detect that the currently set controller is to be changed. For example, the control unit 180 of the HMD 100 may detect, using a camera provided in the HMD 100, that the specific surface of the mobile terminal 200 faces the HMD 100 within a predetermined distance. Alternatively, the control unit 280 of the mobile terminal 200 may detect, from a camera provided on the front surface (the surface on which the touch screen 251 is formed) of the mobile terminal 200, that the specific surface faces the HMD 100 within a predetermined distance, and may inform the control unit 180 of the HMD 100 of this. Alternatively, the detection may be performed through an infrared sensor, a laser sensor, an optical sensor (or a photosensor), or the like provided in the HMD 100 or the mobile terminal 200.

Meanwhile, when a controller for controlling the image information displayed on the display unit 151 of the HMD 100 is set, the controller 180 of the HMD 100 can display information related to the device currently set as the controller. FIGS. 8A and 8B show examples of screens displayed differently according to the device controlling the image information displayed in the HMD, in the HMD related to the present invention.

Referring to FIG. 8A, FIG. 8A shows an example of a screen displayed when the controller is set as the HMD 100. Here, the case where the controller is set as the HMD 100 means the case where the image information displayed on the display unit 151 of the HMD 100 is controlled according to the movement of the head of the wearer wearing the HMD 100.

In this case, the control unit 180 may display the graphic object 800 including information on the currently set controller on at least a part of the display unit 151. FIG. 8A shows such an example in which the controller is set as the HMD 100.

Meanwhile, the HMD 100 may be set as the basic controller by the user. In this case, when the image information displayed on the display unit 151 is controlled by the motion sensed by the HMD 100 (that is, when the HMD 100 is set as the controller), the control unit 180 may not display a separate indication on the display unit 151 to inform the user of this. Thus, as shown in FIG. 8A (b), image information without any special display can be displayed on the display unit 151.

On the other hand, when the currently set controller is the mobile terminal 200, the control unit 180 may display information for informing the user of this on the display unit 151. For example, as shown in FIG. 8B, the control unit 180 may display the graphic object 850 on the display unit 151 to indicate that the currently set controller is the mobile terminal 200. That is, when the controller is changed to the mobile terminal 200, the controller 180 can display on the display unit 151 the image information controlled according to the motion detected by the mobile terminal 200, together with the graphic object 850.

On the other hand, if the controller is changed to the mobile terminal 200 while the HMD 100 is set as the basic controller, the controller 180 may display on the display unit 151 that the currently set controller is not the device set as the basic controller. In this case, the control unit 180 may display the image information on the display unit 151 in a manner distinguished from the state in which the device set as the basic controller (e.g., the HMD 100) is set as the controller.

For example, as shown in FIG. 8B, when the controller is set as the HMD 100 and the controller is changed to the mobile terminal 200 according to a specific situation or the user's selection, a graphic object 852 in the form of a border may be displayed on the boundary of the area where the image information is displayed on the display unit 151. In this case, the graphic object 852 in the form of a border may indicate to the user that the device currently set as the controller is not the device set as the basic controller.

It has been mentioned that the controller 180 of the HMD 100 according to the embodiment of the present invention can change the device for controlling the image information displayed in the HMD 100 not only according to the user's selection but also when information related to a specific graphic object is displayed on the display unit 151. FIG. 9 shows an example of such a case. In the following description, it is assumed that the HMD 100 is set as the controller for controlling the displayed image information for convenience of explanation.

Referring to FIG. 9, the control unit 180 of the HMD 100 according to the embodiment of the present invention can detect the movement of the head of the user by sensors provided in the HMD 100, that is, the acceleration sensor 141 and/or the gyro sensor 143, and can control the image information displayed on the display unit 151 according to the sensed head movement.

In addition, the control unit 180 may track the position of the user's eyes through the eye tracking sensor 142 and recognize the specific area on the display unit 151 at which the user is gazing. Accordingly, as shown in the first diagram of FIG. 9, when the user gazes at the area where the TV 900 is displayed among the virtual objects in the virtual space shown on the display unit 151, the control unit 180 can recognize this. For example, when the user gazes at the area on the display unit 151 where the TV 900 is displayed for more than a preset time, the control unit 180 can recognize that the user is gazing at the TV 900.
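
The dwell-time gaze recognition described above can be sketched as follows. This is a hypothetical illustration: the function name, the sampling interval, and the 1.5-second dwell time are assumed values, not values from the patent.

```python
# Hypothetical sketch of dwell-time gaze selection: an object counts as
# gazed-at only after the eye-tracking samples stay on it continuously
# for a preset time, as described for the TV 900 region.

DWELL_TIME = 1.5  # seconds of continuous gaze required; assumed value

def select_by_gaze(samples, sample_dt=0.1, dwell_time=DWELL_TIME):
    """samples: sequence of object ids (or None) hit by the gaze ray,
    taken every sample_dt seconds. Returns the first object gazed at
    continuously for at least dwell_time, or None."""
    needed = round(dwell_time / sample_dt)
    current, run = None, 0
    for target in samples:
        if target is not None and target == current:
            run += 1
        else:
            current, run = target, 1  # gaze moved: restart the dwell timer
        if current is not None and run >= needed:
            return current
    return None
```

Any interruption (a None sample or a different object) resets the dwell timer, which prevents a brief glance from triggering a selection.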

In this case, the control unit 180 may determine that the TV 900, among the virtual objects, has been selected by the user. The control unit 180 may then display, on the display unit 151, information related to the functions of the selected virtual object, that is, the TV 900. Accordingly, as shown in the second diagram of FIG. 9, the control unit 180 can display different graphic objects 920, 922, and 924, each corresponding to a function associated with the TV 900, on the display unit 151.

When the information 920, 922, and 924 related to a specific graphic object (the TV 900) is displayed on the display unit 151 as described above, the control unit 180 may determine that a situation requiring a change of the currently set controller has occurred. This is because, as shown in the second diagram of FIG. 9, a plurality of pieces of information are displayed, so the user may require more precise control to select one of them.

Accordingly, the control unit 180 can change the device currently set as the controller to another device. That is, if the currently set controller is the HMD 100, as assumed above, the control unit 180 can change the controller to the mobile terminal 200. When the controller is changed in this way, the control unit 180 may display information notifying the user of the change on the display unit 151. For example, as shown in the third diagram of FIG. 9, the control unit 180 may display on the display unit 151 a graphic object 930 including information about the currently set controller and the controller to which control is changed.
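
The hand-over logic above can be sketched as a small state holder that switches the active controller when a situation needing finer input begins and restores the primary controller when it ends. All names here (ControllerManager, situation_started, etc.) are assumed for illustration only.

```python
# Illustrative sketch (names assumed, not from the patent) of the
# controller hand-over: the HMD stays the controller until a situation
# requiring finer input occurs, then control passes to the mobile
# terminal and returns to the primary controller when the situation ends.

class ControllerManager:
    def __init__(self, primary="HMD"):
        self.primary = primary        # device set as the primary controller
        self.active = primary         # device currently controlling the image
        self.notifications = []       # messages shown on the display unit

    def situation_started(self):
        """E.g. a menu of graphic objects appeared (second diagram of FIG. 9)."""
        if self.active == self.primary:
            self._switch("mobile")

    def situation_ended(self):
        if self.active != self.primary:
            self._switch(self.primary)

    def _switch(self, device):
        # Record a notification before handing control over.
        self.notifications.append(f"controller: {self.active} -> {device}")
        self.active = device
```

A usage pass through FIG. 9 would be: `situation_started()` when the objects 920/922/924 appear, then `situation_ended()` once one of them is chosen.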

As shown in FIG. 9, when the controller is changed to the mobile terminal 200, the control unit 180 can control the display on the display unit 151 according to the motion detected by the mobile terminal 200. Accordingly, the control unit 180 may display the graphic object 922 corresponding to the motion of the mobile terminal 200 in a manner distinguishable from the other graphic objects 920 and 924 displayed on the display unit 151. That is, as shown in the fourth diagram of FIG. 9, the control unit 180 may display a border-shaped graphic object 950 around the graphic object 922 corresponding to the motion of the mobile terminal 200, thereby indicating that the graphic object 922 has been selected by the user.

As described above, the control unit 180 of the HMD 100 according to an embodiment of the present invention can determine whether to change the controller according to the amount of power remaining in each device. FIGS. 10A and 10B illustrate examples in which the device controlling the image information displayed on the HMD 100 according to the present invention is changed according to the power state of the device.

Referring to the first diagram of FIG. 10A, when the currently set controller is changed from the HMD 100 to the mobile terminal 200, notification information 1000 informing of the change may be displayed on the display unit 151. In this case, the control unit 180 may check the amount of power remaining in the device to be set as the controller, that is, the mobile terminal 200.

If the check shows that the amount of power remaining in the mobile terminal 200 is less than a predetermined level, the control unit 180 may display information 1010 informing the user of this. When the amount of power of the device to be set as the controller is insufficient, the control unit 180 may not change the currently set controller.
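
The battery guard described above can be sketched as a simple pre-switch check. This is a hedged illustration: the 20% threshold, function name, and return shape are assumptions, not values from the patent.

```python
# Sketch of the power check before a controller change: below a preset
# level the change is refused and the user is warned instead. The 20%
# threshold is an assumed value, not one stated in the patent.

LOW_POWER_LEVEL = 0.20

def try_change_controller(current, target, battery_levels,
                          threshold=LOW_POWER_LEVEL):
    """Return (new_controller, warning). battery_levels maps a device
    name to its remaining charge as a fraction in [0.0, 1.0]."""
    if battery_levels.get(target, 0.0) < threshold:
        # Keep the current controller and report why (cf. information 1010).
        return current, f"{target} battery too low to take control"
    return target, None
```

The same check, applied periodically to the active device, covers the FIG. 10B case where the HMD's own power runs low during content viewing.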

Meanwhile, the controller can be changed according to the amount of power not only when a controller change is requested as described above, but also while the user is viewing content. That is, the control unit 180 can measure the amount of power remaining in the HMD 100 while the user is viewing content. If the measured amount of power is less than a predetermined level, the control unit 180 may display, on the display unit 151, notification information 1050 indicating that the current amount of power of the HMD 100 is insufficient, as shown in the first diagram of FIG. 10B.

In this case, the control unit 180 can change the device currently set as the controller. That is, as shown in the first diagram of FIG. 10B, if the device currently set as the controller is the HMD 100, the control unit 180 can change the controller to the mobile terminal 200. When the controller is changed in this way, the control unit 180 may display information 1060 notifying the user of the change on the display unit 151, as shown in the second diagram of FIG. 10B.

FIGS. 10A and 10B illustrate examples in which, according to the remaining amount of power of a device, the device currently set as the controller is maintained as the controller, or the controller is changed even though no specific situation has occurred. However, it goes without saying that the device set as the controller may instead be determined according to the user's selection. That is, even if the amount of power of a specific device is insufficient, that device may be set as, or maintained as, the controller by the user's choice.

In the above description, the image information displayed on the display unit 151 is controlled according to the motion of either the HMD 100 or the mobile terminal 200. However, the motions of both the HMD 100 and the mobile terminal 200 may be detected, depending on the currently executed function or the user's selection. In this case, the device that generates the image information, whether the HMD 100 or the mobile terminal 200, may receive the motion detection results from both the HMD 100 and the mobile terminal 200 and generate the image information accordingly. In this case, the functions controlled by the respective motions of the HMD 100 and the mobile terminal 200 with respect to the image information or the content may be interlocked with each other, and image information reflecting the motions of both the HMD 100 and the mobile terminal 200 may be displayed on the display unit 151.
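
One way the two interlocked motion results could be combined is sketched below: the head motion sets the view direction while the terminal motion drives a pointer relative to that view. This is speculative; the patent does not specify the combination, and every name here is an assumption.

```python
# Speculative sketch of interlocked two-device control: the image-
# generating device merges motion results from both the HMD (head
# orientation -> where the user looks) and the mobile terminal (hand
# motion -> a pointer placed inside that view).

def compose_frame_state(hmd_motion, terminal_motion):
    """Each motion is a (yaw, pitch) tuple in degrees. The view follows
    the head; the pointer is the hand motion expressed relative to the
    current view direction."""
    view_yaw, view_pitch = hmd_motion
    hand_yaw, hand_pitch = terminal_motion
    return {
        "view": (view_yaw, view_pitch),
        "pointer": (hand_yaw - view_yaw, hand_pitch - view_pitch),
    }

state = compose_frame_state((60.0, 0.0), (65.0, -10.0))
# pointer is offset (5.0, -10.0) from the view centre
```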

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal. The foregoing detailed description should therefore not be construed as limiting in any respect and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

Claims (15)

1. An HMD (Head Mounted Display) connected to a mobile terminal, the HMD comprising:
A communication unit configured to perform wired or wireless communication with the mobile terminal;
A display unit configured to output image information;
A sensing unit configured to sense movement of the HMD; and
A controller configured to control the display unit to output image information controlled according to a result of sensing the movement of the HMD,
Wherein the controller controls the display unit to output image information controlled according to a motion sensed by the mobile terminal when any one of predetermined situations occurs, and controls the display unit to output image information controlled according to the motion of the HMD when the generated situation ends.
2. The apparatus of claim 1,
Wherein the controller detects that the predetermined situation occurs when a preset touch input of the user is detected on the touch screen of the mobile terminal or a specific touch input gesture is detected through the mobile terminal.
3. The apparatus of claim 2,
Wherein the controller detects that the generated situation is terminated when a predetermined function executed according to the preset touch input of the user or the specific touch input gesture ends, or when the preset touch input of the user or the specific touch input gesture is detected again.
4. The apparatus of claim 2,
Wherein the controller detects that the predetermined situation occurs when a predetermined head movement of the user is detected through the HMD,
And detects that the generated situation is terminated when the predetermined head movement is detected again.
5. The apparatus of claim 2,
Wherein the mobile terminal operates in a Doze mode when connected to the HMD,
And in the Doze mode, at least one of a touch input on the touch screen of the mobile terminal and a motion of the mobile terminal is detected in a state in which a light emitting element of the touch screen of the mobile terminal is off.
6. The apparatus of claim 1,
Wherein the controller detects that the predetermined situation occurs when specific image information is displayed on the display unit or when the remaining amount of power of the HMD is less than a preset level,
And detects that the generated situation is terminated when the display of the specific image information ends or when the remaining amount of power of the HMD is equal to or higher than the preset level.
7. The apparatus of claim 6,
Wherein the specific image information corresponds to a specific graphic object,
And wherein the controller displays the specific image information corresponding to the specific graphic object on the display unit when the user gazes at an area on the display unit on which the specific graphic object is displayed for a predetermined time or more.
8. The apparatus of claim 7,
Wherein the display of the specific image information is terminated when the user gazes at an area other than the area on the display unit on which the specific graphic object is displayed for a predetermined time or more.
9. The apparatus of claim 1,
Wherein a specific device is set in advance to correspond to each of at least some of the predetermined situations,
And wherein the controller, when a situation to which a specific device is set in advance to correspond occurs, sets the device corresponding to the generated situation as the controller for controlling the image information output on the display unit.
10. The apparatus of claim 1,
When the device controlling the image information output on the display unit is changed to another device according to the occurrence or termination of the preset specific situation, .
11. The apparatus of claim 1,
Wherein the image information is generated by image processing and rendering performed by either the HMD or the mobile terminal according to a result of detecting the motion of the device controlling the image information output on the display unit.
12. The apparatus of claim 11,
Wherein, when the image processing and rendering are performed in the mobile terminal,
The controller receives, from the mobile terminal, image information generated by the mobile terminal according to either the detection result of the sensing unit or the motion detection result of the mobile terminal, depending on which device controls the output image information, and outputs the received image information on the display unit.
13. The apparatus of claim 11,
Wherein, when the image processing and rendering are performed in the HMD,
The controller generates image information controlled according to either the detection result of the sensing unit or the motion detection result of the mobile terminal, depending on which device controls the output image information, and outputs the generated image information on the display unit.
14. The apparatus of claim 1,
Wherein, when the device controlling the image information displayed on the display unit is changed to another device, notification information notifying the user of the change is displayed on the display unit.
15. A method of controlling a head mounted display (HMD) connected to a mobile terminal, the method comprising:
Outputting image information related to the selected content to a display unit provided in the HMD;
Sensing movement of the user's head through a sensor provided in the HMD;
Controlling image information displayed on the display unit according to the detected motion;
Detecting occurrence of a predetermined situation;
Detecting movement of the mobile terminal based on the generated specific situation;
Controlling image information displayed on the display unit according to the detected motion of the mobile terminal; And
And controlling image information displayed on the display unit based on a motion sensed by the HMD when an end of the predetermined condition is sensed.
KR1020150158307A 2015-11-11 2015-11-11 Tethering type head mounted display and method for controlling the same KR20170055296A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020150158307A KR20170055296A (en) 2015-11-11 2015-11-11 Tethering type head mounted display and method for controlling the same
US15/773,230 US20180321493A1 (en) 2015-11-11 2015-12-09 Hmd and method for controlling same
PCT/KR2015/013413 WO2017082457A1 (en) 2015-11-11 2015-12-09 Hmd and method for controlling same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150158307A KR20170055296A (en) 2015-11-11 2015-11-11 Tethering type head mounted display and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20170055296A true KR20170055296A (en) 2017-05-19

Family

ID=59049421

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150158307A KR20170055296A (en) 2015-11-11 2015-11-11 Tethering type head mounted display and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20170055296A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200012265A (en) * 2018-07-26 2020-02-05 (주)소프트젠 Multiple users audiovisual education system using vr image and method thereof
KR20200038845A (en) * 2018-10-04 2020-04-14 삼성전자주식회사 Electronic device and method for providing virtual device via at least portion of content
WO2021261829A1 (en) * 2020-06-22 2021-12-30 삼성전자 주식회사 Brightness adjustment method and hmd device
US11455028B2 (en) 2019-06-03 2022-09-27 Samsung Electronics Co., Ltd. Method for processing data and electronic device for supporting same


Similar Documents

Publication Publication Date Title
US20180321493A1 (en) Hmd and method for controlling same
KR20170051013A (en) Tethering type head mounted display and method for controlling the same
KR20170048069A (en) System and method for controlling the same
KR101735484B1 (en) Head mounted display
KR20170037466A (en) Mobile terminal and method of controlling the same
KR20180099182A (en) A system including head mounted display and method for controlling the same
KR20170021159A (en) Mobile terminal and method for controlling the same
US9939642B2 (en) Glass type terminal and control method thereof
KR20160133185A (en) Mobile terminal and method for controlling the same
KR20180028211A (en) Head mounted display and method for controlling the same
KR20170058758A (en) Tethering type head mounted display and method for controlling the same
KR20170130952A (en) Mobile terminal and method for controlling the same
KR20170059760A (en) Mobile terminal and method for controlling the same
KR20160026532A (en) Mobile terminal and method for controlling the same
KR20170055296A (en) Tethering type head mounted display and method for controlling the same
US11025767B2 (en) Mobile terminal and control method therefor
KR20170115863A (en) Mobile terminal and method for controlling the same
KR20180103866A (en) Mobile terminal and control method thereof
KR20170058756A (en) Tethering type head mounted display and method for controlling the same
KR20160016397A (en) Mobile terminal and method for controlling the same
KR20160001229A (en) Mobile terminal and method for controlling the same
KR20170082036A (en) Mobile terminal
KR20170073985A (en) Mobile terminal and method for controlling the same
KR20170008498A (en) Electronic device and control method thereof
KR20140147057A (en) Wearable glass-type device and method of controlling the device