KR20170073273A - Head-Mounted Display Apparatus, HMD - Google Patents

Head-Mounted Display Apparatus, HMD Download PDF

Info

Publication number
KR20170073273A
Authority
KR
South Korea
Prior art keywords
mobile terminal
headset
state
hmd
head
Prior art date
Application number
KR1020150181983A
Other languages
Korean (ko)
Inventor
임상현
함지혜
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020150181983A priority Critical patent/KR20170073273A/en
Publication of KR20170073273A publication Critical patent/KR20170073273A/en

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C19/00Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P15/00Measuring acceleration; Measuring deceleration; Measuring shock, i.e. sudden change of acceleration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06Systems determining the position data of a target
    • G01S15/08Systems for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V9/00Prospecting or detecting by methods not provided for in groups G01V1/00 - G01V8/00
    • G02B27/225
    • G06K9/00006
    • H04N13/044

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optics & Photonics (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to a head-mounted display apparatus (HMD) composed of a mobile terminal and a headset. The mobile terminal includes a controller and a display unit, and the controller classifies the coupling state of the mobile terminal and the headset into a fully separated state, a partially coupled state, and a fully coupled state, and displays a screen corresponding to the coupling state on the display unit.

Description

[0001] The present invention relates to a head-mounted display apparatus (HMD).

More particularly, the present invention relates to an HMD that uses a mobile terminal and is capable of displaying information more conveniently during use.

A head-mounted display apparatus (HMD) is a wearable device that can be worn on the head like a pair of glasses to receive various information. As digital devices become lighter and smaller, various wearable devices are being developed, and HMDs are coming into wide use. Beyond a simple display function, an HMD can be combined with augmented reality technology, N-screen technology, and the like to provide various conveniences to the user.

In particular, an HMD can provide a surround image to give the user a more realistic and immersive virtual space. Here, the surround image represents visual information that spreads in all directions around the HMD. Accordingly, the HMD can detect the direction of the face of the user wearing the HMD and display the portion of the surround image corresponding to that direction.

However, in an HMD that uses a mobile terminal, the information displayed on the display unit of the mobile terminal needs to change as the mobile terminal is attached to or detached from the headset.

The present invention is directed to solving the above-mentioned problems and other problems. An object of the present invention is to provide a head-mounted display apparatus (HMD) that, in an HMD composed of a mobile terminal and a headset, displays information on the display of the mobile terminal more efficiently.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a head-mounted display apparatus (HMD) including a mobile terminal and a headset. The mobile terminal includes a controller, a display unit, and at least one of a proximity sensor and a near field communication (NFC) module that can confirm the coupling state between the mobile terminal and the headset. The controller classifies the coupling state of the mobile terminal and the headset into a fully separated state, a partially coupled state, and a fully coupled state, and displays a screen corresponding to the coupling state on the display unit.
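
As a minimal illustration of this three-stage classification (a sketch only, not part of the patent text; the sensor readings, decision rule, and screen names below are assumptions), a controller could map proximity-sensor and NFC readings to a coupling state and pick a screen accordingly:

    // Hypothetical sketch of the three-stage coupling classification described above;
    // the sensor readings and the decision rule are illustrative assumptions.
    enum class CouplingState { FULLY_SEPARATED, PARTIALLY_COUPLED, FULLY_COUPLED }

    class CouplingMonitor(
        private val proximityCovered: () -> Boolean, // assumed proximity-sensor reading
        private val nfcTagInRange: () -> Boolean     // assumed NFC detection of the headset
    ) {
        fun currentState(): CouplingState = when {
            proximityCovered() && nfcTagInRange() -> CouplingState.FULLY_COUPLED
            proximityCovered() || nfcTagInRange() -> CouplingState.PARTIALLY_COUPLED
            else -> CouplingState.FULLY_SEPARATED
        }
    }

    fun screenFor(state: CouplingState): String = when (state) {
        CouplingState.FULLY_SEPARATED -> "normal mobile screen"
        CouplingState.PARTIALLY_COUPLED -> "control window on the exposed area"
        CouplingState.FULLY_COUPLED -> "left/right-eye VR screen"
    }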

The details of other embodiments are included in the detailed description and drawings.

Effects of the mobile terminal and the control method according to the present invention will be described as follows.

According to at least one of the embodiments of the present invention, there is an advantage of providing a head-mounted display apparatus (HMD) for efficiently displaying information on the display unit of a mobile terminal in an HMD using a mobile terminal and a headset.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 2A to 2C are conceptual diagrams of an HMD using a mobile terminal and a headset.
FIGS. 3A to 3D are conceptual diagrams for explaining the operation of the HMD.
FIGS. 4A to 4E are conceptual diagrams for explaining the operation of another HMD.
FIGS. 5A to 5D are conceptual diagrams illustrating a method of displaying information on the mobile terminal when the headset is attached to or detached from the mobile terminal.
FIGS. 6A and 6B are conceptual diagrams for explaining the operation of the HMD.
FIGS. 7A to 7C are conceptual diagrams for explaining the operation of the HMD.
FIGS. 8A and 8B are conceptual diagrams for explaining the operation of the HMD.
FIGS. 9A to 9C are conceptual diagrams for explaining the operation of the HMD.
FIGS. 10A to 10D are conceptual diagrams for explaining the operation of the HMD.
FIGS. 11A and 11B are conceptual diagrams for explaining the operation of the HMD.
FIGS. 12A to 12C are conceptual diagrams for explaining the operation of the HMD.
FIGS. 13A and 13B are conceptual diagrams for explaining the operation of the HMD.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof is omitted. The suffixes "module" and "unit" used for components in the following description are given or used interchangeably only for ease of drafting the specification and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known art are omitted when it is determined that they would obscure the gist of the embodiments disclosed herein. The foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed, and the invention should be understood to include all modifications, equivalents, and alternatives falling within its spirit and scope.

The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head-mounted display (HMD).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage, except where a configuration is applicable only to a mobile terminal.

Hereinafter, the components listed above will be described in more detail with reference to FIG. 1 before explaining various embodiments implemented through the mobile terminal 100 as described above.

First, referring to the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technology standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).

The wireless signals may include a voice call signal, a video call signal, or various types of data according to transmission and reception of text/multimedia messages.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A). The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

Considering that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like is performed through a mobile communication network, the wireless Internet module 113 performing wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 112.

The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies. The short-range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which the other mobile terminal 100 (or an external server) is located. The short-range wireless networks may be wireless personal area networks.

Here, the other mobile terminal 100 may be a wearable device (e.g., a smartwatch, smart glasses, or an HMD) capable of exchanging data with the mobile terminal 100 according to the present invention. The short-range communication module 114 can detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Further, if the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the controller 180 may transmit at least part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Therefore, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can conduct the call through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.

The position information module 115 is a module for obtaining the position (or current position) of the mobile terminal, and representative examples thereof include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire its position using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire its position based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. Optionally, the position information module 115 may additionally or alternatively perform any of the other functions of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. The position information module 115 is a module used to obtain the position (or current position) of the mobile terminal, and is not limited to a module that directly calculates or obtains the position of the mobile terminal.

Next, the input unit 120 is for inputting image information (or signals), audio information (or signals), data, or information input from a user. For inputting image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. The plurality of cameras 121 provided in the mobile terminal 100 may be arranged in a matrix structure, and through the cameras 121 having the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for realizing a stereoscopic image.

The microphone 122 processes an external acoustic signal into electrical voice data. The processed voice data can be utilized in various ways according to the function (or the application program) being executed in the mobile terminal 100. Meanwhile, various noise reduction algorithms may be implemented in the microphone 122 to eliminate noise generated in the course of receiving the external acoustic signal.

The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the controller 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include mechanical input means (or mechanical keys, for example, buttons located on the front, rear, or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like) and touch-type input means. As an example, the touch-type input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or the visual key can be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal, information on the surrounding environment of the mobile terminal, and user information, and generates a corresponding sensing signal. Based on the sensing signal, the controller 180 may control the driving or operation of the mobile terminal 100, or may perform data processing, functions, or operations related to application programs installed in the mobile terminal 100. Representative sensors among the various sensors that may be included in the sensing unit 140 will now be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects the presence of an object approaching a predetermined detection surface, or the presence of an object in the vicinity of the detection surface, without mechanical contact by using electromagnetic force or infrared rays. The proximity sensor 141 may be disposed in the inner area of the mobile terminal or in proximity to the touch screen, which is covered by the touch screen.

Examples of the proximity sensor 141 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. In the case where the touch screen is electrostatic, the proximity sensor 141 can be configured to detect the proximity of the object with a change of the electric field along the proximity of the object having conductivity. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

On the other hand, for convenience of explanation, the act of bringing an object close to the touch screen without contact so that the object is recognized as being located on the touch screen is referred to as a "proximity touch," and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object vertically corresponds to the touch screen when the object is proximity-touched. The proximity sensor 141 can detect a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Meanwhile, the controller 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 141 as described above, and may further output visual information corresponding to the processed data on the touch screen. Furthermore, the controller 180 can control the mobile terminal 100 such that different operations or data (or information) are processed depending on whether a touch on the same point of the touch screen is a proximity touch or a contact touch.
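
As an illustrative sketch only (the event types and actions below are assumptions, not from the patent), a controller could branch on whether a touch event on the same point is a proximity touch or a contact touch:

    // Hypothetical sketch: process a proximity touch and a contact touch on the same
    // point differently, as described above. Event types and actions are assumptions.
    sealed class TouchEvent {
        data class Proximity(val x: Int, val y: Int, val distanceMm: Float) : TouchEvent()
        data class Contact(val x: Int, val y: Int, val pressure: Float) : TouchEvent()
    }

    fun handleTouch(event: TouchEvent) = when (event) {
        is TouchEvent.Proximity ->
            println("preview item at (${event.x}, ${event.y}), ${event.distanceMm} mm away")
        is TouchEvent.Contact ->
            println("select item at (${event.x}, ${event.y}), pressure ${event.pressure}")
    }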

The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance occurring at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, or may be the controller 180 itself.

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

Meanwhile, the touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of a sensing object using ultrasonic waves. The controller 180 can calculate the position of a wave generation source based on information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave generation source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave generation source can be calculated using the time difference between the arrival of the ultrasonic wave and the arrival of the light, which serves as a reference signal.
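
A minimal sketch of this time-difference calculation (the function names, constant, and example figures are illustrative assumptions; the patent gives no concrete values): since light arrives almost instantly, the ultrasonic time of flight is approximately the measured arrival-time difference, and the distance to the source follows from the speed of sound.

    // Hypothetical sketch: estimate the distance to a wave source from the delay between
    // the (near-instant) light reference and the ultrasonic arrival. Names and the
    // example figures are assumptions, not values from the patent.
    const val SPEED_OF_SOUND_M_PER_S = 343.0 // in air at roughly 20 °C

    fun distanceToSourceMeters(lightArrivalNs: Long, ultrasoundArrivalNs: Long): Double {
        val timeOfFlightSec = (ultrasoundArrivalNs - lightArrivalNs) / 1e9
        return SPEED_OF_SOUND_M_PER_S * timeOfFlightSec
    }

    fun main() {
        // ultrasound arriving ~2.9 ms after the light reference corresponds to ~1 m
        println(distanceToSourceMeters(0L, 2_900_000L))
    }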

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of a sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element and is configured to scan the movement of a sensing object close to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns and scans the content placed on the photo sensor using an electrical signal that changes according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and the position information of the sensing object can be obtained through this calculation.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running on the mobile terminal 100, or User Interface (UI) and Graphic User Interface (GUI) information according to the execution screen information.

Also, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

A three-dimensional display scheme such as a stereoscopic scheme (glasses scheme), an autostereoscopic scheme (glasses-free scheme), or a projection scheme (holographic scheme) may be applied to the stereoscopic display unit.

In general, a 3D stereoscopic image consists of a left image (left-eye image) and a right image (right-eye image). Depending on how the left and right images are combined into a 3D stereoscopic image, the methods include a top-and-bottom method in which the left and right images are arranged above and below within one frame, a side-by-side (left-to-right) method in which the left and right images are arranged left and right within one frame, a checkerboard method in which pieces of the left and right images are arranged in tile form, an interlaced method in which the left and right images are alternately arranged in columns or rows, and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed over time.
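
As a rough sketch of how two of these frame-packing layouts differ (illustrative only; the type and field names are assumptions, and the interleaved and time-sequential layouts are omitted), the left- and right-eye sub-regions of a packed frame could be computed per format:

    // Hypothetical sketch: sub-regions occupied by the left and right images inside one
    // packed frame, for two of the layouts listed above (the other layouts interleave
    // pixels or alternate frames in time and are omitted here).
    data class Rect(val x: Int, val y: Int, val w: Int, val h: Int)

    enum class Packing { TOP_AND_BOTTOM, SIDE_BY_SIDE }

    fun eyeRegions(frameW: Int, frameH: Int, packing: Packing): Pair<Rect, Rect> = when (packing) {
        Packing.TOP_AND_BOTTOM ->
            Rect(0, 0, frameW, frameH / 2) to Rect(0, frameH / 2, frameW, frameH / 2)
        Packing.SIDE_BY_SIDE ->
            Rect(0, 0, frameW / 2, frameH) to Rect(frameW / 2, 0, frameW / 2, frameH)
    }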

In addition, a 3D thumbnail image may be generated by producing a left-image thumbnail and a right-image thumbnail from the left image and the right image of the original image frame, respectively, and combining them into a single image. In general, a thumbnail means a reduced image or a reduced still image. The left-image thumbnail and the right-image thumbnail generated in this way are displayed on the screen with a left-right distance difference corresponding to the depth associated with the parallax between the left image and the right image, thereby producing a stereoscopic sense of space.

The left and right images necessary for realizing the three-dimensional stereoscopic image can be displayed on the stereoscopic display unit by a stereoscopic processing unit. The stereoscopic processing unit receives a 3D image (an image at a reference view and an image at an extended view) and sets a left image and a right image therefrom, or receives a 2D image and converts it into a left image and a right image.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may combine and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 can generate various tactile effects, such as the effect of a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection or suction port, brushing against the skin surface, and the effect of reproducing a cold or warm sensation using an element capable of absorbing or generating heat.

The haptic module 153 can transmit a tactile effect through direct contact, and can also be implemented so that the user can feel a tactile effect through a muscle sense of a finger or an arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The light output unit 154 outputs a signal by causing the mobile terminal to emit light of a single color or multiple colors toward the front or rear surface. The signal output may be terminated when the mobile terminal detects that the user has checked the event.

The interface unit 160 serves as a path for communication with all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, supplies power to each component in the mobile terminal 100, or transmits data in the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip that stores various information for authenticating the right to use the mobile terminal 100 and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 160.

When the mobile terminal 100 is connected to an external cradle, the interface unit 160 may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user from the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 170 may store data related to vibrations and sounds of various patterns output upon touch input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 170 over the Internet.

Meanwhile, as described above, the controller 180 controls operations related to application programs and, typically, the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the controller 180 can set or release a lock state that restricts input of the user's control commands to applications.

In addition, the controller 180 performs control and processing related to voice calls, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the controller 180 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies the power necessary for the operation of the respective components. The power supply unit 190 includes a battery, which may be a built-in rechargeable battery and may be detachably coupled to the terminal body for charging or the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of an interface 160 through which an external charger for supplying power for charging the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery wirelessly without using the connection port. In this case, the power supply unit 190 can receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Referring to FIG. 1, a mobile terminal according to the present invention may perform short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and Wireless Universal Serial Bus (Wireless USB) technologies.

Among these, the NFC module provided in the mobile terminal supports contactless short-range wireless communication between terminals at a distance of about 10 cm. The NFC module may operate in one of a card mode, a reader mode, and a P2P mode. In order for the NFC module to operate in the card mode, the mobile terminal 100 may further include a security module for storing card information. Here, the security module may be a physical medium such as a UICC (e.g., a Subscriber Identification Module (SIM) or a Universal SIM (USIM)), a secure micro SD, or a sticker, or a logical medium embedded in the mobile terminal (e.g., an embedded Secure Element (SE)). Data exchange based on the Single Wire Protocol (SWP) can be performed between the NFC module and the security module.

When the NFC module operates in the card mode, the mobile terminal can transmit stored card information to the outside like a conventional IC card. Specifically, if a mobile terminal storing the card information of a payment card such as a credit card or a bus card is brought close to a fare payment machine, mobile short-range payment can be processed, and if a mobile terminal storing the card information of an access card is brought close to an access approval device, the approval process for access may begin. Cards such as credit cards, transportation cards, and access cards are mounted on the security module in the form of an applet, and the security module can store card information for the mounted cards. Here, the card information of a payment card may be at least one of a card number, a balance, and usage details, and the card information of an access card may include at least one of a name, a number, and the like.

When the NFC module operates in the reader mode, the mobile terminal can read data from an external tag. At this time, the data received by the mobile terminal from the tag may be coded in the NFC Data Exchange Format (NDEF) defined by the NFC Forum. In addition, the NFC Forum defines four record types. Specifically, the NFC Forum defines four Record Type Definitions (RTDs): Smart Poster, Text, Uniform Resource Identifier (URI), and General Control. If the data received from the tag is of the Smart Poster type, the controller executes a browser (e.g., an Internet browser); if the data received from the tag is of the Text type, the controller can execute a text viewer. If the data received from the tag is of the URI type, the controller executes the browser or makes a telephone call, and if the data received from the tag is of the General Control type, the controller can execute an appropriate operation according to the control content.
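
A minimal dispatch sketch for the reader-mode behavior described above (the enum and action strings are illustrative; this is not the patent's implementation or a specific NFC library API):

    // Hypothetical sketch: dispatch an action according to the NDEF record type read
    // from a tag in reader mode. Enum and action strings are illustrative only.
    enum class RecordType { SMART_POSTER, TEXT, URI, GENERAL_CONTROL }

    fun actionFor(type: RecordType, payload: String): String = when (type) {
        RecordType.SMART_POSTER -> "open browser for $payload"
        RecordType.TEXT -> "show $payload in a text viewer"
        RecordType.URI ->
            if (payload.startsWith("tel:")) "place a call to $payload"
            else "open browser for $payload"
        RecordType.GENERAL_CONTROL -> "execute the control action described by $payload"
    }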

When the NFC module is operated in a peer-to-peer (P2P) mode, the mobile terminal can perform P2P communication with another mobile terminal. At this time, LLCP (Logical Link Control Protocol) may be applied to P2P communication. For P2P communication, a connection can be created between the mobile terminal and another mobile terminal. At this time, the generated connection can be divided into a connectionless mode in which one packet is exchanged and terminated, and a connection-oriented mode in which packets are exchanged consecutively. Through P2P communications, data such as business cards, contact information, digital photos, URLs in electronic form, and setup parameters for Bluetooth and Wi-Fi connectivity can be exchanged. However, since the usable distance of NFC communication is short, the P2P mode can be effectively used to exchange small-sized data.

Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIGS. 2A to 2C are conceptual diagrams of an HMD using the mobile terminal 100 and the headset 210.

The HMD may comprise a mobile terminal 100 and a headset 210. The mobile terminal 100 includes a controller 180 and a display unit 151. The mobile terminal 100 may be a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, a wearable device (e.g., a smartwatch, smart glasses, or a head-mounted display (HMD)), or the like.

The headset 210 can accommodate the mobile terminal 100 and may include a strap so that it can be worn on the head and operate as an HMD. The headset 210 may include a proximity sensor, a GPS sensor, a gyro sensor, an acceleration sensor, and the like to measure the state of the user of the HMD. In general, the mobile terminal 100 may also include a proximity sensor, a GPS sensor, a gyro sensor, and an acceleration sensor, so the HMD can measure the state of the user using such sensors provided in the headset 210 and/or the mobile terminal 100.

In addition, the headset 210 includes a sensor capable of measuring its coupling state with the mobile terminal 100. Such a sensor can confirm whether the mobile terminal 100 and the headset 210 are coupled so that they can operate as an HMD. For example, it may include a proximity sensor, an NFC module, or the like that can confirm whether the mobile terminal 100 and the headset 210 are coupled. However, any sensor capable of measuring the state of the headset 210 and the mobile terminal 100 may be used, and various sensors can be used to check the coupling state between the mobile terminal 100 and the headset 210.

The headset 210 may include a left eye lens 211 and a right eye lens 212. The left eye lens 211 and the right eye lens 212 may provide different images to the left eye and right eye, respectively, to obtain a 3D effect.

The headset 210 and the mobile terminal 100 can be coupled by sliding the mobile terminal 100 into the side of the headset 210. The headset 210 has a groove on its side whose size corresponds to the thickness of the mobile terminal 100 so that the mobile terminal 100 can be coupled to it; by inserting the mobile terminal 100 into the groove, the mobile terminal 100 and the headset 210 can be coupled to each other.

Alternatively, the mobile terminal 100 may be coupled to the headset 210 by sliding the mobile terminal 100 in from the upper surface of the headset 210, as shown in FIG. 2B. The headset 210 has a groove on its upper surface whose size corresponds to the thickness of the mobile terminal 100; by sliding the mobile terminal 100 into the groove, the mobile terminal 100 and the headset 210 can be coupled.

As another example, the mobile terminal 100 and the headset 210 may be coupled in a door manner. Referring to FIG. 2C, the headset 210 may include a cover 221, which can be coupled with the mobile terminal 100, and can cover the front portion thereof.

As described above, the HMD can be implemented by combining the mobile terminal 100 and the headset 210, and the combining method can be various.

FIGS. 3A to 3D are conceptual diagrams for explaining the operation of the HMD.

The present invention distinguishes the coupling state of the mobile terminal 100 and the headset 210 in three stages. The mobile terminal 100 and the headset 210 can be coupled in a sliding manner. The mobile terminal 100 and the headset 210 may include a sensor capable of measuring the coupling state so as to distinguish a fully separated state, a partially coupled state, and a fully coupled state.

Referring to FIG. 3A, before the mobile terminal 100 and the headset 210 are coupled, a screen corresponding to the normal state of the mobile terminal 100 is displayed on the display unit 151 of the mobile terminal 100.

Referring to FIG. 3B, while the mobile terminal 100 and the headset 210 are being coupled or separated, various screens can be displayed on the display unit 151 of the mobile terminal 100 according to the specific coupling state of the mobile terminal 100 and the headset 210. A specific method of displaying a screen on the display unit 151 of the mobile terminal 100 will be described later.

Referring to FIGS. 3C and 3D, when the mobile terminal 100 and the headset 210 are fully coupled, the mobile terminal 100 is fixed while the user wears the headset 210, so that a stable screen can be provided to the user. In the state where the mobile terminal 100 and the headset 210 are fully coupled, the screen of the display unit 151 of the mobile terminal 100 is divided into left-eye and right-eye viewing planes 231 and 232. As the screen of the display unit 151 of the mobile terminal 100 is divided left and right, the HMD can provide a VR screen.
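
A minimal sketch of this left/right split (illustrative only; the rendering callback, types, and eye-offset value are assumptions, not the patent's implementation): in the fully coupled state the same scene is drawn twice, once into each half of the display, with a small horizontal eye offset.

    // Hypothetical sketch: in the fully coupled state, draw the same scene twice,
    // once into each half of the display, with a small horizontal eye offset.
    // The callback, types, and the ~32 mm half-IPD value are assumptions.
    data class Viewport(val x: Int, val y: Int, val w: Int, val h: Int)

    fun renderVrFrame(
        displayW: Int,
        displayH: Int,
        drawScene: (viewport: Viewport, eyeOffsetMeters: Float) -> Unit
    ) {
        val half = displayW / 2
        drawScene(Viewport(0, 0, half, displayH), -0.032f)    // left-eye plane (e.g., 231)
        drawScene(Viewport(half, 0, half, displayH), +0.032f) // right-eye plane (e.g., 232)
    }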

FIGS. 4A to 4E are conceptual diagrams for explaining the operation of another HMD.

The present invention likewise distinguishes the coupling state of the mobile terminal 100 and the headset 220 in three stages. The mobile terminal 100 and the headset 220 may be coupled in a door manner. The mobile terminal 100 and the headset 220 may include a sensor capable of measuring the coupling state so as to distinguish a fully separated state, a partially coupled state, and a fully coupled state.

Referring to FIG. 4A, before the mobile terminal 100 and the headset 220 are coupled, a screen corresponding to the normal state of the mobile terminal 100 is displayed on the display unit 151 of the mobile terminal 100.

Referring to FIGS. 4B and 4C, while the mobile terminal 100 and the headset 220 are being coupled or separated, various screens can be displayed on the display unit 151 of the mobile terminal 100 according to the specific coupling state of the mobile terminal 100 and the headset 220. For example, when the mobile terminal 100 is mounted on the cover 221 of the headset 220, a setting window 401 for setting the screen of the HMD can be displayed. A specific method of displaying a screen on the display unit 151 of the mobile terminal 100 will be described later.

The controller 180 of the mobile terminal 100 can display, on the display unit 151, a status message 402 indicating that the headset 220 is coupled to the mobile terminal 100. The status message 402 indicates information about the headset 220. The information about the headset 220 may include the type of the headset 220, the connection state between the headset 220 and the mobile terminal 100, and the like.

Referring to FIGS. 4D and 4E, when the mobile terminal 100 and the headset 220 are fully coupled, the mobile terminal 100 is fixed while the user wears the headset 220, so that a stable screen can be provided to the user. In the state where the mobile terminal 100 and the headset 220 are fully coupled, the screen of the display unit 151 of the mobile terminal 100 is divided into left-eye and right-eye viewing planes 231 and 232. As the screen of the display unit 151 of the mobile terminal 100 is divided left and right, the HMD can provide a VR screen.

FIGS. 5A to 5D are conceptual diagrams illustrating a method of displaying information on the mobile terminal 100 when the headset 210 and the mobile terminal 100 are attached to or detached from each other.

When the mobile terminal 100 and the headset 210 are coupled in a sliding manner, a control window may be displayed on part of the display unit 151 of the mobile terminal 100 according to the coupling state of the mobile terminal 100 and the headset 210.

Referring to FIG. 5A, when a portion of the mobile terminal 100 is inserted into the headset 210, a portion of the display unit 151 of the mobile terminal 100 is exposed to the outside of the headset 210. Accordingly, the controller 180 may display a control window for controlling the HMD in part of the exposed area of the display unit 151.

The sensor units of the mobile terminal 100 and the headset 210 may measure the coupling state of the mobile terminal 100 and the headset 210. For example, the controller 180 may use the proximity sensor to determine how far the mobile terminal 100 is inserted into the headset 210. As another example, the controller 180 can use the NFC sensor to determine how far the mobile terminal 100 is inserted into the headset 210.

The controller 180 displays a general screen on the display unit 151 of the mobile terminal 100 when the mobile terminal 100 is not coupled to the headset 210. When the mobile terminal 100 is inserted into the headset 210, the mobile terminal 100 and the headset 210 may be connected by wireless communication. The controller 180 may then display a screen for controlling the HMD on the display unit 151 according to the coupling state of the mobile terminal 100 and the headset 210.

Referring to FIG. 5A, the mobile terminal 100 may be inserted from the right side of the headset 210. By checking the coupling state of the mobile terminal 100 and the headset 210, the controller 180 can display a control window 511 on the right side of the display unit 151 of the mobile terminal 100. The control window can control the brightness, sound, and wireless communication of the HMD.

According to another embodiment of the present invention, when the mobile terminal 100 becomes partially exposed to the outside of the headset 210 while the display unit 151 of the mobile terminal 100 is operating in the HMD state, the controller 180 detects the changed coupling state of the mobile terminal 100 and the headset 210 and switches the screen displayed on the display unit 151 from the HMD state to a control state. That is, the controller 180 can switch the screen displayed on the display unit 151 according to the coupling state of the mobile terminal 100 and the headset 210. The controller 180 may change the screen displayed on the display unit 151 from the HMD state to the control state when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the partially coupled state. For example, as shown in FIG. 5A, when the mobile terminal 100 is partially exposed at the right side of the headset 210, the controller 180 displays the control window 511 on the right side of the display unit 151.

Referring to FIG. 5B, when the mobile terminal 100 is partially exposed at the left side of the headset 210, a control window 512 may be displayed on the left side of the display unit 151. Referring to FIG. 5C, when the mobile terminal 100 is partially exposed at the upper side of the headset 210, a control window 513 can be displayed on the upper side of the display unit 151. Referring to FIG. 5D, when the mobile terminal 100 is partially exposed at the lower side of the headset 210, a control window 514 can be displayed on the lower side of the display unit 151.

As described above, when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the partially coupled state, or when the mobile terminal 100 and the headset 210 are not fully coupled, a control window can be displayed on the display unit 151. In the state where the mobile terminal 100 and the headset 210 are fully coupled, the controller 180 displays the HMD screen on the display unit 151, and this HMD screen may be the screen that was displayed before the state changed to the partially coupled state. That is, the controller 180 may store information about the screen displayed on the display unit 151 in the memory so that the full screen can continue to be displayed when the fully coupled state is established again.

According to an embodiment of the present invention, the user of the mobile terminal 100 can control the HMD without issuing a separate command to call up the control window when the mobile terminal 100 is coupled to the headset 210. Also, the user of the mobile terminal 100 can activate the control window simply by partially exposing the mobile terminal 100 from the headset 210 in the HMD state. Accordingly, there is an advantage that the state of the HMD can be controlled without completely separating the mobile terminal 100 from the headset 210.

According to another embodiment of the present invention, a different screen may be displayed on part of the display unit 151 according to the direction in which the mobile terminal 100 is exposed in the partially coupled state of the mobile terminal 100 and the headset 210. For example, when the mobile terminal 100 is exposed at the right side of the headset 210, the controller 180 may display a control window for adjusting the brightness of the screen on the right side of the display unit 151; when the mobile terminal 100 is exposed at the left side of the headset 210, a control window for adjusting the volume of the headset 210 may be displayed; and when the mobile terminal 100 is exposed at the upper side of the headset 210, a control window for adjusting the magnitude of vibration may be displayed. That is, the mobile terminal 100 may display different control windows according to the direction in which it is exposed from the headset 210.
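
As an illustrative sketch of this direction-to-control mapping (the enum and window names are assumptions, not the patent's implementation):

    // Hypothetical sketch: choose which control window to show from the side of the
    // headset at which the mobile terminal is exposed. The bottom case is not specified
    // in the text above and is a placeholder.
    enum class ExposedSide { RIGHT, LEFT, TOP, BOTTOM }

    fun controlWindowFor(side: ExposedSide): String = when (side) {
        ExposedSide.RIGHT -> "brightness control window on the right edge"
        ExposedSide.LEFT -> "volume control window on the left edge"
        ExposedSide.TOP -> "vibration control window on the top edge"
        ExposedSide.BOTTOM -> "general control window on the bottom edge"
    }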

According to this embodiment of the present invention, the user of the mobile terminal 100 can see the control window without a command to call up a separate control window, can skip the step of selecting a control object, and can directly call up the desired control object.

According to another embodiment of the present invention, when a different screen is displayed on a partial area of the display unit 151 according to the coupling state of the mobile terminal 100 and the headset 210, the user can choose which screen is displayed. The controller 180 can receive, from the user, the object to be controlled for each direction in which the mobile terminal 100 is exposed from the headset 210. The controller 180 stores the control object in the memory, and the control window corresponding to the coupling state can then be displayed.

FIGS. 6A and 6B are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the partially coupled state while the HMD is playing video content, the display unit 151 may display information associated with the video content. For example, when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the partially coupled state during playback of the video content, as shown in FIG. 6A, a screen 601 associated with the video content can be displayed in the area exposed outside the headset 210.

The screen 601 associated with the video content may include information on other video related to the video content, information on how to reproduce the video content, and the like.

The controller 180 determines the coupling state of the mobile terminal 100 and the headset 210, and as shown in FIG. 6B, may block an image from being displayed in the region of the display unit 151 that is not exposed outside the headset 210. Also, as described above, a different screen may be displayed on part of the display unit 151 according to the direction in which the mobile terminal 100 is exposed in the partially coupled state of the mobile terminal 100 and the headset 210.

The screen displayed on the exposed part of the display unit 151 when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the partially coupled state may differ according to the content that was being displayed in the fully coupled state. For example, when a movie is being played back in the fully coupled state, information about other videos related to the video content can be displayed; when sports content is being played back, detailed information about the sport being played and the progress of other matches related to the match being watched can be displayed; and when game content is being played back, related information, screen captures, and the like can be displayed. That is, the screen displayed on the exposed part of the display unit 151 in the partially coupled state may vary according to the type of content that was displayed in the fully coupled state.
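
A sketch of this content-dependent companion screen (hypothetical types and strings, not from the patent):

    // Hypothetical sketch: pick the companion information shown on the exposed display
    // area according to the content that was playing in the fully coupled state.
    enum class ContentType { MOVIE, SPORTS, GAME }

    fun companionInfoFor(content: ContentType): List<String> = when (content) {
        ContentType.MOVIE -> listOf("related videos", "playback information")
        ContentType.SPORTS -> listOf("details of the current sport", "progress of other matches")
        ContentType.GAME -> listOf("related game information", "screen captures")
    }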

According to another embodiment of the present invention, the display form and display area of the menu may vary depending on the degree to which the display unit 151 is exposed in the partially coupled state of the mobile terminal 100 and the headset 210. The sensors included in the mobile terminal 100 and/or the headset 210 can measure the degree of coupling between the mobile terminal 100 and the headset 210, and based on this, the controller can determine the exposed area of the display unit 151 of the mobile terminal 100 and provide an optimized menu screen corresponding to the exposed area.

Referring to FIGS. 5A to 5D, the sizes of the control windows 511 to 514 may change depending on the degree to which the display unit 151 is exposed in the partially coupled state of the mobile terminal 100 and the headset 210. The arrangement and type of the selection menus displayed in the control windows 511 to 514 may also change depending on the degree to which the display unit 151 is exposed in the partially coupled state.
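
A minimal sketch of sizing the control window from the exposed fraction of the display (the thresholds and menu sets below are illustrative assumptions):

    // Hypothetical sketch: scale the control window and its menu set with the fraction
    // of the display exposed outside the headset. Thresholds and menus are assumptions.
    data class ControlWindow(val widthPx: Int, val menus: List<String>)

    fun controlWindowForExposure(displayWidthPx: Int, exposedFraction: Float): ControlWindow {
        val widthPx = (displayWidthPx * exposedFraction).toInt()
        val menus = when {
            exposedFraction < 0.2f -> listOf("brightness")
            exposedFraction < 0.5f -> listOf("brightness", "volume")
            else -> listOf("brightness", "volume", "wireless communication")
        }
        return ControlWindow(widthPx, menus)
    }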

According to an embodiment of the present invention, there is an advantage that, during playback of video content using the HMD, the user of the mobile terminal 100 can easily check information related to the video content simply by changing the coupling state of the mobile terminal 100 and the headset 210.

FIGS. 7A to 7C are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, when a message is received from the outside while an image is displayed on the HMD, the controller 180 can display, in part of the HMD screen, an indication that the message has been received.

Referring to FIG. 7A, when a message is received while the image 701 is displayed on the HMD, message reception information 702 is displayed in part of the HMD screen. However, only the message reception status may be displayed on the HMD, without displaying the specific contents of the message.

Referring to FIG. 7B, when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the partially coupled state while the message reception information 702 is displayed, the display unit 151 of the mobile terminal 100 may display information associated with the received message. When part of the mobile terminal 100 is exposed outside the headset 210, the message content 703 can be displayed on part of the display unit 151.

Referring to FIG. 7C, when the coupling state of the mobile terminal 100 and the headset 210 changes from the fully coupled state to the fully separated state while the message reception information 702 is being displayed, the message content 704 corresponding to the message reception information 702 can be displayed on the display unit 151 in place of the image that was displayed by the HMD.
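
Taken together, a sketch of this coupling-state-dependent message handling could look like the following (illustrative names only; not the patent's implementation):

    // Hypothetical sketch: how much of a received message to reveal in each coupling
    // state (the same three states as in the sketch following the summary above).
    enum class CouplingState { FULLY_SEPARATED, PARTIALLY_COUPLED, FULLY_COUPLED }

    fun messagePresentationFor(state: CouplingState): String = when (state) {
        CouplingState.FULLY_COUPLED -> "show only a reception indicator (702) on the HMD screen"
        CouplingState.PARTIALLY_COUPLED -> "show the message content (703) on the exposed display area"
        CouplingState.FULLY_SEPARATED -> "show the full message content (704) on the normal screen"
    }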

According to an embodiment of the present invention, the user of the mobile terminal 100 can check whether a message has been received while viewing an image on the HMD, and can check the message content simply by exposing part of the mobile terminal 100 from the headset 210. Therefore, the received message can be checked more easily. In addition, even when the mobile terminal 100 is completely separated from the headset 210, the content of the message can be checked immediately. Accordingly, the message can be checked easily even while the mobile terminal 100 is being used as an HMD.

FIGS. 8A and 8B are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, even when an application related to the HMD is not running on the mobile terminal 100, a home screen 801 corresponding to the HMD is displayed when the mobile terminal 100 is coupled to the headset 210 and worn by the user. The home screen 801 corresponding to the HMD is the initial screen shown when operating as an HMD, and a selection window that the user can choose from is displayed on the home screen 801. The control unit 180 can obtain input information through an input button or by determining the direction in which the user's eyes gaze.

Referring to FIG. 8B, the controller 180 may display images corresponding to the left and right eyes on the left-eye and right-eye display areas 811 and 812 of the mobile terminal 100, respectively, and output them so that the image is displayed in 3D.
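The following Kotlin sketch shows generic side-by-side stereo output: one display is split into left-eye and right-eye viewports that the headset lenses fuse into a 3D image. The viewport math is a common technique assumed here for illustration; it is not code from the patent, and the comments referencing 811 and 812 are only for orientation.

data class Viewport(val x: Int, val y: Int, val width: Int, val height: Int)

fun stereoViewports(displayWidth: Int, displayHeight: Int): Pair<Viewport, Viewport> {
    val half = displayWidth / 2
    val left = Viewport(0, 0, half, displayHeight)       // region seen through the left-eye side (811)
    val right = Viewport(half, 0, half, displayHeight)   // region seen through the right-eye side (812)
    return left to right
}

fun main() {
    val (left, right) = stereoViewports(2560, 1440)
    println("Left eye: $left, right eye: $right")
}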

FIGS. 9A to 9C are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, when the mobile terminal 100 is coupled with the headset 210 while a moving picture is being played back on the mobile terminal 100, the moving picture may continue to be played back on the HMD once the user wears it.

Referring to FIGS. 9A to 9C, when the mobile terminal 100 is coupled with the headset 210 and worn by the user while the moving image 901 is being played back on the display unit 151 of the mobile terminal 100, the moving picture 902 is subsequently played back on the HMD. In this case, the controller 180 outputs images corresponding to the left and right eyes on the left and right inner areas 911 and 912, respectively, so that the screen can be displayed in 3D.

In addition, the control unit 180 may switch to the 3D output of the HMD, and may call up and output both 3D and 2D images.
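One way to realize this seamless continuation is to retain the playback position and switch the rendering mode when the coupled-and-worn condition is detected. The Kotlin sketch below illustrates that idea under stated assumptions; the PlaybackSession class and its methods are hypothetical names, not the patent's implementation.

class PlaybackSession(private var positionMs: Long = 0L) {
    var stereo3d: Boolean = false
        private set

    // Periodically records the current playback position while video 901 plays on the display unit.
    fun onProgress(positionMs: Long) { this.positionMs = positionMs }

    // Called when the terminal is coupled to the headset and worn by the user.
    fun continueOnHmd(): String {
        stereo3d = true
        return "Resuming at $positionMs ms in side-by-side 3D"
    }
}

fun main() {
    val session = PlaybackSession()
    session.onProgress(125_000L)         // video 901 playing on the display unit
    println(session.continueOnHmd())     // video 902 continues on the HMD
}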

FIGS. 10A to 10D are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, when the mobile terminal 100 is coupled to the headset 210, the mobile terminal 100 may receive the fingerprint information of its user and operate in a secure state.

Referring to FIG. 10A, when the fingerprint 921 is input to the fingerprint input unit 920 of the mobile terminal 100 while the mobile terminal 100 and the headset 210 are partially coupled, the controller 180 detects whether the combined state of the mobile terminal 100 and the headset 210 is subsequently changed to the fully coupled state.

However, when the combined state of the mobile terminal 100 and the headset 210 is changed directly from the fully separated state to the fully coupled state, the mobile terminal 100 operates in the general HMD state; that is, the combination of the mobile terminal 100 and the headset 210 is maintained without entering the secure state.

Referring to FIG. 10B, when the fingerprint 921 is input to the fingerprint input unit 920 of the mobile terminal 100 while the mobile terminal 100 and the headset 210 are partially coupled, and the device then operates as an HMD with the mobile terminal 100 and the headset 210 fully coupled, information 931 indicating that the HMD has entered the secure state can be displayed on the HMD screen 930.

Referring to FIG. 10C, when the HMD operates in the secure state, the control unit 180 can execute the secure content 940. When the mobile terminal 100 operates in the general HMD state rather than the secure state, execution of the secure content stored in the memory of the mobile terminal 100 is restricted. Restricting execution of the secure content may mean that, in the general state, the content is not executed even if an execution command is input; in this case, the control unit 180 may indicate that the content is secure content. According to another embodiment, the restriction may be such that the secure content file itself is not exposed in the execution list.
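An illustrative Kotlin sketch of this restriction is shown below, covering both variants: blocking execution in the general state and hiding secure files from the execution list. The Content data model and function names are assumptions for illustration only.

data class Content(val name: String, val secure: Boolean)

// Builds the execution list shown to the user for the current state.
fun visibleContents(all: List<Content>, secureState: Boolean, hideSecure: Boolean): List<Content> =
    if (secureState) all                           // secure state: everything is listed
    else if (hideSecure) all.filter { !it.secure } // variant: secure files are not exposed in the list
    else all                                       // listed, but execution will be blocked below

fun canExecute(content: Content, secureState: Boolean): Boolean =
    secureState || !content.secure

fun main() {
    val library = listOf(Content("movie.mp4", secure = false), Content("diary.mp4", secure = true))
    println(visibleContents(library, secureState = false, hideSecure = true).map { it.name })
    println(canExecute(library[1], secureState = false)) // false: blocked in the general state
}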

Therefore, according to the embodiment of the present invention, there is an advantage that personal information can be kept private and other people can be prevented from executing content that requires security.

Referring to FIG. 10D, when the fingerprint 921 is input to the fingerprint input unit 920 of the mobile terminal 100 while the mobile terminal 100 and the headset 210 are partially coupled, the controller 180 can automatically input password information once the combined state is changed to the fully coupled state.

In the fully coupled state, the fingerprint input unit 920 of the mobile terminal 100 is covered by the headset 210, so it may not be able to receive the fingerprint of the user of the mobile terminal 100. When the fingerprint 921 is input to the fingerprint input unit 920 while the mobile terminal 100 and the headset 210 are partially coupled, however, the authentication is retained after the coupling is completed. Therefore, in the fully coupled state the password information is input automatically without the user entering it separately, and the login information window 951 can be displayed on the HMD.

Here, input of the password information refers to various operations in which a password is normally required, such as automatically executing a password-protected file or automatically entering an account name and the corresponding password online.
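A hedged Kotlin sketch of this automatic credential input follows: after one fingerprint match in the partially coupled state, stored credentials are supplied without a separate password prompt while the device operates as an HMD. The in-memory vault, class, and method names are assumptions used only for illustration; a real implementation would use secure storage.

class AutoCredentialFiller(private val vault: Map<String, Pair<String, String>>) {
    private var unlocked = false

    // Called once when fingerprint 921 is accepted in the partially coupled state.
    fun onFingerprintMatched() { unlocked = true }

    // Returns account name and password for the requested service, if unlocked.
    fun fill(service: String): Pair<String, String>? =
        if (unlocked) vault[service] else null
}

fun main() {
    val filler = AutoCredentialFiller(mapOf("example.com" to ("user01" to "s3cret")))
    filler.onFingerprintMatched()
    println(filler.fill("example.com"))  // login window 951 can be filled automatically
}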

According to the present invention, through a single fingerprint recognition process, the user of the mobile terminal 100 can perform operations requiring a password without a separate password input process while the device operates as an HMD.

FIGS. 11A and 11B are conceptual diagrams for explaining the operation of the HMD.

While the mobile terminal 100 and the headset 210 are coupled but not worn on the head of the user of the mobile terminal 100, the status information of the mobile terminal 100 can be indicated through the color, brightness, and blinking of the display unit 151 of the mobile terminal 100.

When the coupled mobile terminal 100 and headset 210 are not worn on the user's head, the user cannot check detailed information on the display unit 151 of the mobile terminal 100. However, the user of the mobile terminal 100 can still see the color, brightness, and blinking of the display unit 151 through the lenses of the headset 210.

For example, the control unit 180 may change the color of the display unit 151 of the mobile terminal 100 to green when the mobile terminal 100 receives a telephone call, to yellow when the mobile terminal 100 receives a message, and to red when the battery of the mobile terminal 100 is low.
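A minimal Kotlin sketch of such a status-to-indicator mapping is given below. The color assignments mirror the examples above; the blink intervals and the enum and data-class names are assumptions added for illustration.

enum class TerminalStatus { INCOMING_CALL, MESSAGE_RECEIVED, LOW_BATTERY, IDLE }

data class Indicator(val color: String, val blinkIntervalMs: Long?)

fun indicatorFor(status: TerminalStatus): Indicator = when (status) {
    TerminalStatus.INCOMING_CALL -> Indicator("green", 1000L)    // incoming call: green, slow blink
    TerminalStatus.MESSAGE_RECEIVED -> Indicator("yellow", null) // message: yellow, steady
    TerminalStatus.LOW_BATTERY -> Indicator("red", 500L)         // low battery: red, fast blink
    TerminalStatus.IDLE -> Indicator("black", null)              // display effectively off
}

fun main() {
    println(indicatorFor(TerminalStatus.LOW_BATTERY))
}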

Referring to FIGS. 11A and 11B, the color of the light emitted through the lenses of the headset 210 can be changed by changing the color of the display unit 151 of the mobile terminal 100. For example, a first color may be shown through the left eye lens 961 and the right eye lens 962, and a second color through the left eye lens 971 and the right eye lens 972, to indicate the state of the mobile terminal 100.

In another embodiment, the state of the mobile terminal 100 can be indicated by blinking: the left eye lens 961 and the right eye lens 962 show a blink with a one-second interval, while the left eye lens 971 and the right eye lens 972 show a blink with a 0.5-second interval.

In another embodiment, different colors may be shown through the left eye lens 961 and the right eye lens 962. For example, a first color may be shown through the left eye lens 961 and a second color through the right eye lens 962 to indicate the state of the mobile terminal 100. Since combinations of colors across the two lenses can be used, this has the advantage that the state of the mobile terminal 100 can be indicated with fewer distinct colors than when the same color is shown through both lenses.

FIGS. 12A to 12C are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, when the mobile terminal 100 and the headset 210 operate as an HMD in the fully coupled state, an object in front of the user can be detected using a camera disposed on the front portion of the mobile terminal 100.

Referring to FIGS. 12A to 12C, the mobile terminal 100 monitors the area in front of the user while the user views the HMD image 981. By monitoring the information in the front area 982 of the mobile terminal 100, the mobile terminal 100 can identify an object approaching the user. When an object 984 approaching the user is sensed through the camera, the controller 180 may output the image received from the front camera of the mobile terminal 100 to the HMD. The controller 180 may stop the image currently output on the HMD and output the front image of the mobile terminal 100 instead, output the front image in a part of the HMD screen, or output the existing image and the front image of the mobile terminal 100 so that they overlap each other.
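The pass-through decision can be sketched as follows in Kotlin: if an object is detected closer than a threshold, the front-camera image is shown on the HMD fully, in a sub-window, or blended. The distance source, the one-meter threshold, and the mode names are assumptions for illustration and do not reflect the patent's detection method.

enum class PassThroughMode { OFF, FULL, PICTURE_IN_PICTURE, OVERLAY }

// Chooses how to present the front-camera image given the nearest detected object.
fun passThroughFor(nearestObjectMeters: Double?, preferred: PassThroughMode): PassThroughMode =
    if (nearestObjectMeters != null && nearestObjectMeters < 1.0) preferred
    else PassThroughMode.OFF

fun main() {
    println(passThroughFor(nearestObjectMeters = 0.6, preferred = PassThroughMode.PICTURE_IN_PICTURE))
    println(passThroughFor(nearestObjectMeters = null, preferred = PassThroughMode.FULL)) // nothing detected
}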

According to an embodiment of the present invention, even while watching a video using the HMD, the user can be alerted to a danger in front, so there is an advantage that the user of the mobile terminal 100 can use the mobile terminal 100 as an HMD more safely.

FIGS. 13A and 13B are conceptual diagrams for explaining the operation of the HMD.

According to an embodiment of the present invention, when the user performs a predetermined gesture in front of the device while viewing the HMD image, the controller 180 recognizes the motion through the front camera of the mobile terminal 100 and can output the image in front of the mobile terminal 100 as the HMD image.

Referring to FIGS. 13A and 13B, the mobile terminal 100 monitors the area in front of the user while the user views the HMD image 991. By monitoring information on the front area 992 of the mobile terminal 100, the mobile terminal 100 can recognize the user's gesture 993. For example, when the user performs a hand-shake gesture in front of the mobile terminal 100, the controller 180 can recognize the gesture from the image input through the front camera of the mobile terminal 100. When the gesture is input, the controller 180 may output the image received from the front camera of the mobile terminal 100 to the HMD. As before, the controller 180 may stop the image currently output on the HMD and output the front image of the mobile terminal 100 instead, output the front image in a part of the HMD screen, or output the existing image and the front image so that they overlap each other.
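The gesture trigger of FIGS. 13A and 13B can be sketched in Kotlin as a simple toggle driven by a recognized gesture. Gesture classification itself is out of scope here; the Gesture enum and FrontViewToggle class are assumed placeholders, not the patent's implementation.

enum class Gesture { HAND_SHAKE, NONE }

class FrontViewToggle {
    var showingFrontCamera = false
        private set

    // Called whenever the front camera's recognizer reports a gesture.
    fun onGesture(gesture: Gesture) {
        if (gesture == Gesture.HAND_SHAKE) showingFrontCamera = !showingFrontCamera
    }
}

fun main() {
    val toggle = FrontViewToggle()
    toggle.onGesture(Gesture.HAND_SHAKE)   // gesture 993 detected by the front camera
    println(toggle.showingFrontCamera)     // true: front image replaces or overlays the HMD image
}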

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and the like, and the invention may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include the control unit 180 of the mobile terminal 100. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
120: Input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit
210, 220: Headset

Claims (13)

A mobile terminal; And
A headset including a left eye lens and a right eye lens,
The mobile terminal is attached to a front portion of the headset and operates as a head-mounted display device (HMD)
The mobile terminal includes:
A control unit;
A display unit; And
And a sensor capable of checking the state of coupling between the mobile terminal and the headset,
Wherein the control unit divides the combined state of the mobile terminal and the headset into a fully separated state, a partially coupled state, and a fully coupled state, and displays a screen corresponding to the coupled state on the display unit.
The method according to claim 1,
Wherein the headset includes at least one of a proximity sensor, a gyro sensor, an ultrasonic sensor, a photosensor, and an acceleration sensor.
The method according to claim 1,
Wherein, when the mobile terminal and the headset are in a coupled state and a part of the display unit of the mobile terminal is exposed to the outside of the headset, the control unit displays a predetermined screen on the part of the mobile terminal exposed to the outside.
The method of claim 3,
Wherein the control unit displays a different screen according to a direction in which the mobile terminal is exposed to the outside of the headset.
The method of claim 3,
Wherein the predetermined screen is a control window for controlling the head-mounted display device.
The method according to claim 1,
Wherein the control unit displays a message reception state on a screen of the head-mounted display device when a message is received at the mobile terminal in a state where the combined state of the mobile terminal and the headset is the fully coupled state and the headset is worn on the head of the user,
And displays the contents of the message in a part of the mobile terminal exposed to the outside when a part of the mobile terminal display unit is exposed to the outside of the headset.
The method according to claim 1,
Wherein, when a moving picture is being reproduced in a state where the combined state of the mobile terminal and the headset is the fully separated state and the combined state of the mobile terminal and the headset is then switched to the fully coupled state, the control unit continuously reproduces the moving picture on the screen of the head-mounted display device.
The method according to claim 1,
Wherein the control unit receives a fingerprint of a user through an input unit of the mobile terminal in a state where the combined state of the mobile terminal and the headset is partially combined,
Wherein the headset is operated in a secure state when the mobile terminal and the headset are fully engaged and the headset is worn on the user's head.
9. The method of claim 8,
Wherein the control unit receives a fingerprint of a user through a fingerprint input unit of the mobile terminal in a state where the combined state of the mobile terminal and the headset is partially combined,
Wherein the headset is operated in a secure state when the mobile terminal and the headset are fully engaged and the headset is worn on the user's head.
10. The method of claim 9,
And in the secure state, a list of contents executable only in the secure state is displayed on the head-mounted display device.
10. The method of claim 9,
And the content executable only in the secure state is executed in the secure state.
The method according to claim 1,
Wherein the control unit determines that the combined state of the mobile terminal and the headset is in a fully coupled state,
And displays the state of the mobile terminal through the left eye lens and the right eye lens of the headset by changing the state of the display unit of the mobile terminal.
13. The method of claim 12,
Wherein the state of the mobile terminal is displayed through the left eye lens and the right eye lens of the headset using at least one of a color, a brightness, a blink, and a respective change in the color, the brightness, and the blink.
KR1020150181983A 2015-12-18 2015-12-18 Head-Mounted Display Apparatus, HMD KR20170073273A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150181983A KR20170073273A (en) 2015-12-18 2015-12-18 Head-Mounted Display Apparatus, HMD

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150181983A KR20170073273A (en) 2015-12-18 2015-12-18 Head-Mounted Display Apparatus, HMD

Publications (1)

Publication Number Publication Date
KR20170073273A true KR20170073273A (en) 2017-06-28

Family

ID=59280850

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150181983A KR20170073273A (en) 2015-12-18 2015-12-18 Head-Mounted Display Apparatus, HMD

Country Status (1)

Country Link
KR (1) KR20170073273A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019017513A1 (en) * 2017-07-20 2019-01-24 Lg Electronics Inc. Head-mounted display and method of controlling the same
US11022804B2 (en) 2017-07-20 2021-06-01 Lg Electronics Inc. Head-mounted display and method of controlling the same
WO2019054621A1 (en) * 2017-09-18 2019-03-21 주식회사 룩시드랩스 Head-mounted display device
CN108196365A (en) * 2017-12-13 2018-06-22 深圳市虚拟现实科技有限公司 Correct the method and apparatus of mobile terminal locations
KR101992138B1 (en) 2018-12-10 2019-06-24 주식회사 델바인 Apparatus for displaying stereoscopic image

Similar Documents

Publication Publication Date Title
EP3112989A2 (en) Mobile terminal
KR101735484B1 (en) Head mounted display
KR102217561B1 (en) Head mounted display and method for controlling the same
KR20160133230A (en) Mobile terminal
KR20180099182A (en) A system including head mounted display and method for controlling the same
KR20150131815A (en) Mobile terminal and controlling method thereof
KR20180028211A (en) Head mounted display and method for controlling the same
KR20170014458A (en) Mobile terminal, watch-type mobile terminal and method for controlling the same
KR101719999B1 (en) Mobile terminal
KR20170073273A (en) Head-Mounted Display Apparatus, HMD
KR20170055296A (en) Tethering type head mounted display and method for controlling the same
KR101721132B1 (en) Mobile terminal and method for controlling the same
KR20160001229A (en) Mobile terminal and method for controlling the same
KR20180044551A (en) Mobile terminal for displaying the electronic devices for the interior space and operating method hereof
KR20170008498A (en) Electronic device and control method thereof
KR20170058756A (en) Tethering type head mounted display and method for controlling the same
KR20140147057A (en) Wearable glass-type device and method of controlling the device
US10764528B2 (en) Mobile terminal and control method thereof
KR20150111748A (en) Mobile terminal and method for controlling the same
KR20190079263A (en) Head mounted display
KR20180028210A (en) Display device and method for controlling the same
KR20170108715A (en) Mobile terminal and method for controlling the same
WO2016024668A1 (en) Mobile terminal and control method therefor
KR20160009946A (en) Mobile terminal and method for controlling the same
KR20150107098A (en) Mobile terminal and controlling method thereof