KR20170023491A - Camera and virtual reality system comprising the same - Google Patents

Camera and virtual reality system comprising the same

Info

Publication number
KR20170023491A
Authority
KR
South Korea
Prior art keywords
hmd
mobile terminal
view
camera
sensor
Prior art date
Application number
KR1020150118706A
Other languages
Korean (ko)
Inventor
임종식
허훈
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150118706A priority Critical patent/KR20170023491A/en
Publication of KR20170023491A publication Critical patent/KR20170023491A/en

Classifications

    • H04N13/0429
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • H04N13/0425

Abstract

A virtual reality system is disclosed. The system includes a head mounted display (HMD), a camera for detecting the position of the HMD, and a processing device for transmitting an image based on the detected position of the HMD to the HMD to realize a virtual reality. In particular, the camera includes a first image sensor for photographing a first view according to a first angle, a second image sensor disposed on one of the right and left sides or the upper and lower sides with respect to the first image sensor for photographing a second view according to a second angle, and a controller for transmitting a third view in which the first and second views are synthesized to the HMD.

Description

CAMERA AND VIRTUAL REALITY SYSTEM COMPRISING THE SAME

The present invention relates to a camera and a virtual reality system including the camera. More particularly, the present invention relates to a camera for extending a field of view (FOV) for a device implementing a virtual reality and a virtual reality system including the same.

A terminal can be divided into a mobile terminal (mobile/portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.

The functions of mobile terminals are diversified. For example, there are data and voice communication, photographing and video shooting through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals are equipped with an electronic game play function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcast and video or television programs.

Such a terminal has various functions and is implemented, for example, in the form of a multimedia device having multiple functions such as capturing still images or video, playing music or video files, playing games, and receiving broadcasts.

In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and/or software parts of the terminal.

On the other hand, virtual reality is a computer application field that allows a user to experience situations that do not actually exist by using computer graphics technology. However, there may be a spatial limitation in implementing a virtual reality or augmented reality through the terminal. For example, since the field of view (FOV) of a camera for detecting the position of a mobile terminal or a head mounted display (HMD) that implements a virtual reality is limited, the mobile terminal or the HMD frequently deviates from the FOV. Therefore, it is necessary to expand the FOV so as to enlarge the area in which the camera can detect the mobile terminal or the HMD.

SUMMARY OF THE INVENTION The present invention has been made in view of the above-mentioned needs, and it is an object of the present invention to provide a camera that extends the field of view (FOV) for a device implementing a virtual reality by using a plurality of image sensors, and a virtual reality system including the camera.

In order to achieve the above object, a virtual reality system according to an embodiment of the present invention includes a head mounted display (HMD), a camera for detecting the position of the HMD, and a processing device for transmitting an image based on the detected position of the HMD to the HMD to realize a virtual reality, wherein the camera includes a first image sensor for photographing a first view according to a first angle, a second image sensor disposed on one of the right and left sides or the upper and lower sides with respect to the first image sensor for photographing a second view according to a second angle, and a controller for transmitting a third view in which the first and second views are synthesized to the HMD.

Also, if it is determined that the HMD is out of the field of view (FOV) corresponding to the third view, the processing device can move the center of the screen displayed on the HMD to correspond to the position of the HMD.
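Purely as an illustration (the function names, the coordinate convention, and the clamping strategy below are assumptions of ours, not the patent's), the re-centering described above could be sketched as follows:

    def recenter_if_outside_fov(hmd_pos, fov_bounds, screen_center):
        """Shift the center of the image shown on the HMD toward the HMD
        position once the HMD leaves the FOV covered by the third view."""
        x_min, x_max, y_min, y_max = fov_bounds
        x, y = hmd_pos
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return screen_center              # HMD still tracked inside the FOV
        # Overshoot beyond the FOV boundary, used to follow the HMD.
        over_x = (x - x_max) if x > x_max else (x - x_min) if x < x_min else 0.0
        over_y = (y - y_max) if y > y_max else (y - y_min) if y < y_min else 0.0
        return (screen_center[0] + over_x, screen_center[1] + over_y)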

Further, the HMD includes a barometric pressure sensor, and the processing device can move the center of the screen displayed on the HMD to correspond to the position of the HMD sensed by the barometric pressure sensor.

The HMD also includes first and second infrared range-finding sensors, and the processing device can move the center of the screen displayed on the HMD to correspond to a first distance between a first point sensed by the first infrared range-finding sensor and the HMD and a second distance between a second point sensed by the second infrared range-finding sensor and the HMD.
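The patent describes only that the screen center follows the two measured distances; purely as an illustration (the reference-point coordinates and function names are assumptions of ours), the two distances could be turned into a 2D position estimate with standard two-circle trilateration:

    import math

    def trilaterate_2d(p1, p2, d1, d2):
        """Return one intersection point of the two range circles, or None.
        p1, p2: known reference points; d1, d2: measured distances to the HMD."""
        (x1, y1), (x2, y2) = p1, p2
        dx, dy = x2 - x1, y2 - y1
        base = math.hypot(dx, dy)
        if base == 0 or base > d1 + d2 or base < abs(d1 - d2):
            return None                       # circles do not intersect
        a = (d1**2 - d2**2 + base**2) / (2 * base)
        h = math.sqrt(max(d1**2 - a**2, 0.0))
        # Point on the line between p1 and p2, then offset perpendicular to it.
        px, py = x1 + a * dx / base, y1 + a * dy / base
        return (px - h * dy / base, py + h * dx / base)

    print(trilaterate_2d((0.0, 0.0), (2.0, 0.0), 1.5, 1.5))   # one of the two solutions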

Further, the HMD includes an inertial sensor, and the processing device can move the center of the screen displayed on the HMD so as to correspond to the turning direction of the HMD sensed by the inertial sensor.

Also, the processing device can display a GUI (Graphic User Interface) representing the position information of the HMD on the HMD.

A virtual reality system according to another embodiment of the present invention includes a camera and a mobile terminal, wherein the mobile terminal displays an image based on the position of the mobile terminal detected by the camera to realize a virtual reality, and the camera includes a first image sensor for photographing a first view according to a first angle, a second image sensor disposed on one of the right and left sides or the upper and lower sides with respect to the first image sensor for photographing a second view according to a second angle, and a controller for transmitting a third view in which the first view and the second view are synthesized to the mobile terminal.

Meanwhile, a camera according to an embodiment of the present invention detects the position of a head mounted display (HMD) so that an image based on the detected position can be transmitted to the HMD to realize a virtual reality, and the camera includes a first image sensor for photographing a first view according to a first angle, a second image sensor disposed on one of the right and left sides or the upper and lower sides with respect to the first image sensor for photographing a second view according to a second angle, and a controller for transmitting a third view in which the first view and the second view are synthesized to the HMD.

According to at least one embodiment of the present invention, since the FOV for a device implementing a virtual reality is expanded, the range of movement of a user wearing the device implementing the virtual reality is widened.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 is a block diagram of a virtual reality system according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating another example of a virtual reality system according to an embodiment of the present invention.
FIG. 3 is a block diagram of a mobile terminal according to an embodiment of the present invention.
FIG. 4 illustrates an example of a glass-type mobile terminal according to an exemplary embodiment of the present invention.
FIG. 5 is a perspective view of a head mounted display (HMD) according to an embodiment of the present invention.
FIG. 6 is a perspective view of a mobile terminal according to an embodiment of the present invention.
FIGS. 7 to 9 are views showing various examples of a perspective view of a camera according to an embodiment of the present invention.
FIGS. 10 to 12 are views for explaining an extended field of view (FOV) of a camera according to an embodiment of the present invention.
FIGS. 13 to 15 are views showing various examples of a perspective view of a camera according to another embodiment of the present invention.
FIGS. 16 and 17 illustrate various examples of the FOV of a camera according to another embodiment of the present invention.
FIGS. 18 to 23 illustrate various examples of the position of a mobile terminal and a corresponding virtual image according to an embodiment of the present invention.
FIG. 24 is a diagram illustrating the location of a mobile terminal to which a virtual reality system according to another embodiment of the present invention is applied.
FIG. 25 is a block diagram of a mobile terminal according to another embodiment of the present invention.
FIGS. 26 to 30 are various examples of the position of the mobile terminal and the resulting virtual image according to another embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably only in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In the following description of the embodiments of the present invention, a detailed description of related art will be omitted when it is determined that such description may obscure the gist of the embodiments disclosed herein. The accompanying drawings are provided only to facilitate understanding of the embodiments disclosed herein, and the technical idea disclosed in this specification is not limited by the accompanying drawings; it should be understood to include all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements present.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

The terms "comprising" or "having" used in this application are intended to specify the presence of stated features, integers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, parts, or combinations thereof.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).

However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, digital signage, and the like, except for cases applicable only to mobile terminals.

FIGS. 1 and 2 are various examples of drawings relating to a virtual reality system according to an embodiment of the present invention. Hereinafter, redundant description of well-known technical contents will be omitted.

Referring to FIG. 1, a virtual reality system 600A according to an exemplary embodiment of the present invention may include a camera 100, an HMD (Head Mounted Display) 300, and a processing device 200.

A user who wants to experience the virtual reality can wear the HMD 300. The HMD 300, which is also referred to as a head mounted display device, outputs images directly in front of the user's eyes, unlike a general display, and therefore allows the user to feel a stronger sense of reality than an LCD panel or 3D glasses.

The camera 100 can detect the HMD 300 located within a predetermined area. Here, the predetermined area may mean a field of view (FOV) that can be photographed by the image sensor 120 included in the camera 100.

Here, the camera 100 does not refer to a general RGB camera for photographing a subject, but may detect an element outputting a specific signal and extract position information of the detected element. Accordingly, the camera 100 according to the embodiment of the present invention may include any device that performs the above-described functions without being limited by the name such as an infrared camera, an infrared (IR) sensor, or the like.

Meanwhile, the HMD 300 includes a plurality of light emitting devices (not shown), and each of the plurality of light emitting devices can output a detection signal in one direction. Therefore, the camera 100 can detect the position of the user wearing the HMD 300 by receiving the detection signals output from the plurality of light emitting devices.

Accordingly, the camera 100 can transmit the detected location information of the user to the processing device 200. In FIG. 1, the processing device 200 is implemented as a PC (Personal Computer), but the present invention is not limited thereto. The processing device 200 receiving the location information of the user can implement a virtual reality by transmitting graphics or various images corresponding to the user's location information to the HMD 300.
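Purely as an illustration of this data flow (the object and method names are hypothetical; the patent defines no software interface), the processing device 200 could run a loop of roughly this shape:

    def run_tracking_loop(camera, renderer, hmd):
        """camera.detect() yields the user's position from the light emitting
        devices on the HMD; the processing device renders a view for that
        position and pushes it to the HMD display."""
        while hmd.is_worn():
            position = camera.detect()               # location info extracted by camera 100
            if position is None:
                continue                             # HMD currently outside the camera's FOV
            frame = renderer.render_view(position)   # graphics for this viewpoint
            hmd.display(frame)                       # image transmitted to HMD 300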

Referring to FIG. 2, a virtual reality system 600B according to an exemplary embodiment of the present invention may include a camera 100 and a mobile terminal 400. Hereinafter, the contents overlapping with the description of FIG. 1 will be omitted.

FIG. 1 shows a system in which the location information of the user is transmitted from the camera 100 to the processing device 200, and the processing device 200 transmits the location information of the user and various images to the HMD 300. However, as shown in FIG. 2, the HMD 300 and the processing device 200 may be implemented as a single mobile terminal 400.

In this case, the mobile terminal 400 is preferably implemented as a glass-type mobile terminal or a wearable glass. Accordingly, the camera 100 can detect the location of the user wearing the mobile terminal 400 and transmit the detected location information of the user to the mobile terminal 400. The mobile terminal 400 receiving the location information of the user can realize a virtual reality by displaying various images based on the user location information.

FIG. 3 is an exemplary block diagram of a mobile terminal 400 according to an embodiment of the present invention. Referring to FIG. 3, the mobile terminal 400 may include a wireless communication unit 410, an input unit 420, a sensing unit 440, an output unit 450, an interface unit 460, a memory 470, a controller 480, a power supply 490, and the like. The components shown in FIG. 3 are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

More specifically, the wireless communication unit 410 may include one or more modules that enable wireless communication between the mobile terminal 400 and a wireless communication system, between the mobile terminal 400 and another mobile terminal 400, or between the mobile terminal 400 and an external server. The wireless communication unit 410 may also include one or more modules for connecting the mobile terminal 400 to one or more networks.

The wireless communication unit 410 may include at least one of a broadcast receiving module 411, a mobile communication module 412, a wireless Internet module 413, a short distance communication module 414, and a location information module 415.

The input unit 420 may include a camera 421 or an image input unit for inputting a video signal, a microphone 422 or an audio input unit for inputting an audio signal, and a user input unit 423 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. The voice data or image data collected by the input unit 420 may be analyzed and processed according to a user's control command.

The sensing unit 440 may include at least one sensor for sensing at least one of information in the mobile terminal, surrounding environment information of the mobile terminal, and user information. For example, the sensing unit 440 may include at least one of a proximity sensor 441, an illumination sensor 442, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 421), a microphone 422, a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the mobile terminal disclosed in this specification may combine and utilize information sensed by at least two of these sensors.

The output unit 450 is for generating output related to visual, auditory, or tactile senses and may include at least one of a display unit 451, a sound output unit 452, a haptic module 453, and a light output unit 454. The display unit 451 may have a mutual layer structure with a touch sensor or may be formed integrally with it to realize a touch screen. Such a touch screen may function as a user input unit 423 that provides an input interface between the mobile terminal 400 and the user and, at the same time, provide an output interface between the mobile terminal 400 and the user.

The interface unit 460 serves as a passage to various kinds of external devices connected to the mobile terminal 400. The interface unit 460 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output port, a video input/output port, and an earphone port. In the mobile terminal 400, in response to the connection of an external device to the interface unit 460, appropriate control related to the connected external device may be performed.

In addition, the memory 470 stores data supporting various functions of the mobile terminal 400. The memory 470 may store a plurality of application programs (or applications) running on the mobile terminal 400, and data and commands for the operation of the mobile terminal 400. At least some of these application programs may be downloaded from an external server via wireless communication. At least some of these application programs may exist on the mobile terminal 400 from the time of shipment for the basic functions of the mobile terminal 400 (e.g., receiving and placing calls, receiving and sending messages). Meanwhile, the application programs may be stored in the memory 470, installed on the mobile terminal 400, and driven by the control unit 480 to perform the operation (or function) of the mobile terminal.

The control unit 480 typically controls the overall operation of the mobile terminal 400, in addition to operations associated with application programs. The control unit 480 may provide or process appropriate information or functions to the user by processing signals, data, information, and the like input or output through the above-mentioned components, or by driving an application program stored in the memory 470.

The controller 480 may also control at least some of the components discussed with reference to FIG. 3 to drive the application programs stored in the memory 470. Further, the control unit 480 may operate at least two or more of the components included in the mobile terminal 400 in combination with each other for driving an application program.

The power supply unit 490 receives external power and internal power under the control of the controller 480 and supplies power to the components included in the mobile terminal 400. The power supply unit 490 may include a battery, and the battery may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with each other to implement a method of operation, control, or control of the mobile terminal according to various embodiments described below. The method of operation, control, or control of the mobile terminal may also be implemented on the mobile terminal by driving at least one application program stored in the memory 470.

Hereinafter, the components listed above will be described in more detail with reference to FIG. 3 before explaining various embodiments implemented through the mobile terminal 400 as described above.

First, referring to the wireless communication unit 410, the broadcast receiving module 411 of the wireless communication unit 410 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. More than one broadcast receiving module may be provided to the mobile terminal 400 for simultaneous broadcast reception or broadcast channel switching of at least two broadcast channels.

The mobile communication module 412 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like).

The wireless signal may include various types of data depending on a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module 413 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 400. The wireless Internet module 413 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like, and the wireless Internet module 413 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, and the like is achieved through a mobile communication network, the wireless Internet module 413 performing wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 412.

The short-range communication module 414 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies. Through short-range wireless area networks, the short-range communication module 414 may support wireless communication between the mobile terminal 400 and a wireless communication system, between the mobile terminal 400 and another mobile terminal 400, or between the mobile terminal 400 and a network where another mobile terminal 400 (or an external server) is located. The short-range wireless area network may be a short-range wireless personal area network.

Here, another mobile terminal 400 may be a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with the mobile terminal 400 according to the present invention. The short-range communication module 414 can detect (or recognize) a wearable device capable of communicating with the mobile terminal 400 in the vicinity of the mobile terminal 400. Furthermore, when the detected wearable device is a device authorized to communicate with the mobile terminal 400 according to the present invention, the control unit 480 may transmit at least part of the data processed by the mobile terminal 400 to the wearable device through the short-range communication module 414. Therefore, the user of the wearable device can use the data processed by the mobile terminal 400 through the wearable device. For example, the user can make a phone call through the wearable device when a call is received in the mobile terminal 400, or check a received message through the wearable device when a message is received in the mobile terminal 400.

The location information module 415 is a module for obtaining the location (or current location) of the mobile terminal, and representative examples thereof include a Global Positioning System (GPS) module and a Wireless Fidelity (WiFi) module. For example, when the mobile terminal utilizes the GPS module, the position of the mobile terminal can be acquired by using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, the position of the mobile terminal can be acquired based on information of a wireless access point (AP) transmitting or receiving a wireless signal to or from the Wi-Fi module. If necessary, the location information module 415 may perform, substitutionally or additionally, a function of another module of the wireless communication unit 410 to obtain data on the location of the mobile terminal. The location information module 415 is a module used to obtain the location (or current location) of the mobile terminal and is not limited to a module that directly calculates or obtains the location of the mobile terminal.

The input unit 420 is for inputting image information (or signals), audio information (or signals), data, or information input from a user. For inputting image information, the mobile terminal 400 may include one or a plurality of cameras 421. The camera 421 processes image frames such as still images or moving images obtained by the image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 451 or stored in the memory 470. Meanwhile, the plurality of cameras 421 provided in the mobile terminal 400 may be arranged to form a matrix structure, and through the cameras 421 forming the matrix structure, a plurality of pieces of image information having various angles or foci may be input to the mobile terminal 400. The plurality of cameras 421 may also be arranged in a stereo structure to obtain a left image and a right image for realizing a stereoscopic image.

The microphone 422 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 400. [ Meanwhile, the microphone 422 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 423 is for receiving information from a user. When information is input through the user input unit 423, the control unit 480 can control the operation of the mobile terminal 400 to correspond to the input information. The user input unit 423 may include a mechanical input means (or a mechanical key, for example, a button located on the front, rear, or side of the mobile terminal 400, a dome switch, a jog wheel, etc.) and a touch-type input means. For example, the touch-type input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. The virtual key or the visual key can be displayed on the touch screen in various forms, for example, as graphics, text, an icon, video, or a combination thereof.

Meanwhile, the sensing unit 440 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a corresponding sensing signal. The control unit 480 may control the driving or operation of the mobile terminal 400 or may perform data processing, function or operation related to the application program installed in the mobile terminal 400 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 440 will be described in more detail.

First, the proximity sensor 441 refers to a sensor that detects the presence of an object approaching a predetermined detection surface, or an object existing in the vicinity of the detection surface, by using an electromagnetic field or infrared rays without mechanical contact. The proximity sensor 441 may be disposed in an inner area of the mobile terminal covered by the touch screen described above, or near the touch screen.

Examples of the proximity sensor 441 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is of the capacitive type, the proximity sensor 441 may be configured to detect the proximity of a conductive object based on the change in the electric field caused by the approach of that object. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

For convenience of explanation, the act of recognizing that an object is positioned above the touch screen while not being in contact with it is referred to as a "proximity touch," and the act of the object actually coming into contact with the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen means a position at which the object corresponds vertically to the touch screen when the object is proximity-touched. The proximity sensor 441 may sense a proximity touch and a proximity touch pattern (e.g., proximity touch distance, proximity touch direction, proximity touch speed, proximity touch time, proximity touch position, proximity touch movement state, and the like). Meanwhile, the control unit 480 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 441 as described above, and furthermore, may output visual information corresponding to the processed data on the touch screen. In addition, the control unit 480 may control the mobile terminal 400 so that different operations or different data (or information) are processed depending on whether a touch on the same point on the touch screen is a proximity touch or a contact touch.

The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 451) by using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, and an ultrasonic type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen is touched on the touch sensor, the pressure at the time of the touch, the capacitance at the time of the touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to the touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 480. In this way, the control unit 480 can know which area of the display unit 451 has been touched, and so on. Here, the touch controller may be a component separate from the controller 480, or may be the controller 480 itself.

On the other hand, the control unit 480 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 400 or an application program being executed.

Meanwhile, the touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize the position information of a sensing object by using ultrasonic waves. The controller 480 can calculate the position of a wave generating source based on information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave generating source can be calculated by using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave generating source can be calculated by using the difference between the arrival time of the ultrasonic wave and the arrival time of the light, with the light serving as a reference signal.
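A small numerical sketch of this idea (the constants and function names are ours, not the patent's): treating the light as arriving instantaneously, the ultrasonic delay relative to the optical reference signal gives the range to the wave generating source.

    # Rough illustration: light is treated as arriving instantaneously, so the
    # ultrasonic delay relative to the optical reference gives the range.
    SPEED_OF_SOUND = 343.0   # m/s, in air at roughly room temperature

    def range_from_time_difference(t_light_arrival, t_ultrasound_arrival):
        """Distance to the wave generating source from the arrival-time difference."""
        dt = t_ultrasound_arrival - t_light_arrival
        return SPEED_OF_SOUND * dt

    # Example: a 2.9 ms lag corresponds to roughly one metre.
    print(range_from_time_difference(0.0, 0.0029))   # ~0.99 m

With several ultrasonic sensors, several such ranges can be combined to estimate the position of the wave generating source.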

Meanwhile, the camera 421 of the input unit 420 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 421 and the laser sensor may be combined with each other to sense a touch of a sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns and scans content placed on the photo sensor by using an electrical signal that changes according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and the position information of the sensing object can be obtained through this calculation.
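The patent does not spell out the coordinate calculation; one plausible sketch (our own assumption, not the patent's algorithm) is to take the intensity-weighted centroid of the frame-to-frame change in the photo diode readings:

    # Sketch: coordinates of a sensing object from per-photodiode light changes.
    def centroid_of_change(prev_frame, curr_frame):
        """prev_frame/curr_frame: 2D lists of photo diode readings.
        Returns the (row, col) centroid of the absolute change in light, or None."""
        total = 0.0
        acc_r = 0.0
        acc_c = 0.0
        for r, row in enumerate(curr_frame):
            for c, value in enumerate(row):
                delta = abs(value - prev_frame[r][c])
                total += delta
                acc_r += r * delta
                acc_c += c * delta
        if total == 0:
            return None                     # nothing moved in front of the sensor
        return (acc_r / total, acc_c / total)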

The display unit 451 displays (outputs) information to be processed by the mobile terminal 400. For example, the display unit 451 may display execution screen information of an application program driven by the mobile terminal 400 or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

Also, the display unit 451 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), a projection system (holographic system) can be applied.

The sound output unit 452 may output audio data received from the wireless communication unit 410 or stored in the memory 470 in a call signal reception mode, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 452 also outputs sound signals related to functions performed by the mobile terminal 400 (e.g., a call signal reception sound, a message reception sound, and the like). The sound output unit 452 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 453 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 453 may be vibration. The intensity and pattern of the vibration generated in the haptic module 453 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 453 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 453 can generate various tactile effects, such as an effect of a pin arrangement moving vertically against the skin surface in contact, a spraying or suction force of air through an injection or suction port, brushing against the skin surface, contact with an electrode, and an effect of reproducing a cold or warm sensation by using an endothermic or exothermic element.

The haptic module 453 can not only transmit a tactile effect through direct contact, but can also be implemented so that the user can feel a tactile effect through the muscular sense of a finger or an arm. Two or more haptic modules 453 may be provided depending on the configuration of the mobile terminal 400.

The light output unit 454 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 400. Examples of events that occur in the mobile terminal 400 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output from the light output unit 454 is implemented as the mobile terminal emits light of a single color or a plurality of colors to the front or rear surface. The signal output may be terminated by the mobile terminal detecting the event confirmation of the user.

The interface unit 460 serves as a passage to all external devices connected to the mobile terminal 400. The interface unit 460 receives data from an external device, receives power and delivers it to each component inside the mobile terminal 400, or allows data inside the mobile terminal 400 to be transmitted to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio I/O port, a video I/O port, an earphone port, and the like may be included in the interface unit 460.

Meanwhile, the identification module is a chip that stores various pieces of information for authenticating the usage right of the mobile terminal 400, and may include a user identification module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device having an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 400 through the interface unit 460.

In addition, when the mobile terminal 400 is connected to an external cradle, the interface unit 460 may serve as a passage through which power from the cradle is supplied to the mobile terminal 400, or a passage through which various command signals input by the user from the cradle are transmitted to the mobile terminal 400. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 400 is correctly mounted on the cradle.

The memory 470 may store a program for the operation of the controller 480 and may temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 470 may store data related to vibrations and sounds of various patterns output upon touch input on the touch screen.

The memory 470 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 400 may also operate in association with a web storage that performs the storage function of the memory 470 on the Internet.

Meanwhile, as described above, the control unit 480 controls operations related to application programs and, generally, the overall operation of the mobile terminal 400. For example, if the state of the mobile terminal satisfies a set condition, the control unit 480 may execute or release a lock state that restricts input of a user's control command to applications.

In addition, the control unit 480 may perform control and processing related to voice calls, data communication, video calls, and the like, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Furthermore, the control unit 480 may control any one of, or a combination of, the above-described components in order to implement various embodiments described below on the mobile terminal 400 according to the present invention.

The power supply unit 490 receives external power and internal power under the control of the controller 480 and supplies the power necessary for the operation of each component. The power supply unit 490 includes a battery, which may be an internal battery configured to be rechargeable and may be detachably coupled to the terminal body for charging or the like.

Also, the power supply unit 490 may include a connection port, and the connection port may be configured as an example of the interface unit 460 to which an external charger supplying power for charging the battery is electrically connected.

As another example, the power supply unit 490 may be configured to charge the battery in a wireless manner without using the connection port. In this case, the power supply unit 490 may receive power from an external wireless power transmission apparatus by using at least one of an inductive coupling method based on the magnetic induction phenomenon and a magnetic resonance coupling method based on the electromagnetic resonance phenomenon.

In the following, the various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Meanwhile, the mobile terminal can be extended to a wearable device that can be worn on the body beyond the dimension that the user mainly grasps and uses. These wearable devices include smart watch, smart glass, and head mounted display (HMD). Hereinafter, examples of a mobile terminal extended to a wearable device will be described.

The wearable device can be made to be able to exchange (or interwork) data with another mobile terminal 400. The short-range communication module 414 can detect (or recognize) a wearable device capable of communicating with the mobile terminal 400. Furthermore, if the detected wearable device is a device authenticated to communicate with the mobile terminal 400, the control unit 480 may transmit at least a part of the data processed by the mobile terminal 400 to the wearable device through the short-range communication module 414. Accordingly, the user can use the data processed by the mobile terminal 400 through the wearable device. For example, when a call is received in the mobile terminal 400, the user can make a phone call through the wearable device, or when a message is received in the mobile terminal 400, the user can check the received message through the wearable device.

FIG. 4 illustrates a case where a mobile terminal according to an embodiment of the present invention is implemented as a glass-type mobile terminal 400.

The glass-type mobile terminal 400 is configured to be worn on the head of a human body, and a frame unit (case, housing, etc.) for the mobile terminal 400 may be provided. The frame portion may be formed of a flexible material to facilitate wearing. This figure illustrates that the frame portion includes a first frame 401 and a second frame 402 of different materials. In general, the mobile terminal 400 may include features of the mobile terminal 400 of FIG. 3 or similar features.

The frame portion is supported on the head portion, and a space for mounting various components is provided. As shown in the figure, electronic parts such as the control module 480, the sound output module 452, and the like may be mounted on the frame part. Further, a lens 403 covering at least one of the left eye and the right eye may be detachably mounted on the frame portion.

The control module 480 controls various electronic components included in the mobile terminal 400. The control module 480 can be understood as a configuration corresponding to the control unit 480 described above. This figure illustrates that the control module 480 is provided in the frame portion on one side of the head. However, the position of the control module 480 is not limited thereto.

The display unit 451 may be implemented as a head mounted display (HMD). The HMD type refers to a display method that is mounted on a head and displays an image directly in front of the user's eyes. When the user wears the glass-type mobile terminal 400, the display unit 451 may be arranged to correspond to at least one of the left and right eyes so that the user can directly provide an image in front of the user's eyes. In this figure, the display unit 451 is located at a portion corresponding to the right eye so that an image can be output toward the user's right eye.

The display unit 451 can project an image to the user's eyes using a prism. Further, the prism may be formed to be transmissive so that the user can view the projected image and the general view of the front (the range that the user views through the eyes) together.

As described above, the image output through the display unit 451 may be overlapped with the general view. The mobile terminal 400 can provide an Augmented Reality (AR) in which a virtual image is superimposed on a real image or a background and displayed as a single image using the characteristics of the display.

The camera 421 is disposed adjacent to at least one of the left eye and the right eye, and is configured to photograph a forward image. Since the camera 421 is positioned adjacent to the eyes, the camera 421 can acquire a scene viewed by the user as an image.

Although the camera 421 is provided in the control module 480 in this figure, the present invention is not limited thereto. The camera 421 may be installed in the frame portion, or a plurality of cameras 421 may be provided to obtain a stereoscopic image.

The glass-type mobile terminal 400 may include user input units 423a and 423b operated to receive control commands. The user input units 423a and 423b may be employed in any manner as long as they are operated in a tactile manner, such as by touch or push. This figure illustrates that the frame unit and the control module 480 are provided with user input units 423a and 423b of a push input method and a touch input method, respectively.

In addition, the glass-type mobile terminal 400 may be provided with a microphone (not shown) for receiving sound and processing it as electrical voice data and an acoustic output module 452 for outputting sound. The sound output module 452 may be configured to transmit sound in a general sound output mode or a bone conduction mode. When the sound output module 452 is implemented in a bone conduction manner, when the user wears the mobile terminal 400, the sound output module 452 is brought into close contact with the head and vibrates the skull to transmit sound.

FIGS. 5 and 6 are examples of perspective views of an HMD and a mobile terminal, respectively, according to an embodiment of the present invention. Hereinafter, a method by which the camera 100 detects the position of the HMD 300 or the mobile terminal 400 will be described.

FIG. 5 is a perspective view of the HMD 300, specifically a rear view of the HMD 300. As shown in FIG. 5, a plurality of light emitting devices 330 may be disposed on the outer rear surface of the HMD 300.

In this case, the light emitting device 330 does not necessarily mean a device that emits visible light, but may refer to various devices capable of emitting light of a specific wavelength to inform the camera 100 of its position. Therefore, if the camera 100 detects infrared rays, the plurality of light emitting devices 330 may be devices that emit infrared rays, respectively.

FIG. 6 is a perspective view of the mobile terminal 400, in particular a rear view of the glass-type mobile terminal 400. As shown in FIG. 6, a light emitting device 430 may be disposed at the distal end of each of the first frame 401 and the second frame 402 of the mobile terminal 400.

In FIG. 6, only one light emitting device 430 is formed in each of the first frame 401 and the second frame 402, but the present invention is not limited thereto. Accordingly, the first frame 401 and the second frame 402 may include a plurality of light emitting devices arranged up and down according to predetermined distances, respectively, or a plurality of light emitting devices arranged right and left.

On the other hand, the glass-type mobile terminal 400 is illustrated in FIG. 6, but the present invention is not limited thereto. Accordingly, the present invention can be equally applied to a case where a smart phone is mounted on an HMD device capable of mounting a smart phone.

If the HMD 300 or the mobile terminal 400 is within the FOV of the camera 100, the location information can be extracted by the camera 100 as described above. Here, in order for the user to experience the virtual reality more realistically and more actively, the FOV of the camera 100 needs to be widened. Therefore, a method of widening the FOV of the camera 100 will be described below.

FIGS. 7 to 9 are various examples of a perspective view of a camera according to an embodiment of the present invention; only a part of the configuration of the camera is shown for explaining the present invention. Hereinafter, redundant description of well-known technical contents will be omitted.

As shown in FIGS. 7 to 9, a plurality of image sensors 120-1 and 120-2 may be formed on the base 110. Accordingly, the plurality of image sensors 120-1 and 120-2 form respective FOVs, and the plurality of FOVs formed by the plurality of image sensors 120-1 and 120-2 can be combined into one FOV. This will be described in detail with reference to FIGS. 10 to 12.

Meanwhile, the arrangement methods of the plurality of image sensors 120-1 and 120-2 may be various.

Specifically, as shown in FIG. 7, the first image sensor 120-1 and the second image sensor 120-2 may be arranged in the left-right direction on the base 110, and as shown in FIG. 8, the first image sensor 120-1 and the second image sensor 120-2 may be arranged in the up-down direction on the base 110. In addition, as shown in FIG. 9, four image sensors 120-1 to 120-4 may be arranged in a rectangular shape.

In FIGS. 7 to 9, the plurality of image sensors are formed apart from each other on the base 110. However, the present invention is not limited thereto, and the image sensors may also be formed in contact with each other.

Thus, according to one embodiment of the present invention, a plurality of image sensors may be disposed on the base 110 in various manners. Hereinafter, the manner in which the FOV of the camera 100 is expanded by the plurality of image sensors will be described.

FIGS. 10 to 12 are various examples of drawings illustrating an extended FOV of a camera according to an embodiment of the present invention.

FIG. 10 shows the FOV of a conventional camera 10. As shown in FIG. 10, the conventional camera 10 includes only one image sensor 20. Therefore, the width of the FOV F1 of the camera 10 may be only L1.

On the other hand, the camera 100 according to the embodiment of the present invention can form the FOV shown in FIG. 11. Referring to FIG. 11, the first image sensor 120-1 and the second image sensor 120-2 are formed on the base 110 in contact with each other. In this case, the first image sensor 120-1 can form a first FOV toward the front left side of the camera 100, and the second image sensor 120-2 can form a second FOV toward the front right side of the camera 100.

In this case, the final FOV F2 in which the first FOV and the second FOV are synthesized is as shown in FIG. 11. That is, the width of the FOV F2 shown in FIG. 11 can be increased by L2 compared with the FOV F1 shown in FIG. 10. Accordingly, by arranging a plurality of image sensors, the position area in which the camera 100 can detect the HMD 300 or the mobile terminal 400 can be increased.

On the other hand, unlike the case of FIG. 11, the first image sensor 120-1 and the second image sensor 120-2 may be formed apart from each other, as shown in FIG. 12. Referring to FIG. 12, the first image sensor 120-1 and the second image sensor 120-2 may be formed on the base 110, spaced apart by a predetermined distance d. In this case, the first image sensor 120-1 and the second image sensor 120-2 form the first and second FOVs, respectively, as described above.

In this case, the final FOV F3 in which the first and second FOVs are synthesized is as shown in FIG. 12. Compared with the FOV F1 shown in FIG. 10, the width of the FOV F3 shown in FIG. 12 can be increased by L3. It can also be seen that the FOV F3 shown in FIG. 12 has increased in area compared with the FOV F2 shown in FIG. 11. Accordingly, by increasing the distance between the first image sensor 120-1 and the second image sensor 120-2, the position area in which the camera 100 can detect the HMD 300 or the mobile terminal 400 can be increased.
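The widening effect can be checked with elementary geometry. The following sketch uses assumed numbers (a 60-degree view angle per sensor and a 2 m working distance; the patent specifies neither): for two sensors whose optical axes remain parallel, the combined FOV width at a given distance is one sensor's width plus the separation d, as long as the two individual FOVs still overlap.

    import math

    # Assumed figures for illustration only: 60-degree sensors, user 2 m away.
    def combined_fov_width(view_angle_deg, distance_m, separation_m):
        """Width covered at `distance_m` by two parallel sensors that each have
        the given horizontal view angle and are `separation_m` apart."""
        half = math.radians(view_angle_deg) / 2.0
        single = 2.0 * distance_m * math.tan(half)      # width of one sensor's FOV
        return single + separation_m                    # valid while the FOVs overlap

    print(combined_fov_width(60.0, 2.0, 0.0))    # one-sensor equivalent: ~2.31 m
    print(combined_fov_width(60.0, 2.0, 0.10))   # sensors 10 cm apart: ~2.41 m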

FIGS. 10 to 12 illustrate the FOV in the left-right direction of the camera 100; however, the same description is also applicable to the up-down direction.

In the above description, the case where the first image sensor 120-1 and the second image sensor 120-2 are formed on the planar base 110 has been described. However, by changing the shape of the base 110, the position area covered by the final FOV can be further increased, as will be described below.

FIGS. 13 to 15 are various examples of a perspective view of a camera according to another embodiment of the present invention. Hereinafter, the description of parts overlapping with the above description will be omitted.

FIGS. 13 to 15 show cross sections of the base 110 along the Z-direction.

Referring to FIG. 13, the left and right sides of the upper surface of the base 110 may each be formed to be inclined outward. Accordingly, the first image sensor 120-1 and the second image sensor 120-2 may be formed on the surfaces inclined to the left and right, respectively.

Referring to FIG. 14, the upper surface of the base 110 may form a curved surface. Accordingly, the first image sensor 120-1 and the second image sensor 120-2 may be formed at positions symmetrical about the inflection line of the curved surface.

Referring to FIG. 15, the upper surface of the base 110 may be shaped such that the central axis directions of the FOVs of four image sensors differ from one another. That is, the upper surface of the base 110 may be formed such that the central axis of the FOV of the first image sensor 120-1 is directed toward the upper right, the central axis of the FOV of the second image sensor 120-2 toward the lower right, the central axis of the FOV of the third image sensor 120-3 toward the upper left, and the central axis of the FOV of the fourth image sensor 120-4 toward the lower left.

Thus, according to another embodiment of the present invention, a plurality of image sensors may be disposed in various manners on the base 110 shaped as described above. Hereinafter, the manner in which the FOV of the camera 100 is expanded by these arrangements will be described.

FIGS. 16 and 17 illustrate the FOV of a camera according to another embodiment of the present invention. Hereinafter, description of parts overlapping with the above description will be omitted.

Referring to FIG. 16, the first image sensor 120-1 and the second image sensor 120-2 are in contact with each other as in FIG. 11, but the central axes of their FOVs are inclined to the left and right, respectively. In this case, the first image sensor 120-1 forms a first FOV toward the front left side of the camera 100, and the second image sensor 120-2 forms a second FOV toward the front right side of the camera 100.

In this case, the width L4 of the final FOV in which the first and second FOVs are synthesized is as shown in FIG. 16. The width L4 of the FOV obtained with the base 110 shaped as in FIGS. 16 and 17 can be greater than the width (L1 + L2) of the FOV obtained with the planar base 110 of FIG. 11. Therefore, by disposing the image sensors so that the central axis of each FOV faces outward, the positional area within which the camera 100 can detect the HMD 300 or the mobile terminal 400 can be increased.

On the other hand, as shown in FIG. 17, the first image sensor 120-1 and the second image sensor 120-2 may be spaced apart by a predetermined distance d. In this case, the width L5 of the final FOV in which the first and second FOVs are synthesized is as shown in FIG. 17. The width L5 of the FOV of FIG. 17 can be greater than the width (L1 + L3) of the FOV of FIG. 12. Accordingly, by increasing the distance between the first image sensor 120-1 and the second image sensor 120-2, the positional area within which the camera 100 can detect the HMD 300 or the mobile terminal 400 can be increased.
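
For reference, the additional widening obtained by tilting the optical axes outward, as with the shaped base 110, can be approximated as follows. This is an illustrative sketch only; the tilt angle alpha and the overlap condition are assumptions, not values taken from the embodiment.

```python
import math

def combined_fov_width_tilted(z, theta_deg, alpha_deg, d=0.0):
    """Approximate horizontal coverage at distance z when each sensor's optical
    axis is additionally tilted outward by alpha degrees, as with the shaped
    base of FIGS. 16 and 17.

    Assumes alpha < theta and a small spacing d, so that the two FOVs remain
    joined in front of the camera.
    """
    theta = math.radians(theta_deg)
    alpha = math.radians(alpha_deg)
    assert alpha_deg < theta_deg, "FOVs would separate in front of the camera"
    return d + 2 * z * math.tan(theta + alpha)

# Same illustrative numbers as before, plus a 15 degree outward tilt:
print(combined_fov_width_tilted(2.0, 30.0, 15.0, 0.10))   # ~4.1 m, wider than the planar-base case
```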

The above examples illustrate expansion in the left-right direction of the camera 100; however, the present invention is equally applicable to the vertical direction.

In the foregoing, a method for extending the FOV of the camera 100 has been described. Hereinafter, a change in the image displayed on the display unit 451 of the HMD 300 or the mobile terminal 400 according to whether the light emitting device 430 is positioned within the extended FOV will be described.

FIGS. 18 to 23 illustrate various examples of the position of the mobile terminal and the resulting virtual image according to an embodiment of the present invention. It will be appreciated that the present invention may also be applied to a case where a smartphone is mounted on the HMD 300 or on an HMD capable of mounting a smartphone.

A first light emitting device 430-1 and a second light emitting device 430-2 are vertically arranged at the end of the first frame 401 of the mobile terminal 400, and a third light emitting device 430-3 and a fourth light emitting device 430-4 are vertically arranged at the end of the second frame 402. However, the present invention is not limited to this arrangement, and three or more light emitting devices 430 may be arranged on the first frame 401 and the second frame 402 in various ways in the vertical and/or horizontal directions.

On the other hand, as shown in FIG. 18, the plurality of light emitting devices 430 are positioned within the range of the FOV (F). In this case, the screen shown in FIG. 19 can be displayed on the display unit 451.

When the user moves to the left, the first light emitting device 430-1 and the second light emitting device 430-2 may move out of the range of the FOV (F). The camera 100 may transmit the position information of the mobile terminal 400 to the mobile terminal 400, and the control unit 480 of the mobile terminal 400 may display a screen whose center is shifted to the left, as shown in FIG. 21, based on the received position information.

In this case, the control unit 480 can determine from the received position information that the first light emitting device 430-1 and the second light emitting device 430-2 are out of the range of the FOV (F). Therefore, after a predetermined time has elapsed since displaying the screen of FIG. 21, the control unit 480 displays the screen of FIG. 19 again, so that the center of the displayed screen returns to its original position. The same applies when the user moves to the right and the third light emitting device 430-3 and the fourth light emitting device 430-4 move out of the range of the FOV (F).

On the other hand, as shown in FIG. 22, when the user moves upward, the first light emitting device 430-1 and the third light emitting device 430-3 may move out of the range of the FOV (F). In this case, the control unit 480 can display a screen whose center is shifted upward, as shown in FIG. 23, based on the received position information.

In this case, the control unit 480 can determine from the received position information that the first light emitting device 430-1 and the third light emitting device 430-3 are out of the range of the FOV (F). Therefore, after a predetermined time has elapsed since displaying the screen of FIG. 23, the control unit 480 displays the screen of FIG. 19 again, so that the center of the displayed screen returns to its original position. The same applies when the user moves downward and the second light emitting device 430-2 and the fourth light emitting device 430-4 move out of the range of the FOV (F).
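
Summarizing the behaviour described above, one possible control flow is sketched below for illustration only. The marker identifiers, the callback names and the recentering delay are assumptions, not values disclosed in the embodiment.

```python
import time

FRAME_MARKERS = {"430-1", "430-2", "430-3", "430-4"}   # light emitting devices on frames 401/402
RECENTER_DELAY_S = 1.5                                  # the "predetermined time" (assumed value)

def update_screen_center(visible_markers, shift_screen, recenter_screen):
    """Shift the displayed screen while some markers are outside the FOV (F),
    then return the center to its original position after the delay.

    visible_markers -- set of marker ids the camera 100 currently sees
    shift_screen    -- callback taking 'left', 'right', 'up' or 'down'
    recenter_screen -- callback restoring the original screen of FIG. 19
    """
    missing = FRAME_MARKERS - set(visible_markers)
    if not missing:
        return                                   # FIG. 18: all markers inside the FOV

    if missing == {"430-1", "430-2"}:            # user moved left (FIG. 21 case)
        shift_screen("left")
    elif missing == {"430-3", "430-4"}:          # user moved right
        shift_screen("right")
    elif missing == {"430-1", "430-3"}:          # user moved up (FIG. 23 case)
        shift_screen("up")
    elif missing == {"430-2", "430-4"}:          # user moved down
        shift_screen("down")

    time.sleep(RECENTER_DELAY_S)                 # wait the predetermined time
    recenter_screen()                            # back to the original center
```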

The case where some of the plurality of light emitting devices are out of the range of the FOV (F) has been described above. Hereinafter, the case where all of the plurality of light emitting devices are out of the range of the FOV (F), as shown in FIG. 24, will be described.

FIG. 25 is a block diagram of a mobile terminal according to another embodiment of the present invention, and FIGS. 26 to 30 are various examples of screens displayed according to the position of the mobile terminal. Hereinafter, redundant description of well-known technical contents will be omitted.

Referring to FIG. 25, the mobile terminal 400 according to another embodiment of the present invention further includes an inertial sensor 443, a barometric pressure sensor (or atmospheric pressure sensor) 445, and an infrared distance measuring sensor 444.

The infrared distance measuring sensor 444 includes a light emitting portion (not shown) and a light receiving portion (not shown). The infrared distance measuring sensor 444 forms a voltage according to the amount of infrared light that is reflected by an external object and is incident on the light receiving portion, and the distance to the external object can be measured from this voltage. Hereinafter, a screen changed by the infrared distance measuring sensor 444 will be described.
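
For reference, the voltage-to-distance conversion mentioned above could take a form such as the following sketch; the inverse relationship and the calibration constants are typical of reflective infrared range sensors and are not values disclosed for the sensor 444.

```python
def ir_voltage_to_distance(voltage_v, k=27.0, offset_v=0.1):
    """Convert the light receiving portion's output voltage to a distance estimate.

    Many reflective infrared range sensors output a voltage that falls roughly
    as the inverse of distance, so distance ~ k / (voltage - offset).  The
    constants k and offset_v are placeholder calibration values.
    """
    if voltage_v <= offset_v:
        return float("inf")            # too little reflected light: object out of range
    return k / (voltage_v - offset_v)  # distance in cm for these placeholder constants

print(ir_voltage_to_distance(1.0))     # ~30 cm with the placeholder calibration
```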

As shown in FIG. 26 (A), a user wearing the mobile terminal 400 is located at the bottom of a specific space. In this case, the mobile terminal 400 includes a first infrared distance measuring sensor (not shown) for measuring the distance to the front, a second infrared distance measuring sensor (not shown) for measuring the distance to the ceiling, and a third infrared distance measuring sensor (not shown) for measuring the distance to the floor.

Therefore, if the user is located as shown in FIG. 26 (A), a screen as shown in FIG. 27 can be displayed on the display unit 451. In this case, a first GUI 510 indicating the distance to the front, a second GUI 520 indicating the distance to the ceiling, and a third GUI 530 indicating the distance to the floor may be displayed on the screen.

Also, the first to third GUIs 510 to 530 may be displayed in the form of arrows.

Also, the length of each GUI can be proportional to the distance it indicates. For example, since the distance between the mobile terminal 400 and the ceiling is twice the distance between the mobile terminal 400 and the floor, the second GUI 520 is displayed twice as long as the third GUI 530.
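
For illustration, the proportional arrow lengths could be computed as in the following sketch; the mapping of GUI identifiers to the front, ceiling and floor distances follows the description above, while the drawing scale is an assumed value.

```python
def gui_arrow_lengths(distances_m, pixels_per_meter=40):
    """Return arrow lengths (in pixels) proportional to the measured distances.

    distances_m      -- dict mapping a GUI id (510: front, 520: ceiling, 530: floor)
                        to the measured distance in meters
    pixels_per_meter -- assumed drawing scale, not a value from the embodiment
    """
    return {gui_id: round(d * pixels_per_meter) for gui_id, d in distances_m.items()}

# Ceiling twice as far as the floor, so GUI 520 is drawn twice as long as GUI 530.
print(gui_arrow_lengths({510: 3.0, 520: 2.0, 530: 1.0}))   # {510: 120, 520: 80, 530: 40}
```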

Then, as shown in FIG. 26 (B), the user wearing the mobile terminal 400 can move to a position 1 m above the floor. Therefore, as shown in FIG. 28, the display unit 451 may display a screen in which the center is shifted upward to correspond to the moved position of the user.

In this case, after a certain time has elapsed since the display of the screen of FIG. 28, the screen of FIG. 29 is displayed so that the center of the displayed screen returns to its original position. At this time, the first to third GUIs 510 to 530 do not return to their previous values but display the distances corresponding to the user's current position.

On the other hand, the air pressure sensor 445 can measure the air pressure at the current position and convert the measured air pressure into the altitude of the current position. In addition, the inertial sensor 443 can measure acceleration, speed, direction, and distance by sensing the inertial force caused by motion. Hereinafter, a screen changed by the atmospheric pressure sensor 445, the inertial sensor 443, and the like will be described with reference to FIG. 30.

Although not shown, it is assumed that a user wearing the mobile terminal 400 is located on the seventh floor of a specific building. The air pressure sensor 445 can convert the air pressure at the user's current position into an altitude. Accordingly, the fourth GUI 540 can display a numerical value corresponding to the user's current altitude.
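
For reference, such a pressure-to-altitude conversion is commonly performed with the international barometric formula, sketched below; the sea-level reference pressure is an assumed standard value, not a figure from the embodiment.

```python
def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Convert a measured air pressure (hPa) to an approximate altitude (m)
    using the international barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(pressure_to_altitude_m(1010.7), 1))   # roughly 21 m above sea level
```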

In this case, the first GUI 510 may indicate a distance and a route along which the user can virtually move. If the user moves by the distance and in the direction indicated by the first GUI 510, the inertial sensor 443 can sense the user's movement and change of direction. Accordingly, a screen corresponding to the front left can be displayed to correspond to the user's moved position.
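
For illustration, one way such inertial readings could be accumulated into a displacement is sketched below; the sampling period and function name are assumptions, and the example handles only a single axis.

```python
def integrate_motion(accel_samples, dt=0.01, v0=0.0, x0=0.0):
    """Dead-reckon velocity and position along one axis from acceleration samples.

    accel_samples -- accelerations in m/s^2 reported by the inertial sensor
    dt            -- assumed sampling period in seconds
    Drift accumulates quickly in practice, which is one reason the camera-based
    position can remain the primary reference.
    """
    v, x = v0, x0
    for a in accel_samples:
        v += a * dt                 # integrate acceleration to velocity
        x += v * dt                 # integrate velocity to position
    return v, x

print(integrate_motion([0.5] * 100))   # one second at 0.5 m/s^2 -> (0.5 m/s, ~0.25 m)
```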

Also, if the user moves to the eighth floor, a screen shifted upward to correspond to the user's moved position may be displayed. In this case, each of the second and third GUIs 520 and 530 may display a value corresponding to the current altitude.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all types of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the controller 180 of the terminal. The foregoing detailed description should therefore not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

600A, 600B: Virtual Reality System
400: mobile terminal 410: wireless communication unit
420: AV input unit 430: user input unit
440: sensing part 450: output part
460: interface unit 470: memory
480: Control section 490: Power supply section
300: HMD (Head Mounted Display)
200: processing device
100: camera 120: image sensor

Claims (8)

  1. A virtual reality system comprising:
    an HMD (Head Mounted Display);
    a camera for detecting a position of the HMD; and
    a processing device for transmitting an image based on the detected position of the HMD to the HMD to implement a virtual reality,
    wherein the camera comprises:
    a first image sensor for photographing a first view according to a first angle;
    a second image sensor disposed on one of left and right sides or up and down sides of the first image sensor to photograph a second view according to a second angle; and
    a controller for transmitting a third view, in which the first view and the second view are synthesized, to the HMD.
  2. The virtual reality system according to claim 1,
    wherein the processing device moves the center of the screen displayed on the HMD to correspond to the position of the HMD when it is determined that the HMD is out of a FOV (Field Of View) corresponding to the third view.
  3. The virtual reality system according to claim 1,
    wherein the HMD comprises a barometric pressure sensor, and
    the processing device moves the center of the screen displayed on the HMD to correspond to the position of the HMD sensed by the barometric pressure sensor.
  4. The virtual reality system according to claim 1,
    wherein the HMD comprises a first infrared distance measuring sensor and a second infrared distance measuring sensor, and
    the processing device moves the center of the screen displayed on the HMD based on a first distance between the HMD and a first point sensed by the first infrared distance measuring sensor and a second distance between the HMD and a second point sensed by the second infrared distance measuring sensor.
  5. The virtual reality system according to claim 1,
    wherein the HMD comprises an inertial sensor, and
    the processing device moves the center of the screen displayed on the HMD to correspond to a change in direction of the HMD sensed by the inertial sensor.
  6. The virtual reality system according to claim 1,
    wherein the processing device displays, on the HMD, a GUI (Graphic User Interface) indicating position information of the HMD.
  7. A virtual reality system including a camera and a mobile terminal,
    wherein the mobile terminal displays an image based on a position of the mobile terminal detected by the camera, and
    the camera comprises:
    a first image sensor for photographing a first view according to a first angle;
    a second image sensor disposed on one of left and right sides or up and down sides of the first image sensor to photograph a second view according to a second angle; and
    a controller for transmitting a third view, in which the first view and the second view are synthesized, to the HMD.
  8. A camera for detecting a position of an HMD (Head Mounted Display) so that an image based on the detected position of the HMD is transmitted to the HMD to realize a virtual reality, the camera comprising:
    a first image sensor for photographing a first view according to a first angle;
    a second image sensor disposed on one of left and right sides or up and down sides of the first image sensor to photograph a second view according to a second angle; and
    a controller for transmitting a third view, in which the first view and the second view are synthesized, to the HMD.
KR1020150118706A 2015-08-24 2015-08-24 Camera and virtual reality system comorising thereof KR20170023491A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150118706A KR20170023491A (en) 2015-08-24 2015-08-24 Camera and virtual reality system comorising thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150118706A KR20170023491A (en) 2015-08-24 2015-08-24 Camera and virtual reality system comorising thereof

Publications (1)

Publication Number Publication Date
KR20170023491A true KR20170023491A (en) 2017-03-06

Family

ID=58399146

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150118706A KR20170023491A (en) 2015-08-24 2015-08-24 Camera and virtual reality system comorising thereof

Country Status (1)

Country Link
KR (1) KR20170023491A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019098450A1 (en) * 2017-11-15 2019-05-23 한국과학기술원 System for guiding gaze of user using mobile device and method thereof

Similar Documents

Publication Publication Date Title
US9635248B2 (en) Mobile device and method for controlling the same
KR20160136013A (en) Mobile terminal and method for controlling the same
KR20160029536A (en) Mobile terminal and control method for the mobile terminal
KR20160002000A (en) Mobile terminal and method for controlling external device using the same
KR20170128820A (en) Mobile terminal and method for controlling the same
US10185390B2 (en) Head mounted display with separate wire connected controller
KR20160006053A (en) Wearable glass-type device and control method of the wearable glass-type device
US10219026B2 (en) Mobile terminal and method for playback of a multi-view video
KR20160133230A (en) Mobile terminal
KR101678861B1 (en) Mobile terminal and method for controlling the same
KR20160025856A (en) Mobile terminal and method for controlling the same
KR20170005650A (en) Unmanned aerial vehicle, mobile terminal and method for controlling the same
KR20170006559A (en) Mobile terminal and method for controlling the same
KR20160001228A (en) Mobile terminal and method for controlling the same
KR20170014355A (en) Mobile terminal and method of controlling the same
KR20170024846A (en) Mobile terminal and method for controlling the same
KR20160074334A (en) Mobile terminal and method for controlling the same
US9904918B2 (en) Mobile terminal and control method therefor
EP2921943A1 (en) Terminal and method of processing data therein
KR101633342B1 (en) Mobile terminal and method for controlling the same
CN106155494B (en) Mobile terminal and control method thereof
KR20160149068A (en) Mobile terminal and method for controlling the same
KR20150146091A (en) Mobile terminal and method for controlling the same
KR20160019187A (en) Mobile terminal and method for controlling the same
KR20160017991A (en) Mobile terminal having smart measuring tape and object size measuring method thereof