KR20170027135A - Mobile terminal and method for controlling the same - Google Patents
- Publication number
- KR20170027135A
- Authority
- KR
- South Korea
- Prior art keywords
- mode
- information
- image
- display unit
- specific
- Prior art date
Classifications
-
- H04M1/72522—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2201/00—Electronic components, circuits, software, systems or apparatus used in telephone systems
- H04M2201/42—Graphical user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Controls And Circuits For Display Device (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A mobile terminal and a control method thereof are disclosed. In the mobile terminal and control method according to the present invention, when a first mode is selected, the display unit is kept in a transparent state and first information related to an image input through a camera is displayed so as to overlap the image input through the user's eyes; when a second mode is selected, the display unit is changed to an opaque state and a virtual image is displayed on the display unit. The first mode and the second mode are mutually switchable, and when switching from the first mode to the second mode, or from the second mode to the first mode, second information obtained in the previous mode can be displayed in the current mode. According to the present invention, continuous information can be displayed by switching between a first mode for displaying augmented reality and a second mode for displaying virtual reality using the same display unit.
Description
The present invention relates to a mobile terminal providing augmented reality or virtual reality and a control method thereof.
A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.
The functions of mobile terminals are diversified. For example, there are data and voice communication, photographing and video shooting through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals are equipped with an electronic game play function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcast and video or television programs.
Such a terminal has various functions and may take the form of a multimedia device having complex functions such as capturing photographs or video, playing music or video files, gaming, and receiving broadcasts.
In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and / or software parts of the terminal.
In addition, the terminal can display a three-dimensional image on a space using a projector or the like. Particularly, when the terminal is formed of a glass type or a head-mounted display type, it is possible to provide an augmented reality or a virtual reality by utilizing such a three-dimensional image providing technology.
When augmented reality or virtual reality is provided through the mobile terminal, switching between the augmented reality and the virtual reality is difficult.
The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a mobile terminal, and a control method thereof, capable of providing a first mode for augmented reality and a second mode for virtual reality by controlling the display unit to be transparent or opaque.
According to an aspect of the present invention, there is provided a mobile terminal including a camera, a display unit, and a control unit. When a first mode is selected, the control unit keeps the display unit in a transparent state and displays first information related to an image input through the camera on the display unit so that the first information overlaps the image input through the user's eyes. When a second mode is selected, the control unit changes the display unit to an opaque state and displays a virtual image on the display unit. The first mode and the second mode are mutually switchable, and when switching from the first mode to the second mode, or from the second mode to the first mode, the control unit controls the display unit to display second information obtained in the previous mode in the current mode.
According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, including: selecting one of a first mode and a second mode; when the first mode is selected, keeping the display unit in a transparent state and displaying first information related to an image input through a camera on the display unit so that the first information overlaps the image input through the user's eyes; when the second mode is selected, changing the display unit to an opaque state and displaying a virtual image on the display unit; and, when switching from the first mode to the second mode or from the second mode to the first mode, displaying second information obtained in the previous mode in the current mode.
Effects of the mobile terminal and the control method according to the present invention will be described as follows.
According to at least one of the embodiments of the present invention, the transparency of the display is adjusted so that augmented reality and virtual reality related to each other can be continuously provided in one terminal.
In addition, according to at least one embodiment of the present invention, information registered or sensed in the augmented reality or the virtual reality is provided in the other mode, so that the user's experience is shared between the different modes.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
FIG. 2 is a perspective view illustrating an example of a glass-type mobile terminal according to the present invention.
FIG. 3 is a flowchart illustrating a control method of a mobile terminal according to the present invention.
FIGS. 4 to 11 are diagrams for explaining various embodiments of a control method of a mobile terminal according to the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not in themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art will be omitted when it is determined that such description may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the drawings and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head-mounted display (HMD).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and digital signage, except for cases applicable only to a mobile terminal.
Referring to FIG. 1, FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
The
The
The
The
The
The
The
In addition, the
In addition to the operations related to the application program, the
In addition, the
The
At least some of the components may operate in cooperation with one another to implement a method of operation, control, or control of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the components listed above will be described in more detail with reference to FIG. 1 before explaining various embodiments implemented through the
First, referring to the
The
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
The
Wireless Internet technologies include, for example, wireless LAN (WLAN), wireless fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A). The wireless Internet module transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
The
The short-
Here, the other
The
Next, the
The
The
Meanwhile, the
First, the
Examples of the
On the other hand, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen in proximity without the object contacting the touch screen is referred to as a "proximity touch," and the act of an object actually touching the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object vertically corresponds to the touch screen when the object is proximity-touched.
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive type, a capacitive type, an infrared type, and an ultrasonic type.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch sensor, as well as the pressure and the capacitance at the time of the touch. Here, the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
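As a purely illustrative sketch (the grid layout, threshold value, and function name are assumptions, not taken from the patent), converting per-cell capacitance changes into a touch position could look like this:

```python
def detect_touch(baseline, reading, threshold=5):
    """Return (row, col) of the strongest capacitance change, or None.

    baseline/reading are 2D grids of capacitance samples; a touch is
    reported where the change from baseline exceeds the threshold.
    Illustrative only -- real controllers filter, debounce, and
    interpolate between cells.
    """
    best, best_delta = None, threshold
    for r, (b_row, s_row) in enumerate(zip(baseline, reading)):
        for c, (b, s) in enumerate(zip(b_row, s_row)):
            delta = abs(s - b)
            if delta > best_delta:
                best, best_delta = (r, c), delta
    return best

baseline = [[10, 10, 10], [10, 10, 10]]
reading  = [[10, 10, 10], [10, 22, 11]]
print(detect_touch(baseline, reading))  # (1, 1)
```

A real touch controller would then hand such coordinates, plus area and pressure, to the main controller as described above.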
Thus, when there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the
On the other hand, the
On the other hand, the touch sensors and proximity sensors discussed above can be used independently or in combination to sense various types of touch, such as a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. Meanwhile, the
The
The
The
In addition, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
Generally, a 3D stereoscopic image consists of a left image (left-eye image) and a right image (right-eye image). Depending on how the left and right images are combined into a 3D stereoscopic image, the methods include a top-down method in which the left and right images are arranged one above the other in a single frame, a left-to-right (side-by-side) method in which they are arranged side by side, a checkerboard method in which pieces of the left and right images are arranged in tile form, an interlaced method in which the left and right images are alternately arranged in columns or rows, and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed in time.
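Two of the arrangements mentioned above, the side-by-side (left-to-right) method and the row-interlaced method, can be sketched in a few lines. This is a toy example in which images are plain lists of pixel rows; the representation and function names are illustrative assumptions, not the patent's implementation:

```python
def side_by_side(left, right):
    # Left-to-right arrangement: place the two eye images next to each
    # other in a single frame.
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def interlaced_rows(left, right):
    # Row-interlaced arrangement: even rows come from the left image,
    # odd rows from the right image.
    return [left[i] if i % 2 == 0 else right[i] for i in range(len(left))]

left = [[0] * 6 for _ in range(4)]     # dummy left-eye image (4x6, black)
right = [[255] * 6 for _ in range(4)]  # dummy right-eye image (4x6, white)

sbs = side_by_side(left, right)
ilc = interlaced_rows(left, right)
print(len(sbs[0]), ilc[0][0], ilc[1][0])  # 12 0 255
```

The checkerboard and time-sequential methods differ only in how the same two source images are partitioned across space or time.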
In addition, the 3D thumbnail image may generate a left image thumbnail and a right image thumbnail from the left image and right image of the original image frame, respectively, and may be generated as one image as they are combined. In general, a thumbnail means a reduced image or a reduced still image. The left image thumbnail and the right image thumbnail generated in this way are displayed on the screen with a difference of the left and right distance by the depth corresponding to the parallax between the left image and the right image, thereby exhibiting a stereoscopic spatial feeling.
The left and right images necessary for realizing a three-dimensional stereoscopic image can be displayed on the stereoscopic display unit by a stereoscopic processing unit. The stereoscopic processing unit receives a 3D image (an image at a reference viewpoint and an image at an extended viewpoint) and sets a left image and a right image from it, or receives a 2D image and converts it into a left image and a right image.
The
The
In addition to vibration, the
The
The
The signal output from the
The
The identification module is a chip for storing various information for authenticating the use right of the
The
The
The
Meanwhile, as described above, the
In addition, the
The
In addition, the
As another example, the
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Meanwhile, the mobile terminal can be extended to a wearable device that can be worn on the body, beyond devices that the user mainly holds and uses in the hand. Such wearable devices include the smart watch, smart glass, and head-mounted display (HMD). Hereinafter, examples of mobile terminals extended to wearable devices will be described.
The wearable device can be made to be able to exchange (or interlock) data with another
FIG. 2 is a perspective view illustrating an example of a glass-type mobile terminal according to the present invention.
The glass-type
The frame portion is supported on the head portion, and a space for mounting various components is provided. As shown in the figure, electronic parts such as the
The
The
The
As described above, the image output through the
The
Although the
The glass-type
In addition, the glass-type
The
The
The WiFi Positioning System (WPS) is a system in which a
The WiFi location tracking system may include a Wi-Fi location server, a
The
The Wi-Fi position location server extracts information of the wireless AP connected to the
The information of the wireless AP to be extracted based on the location information request message of the
As described above, the Wi-Fi position location server can receive the information of the wireless AP connected to the
Then, the Wi-Fi location server can extract (or analyze) the location information of the
As a method for extracting (or analyzing) the position information of the
The Cell-ID method determines the position of the mobile terminal to be that of the wireless AP with the strongest signal strength among the neighboring wireless AP information collected by the mobile terminal. The implementation is simple, no extra cost is incurred, and location information can be acquired quickly, but positioning accuracy drops when the installation density of wireless APs is low.
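The Cell-ID idea above amounts to reporting the position of the strongest AP. A minimal sketch, assuming scans arrive as (AP position, RSSI) pairs; all names and values are hypothetical:

```python
def cell_id_position(ap_scans):
    """Pick the position of the AP with the strongest RSSI (Cell-ID method).

    ap_scans: list of (ap_position, rssi_dbm) pairs collected by the
    terminal; positions and RSSI values are illustrative.
    """
    pos, _ = max(ap_scans, key=lambda item: item[1])
    return pos

scans = [((0, 0), -72), ((10, 5), -48), ((3, 9), -60)]
print(cell_id_position(scans))  # (10, 5)
```

The accuracy limitation noted in the text is visible here: the estimate can never be finer than the spacing between APs.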
The fingerprint method selects reference positions in a service area and collects signal strength information there, then estimates the position by comparing the signal strength information transmitted from the mobile terminal against the collected information. To use the fingerprint method, the propagation characteristics must be compiled into a database in advance.
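A minimal sketch of the fingerprint comparison described above, assuming a tiny pre-built database of reference positions and per-AP signal strengths. The database entries, AP names, and squared-error metric are illustrative assumptions:

```python
# Fingerprint database built in advance: reference position -> RSSI per AP (dBm).
FINGERPRINT_DB = {
    (0.0, 0.0): {"ap1": -40, "ap2": -70},
    (5.0, 0.0): {"ap1": -70, "ap2": -45},
    (0.0, 5.0): {"ap1": -55, "ap2": -60},
}

def estimate_position(measurement):
    """Return the reference position whose stored RSSI vector is closest
    (smallest squared error) to the measured one."""
    def error(stored):
        return sum((stored[ap] - rssi) ** 2 for ap, rssi in measurement.items())
    return min(FINGERPRINT_DB, key=lambda pos: error(FINGERPRINT_DB[pos]))

print(estimate_position({"ap1": -42, "ap2": -68}))  # (0.0, 0.0)
```

Production systems interpolate between reference points and model propagation more carefully, but the nearest-fingerprint lookup is the core step.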
The triangulation method calculates the position of the mobile terminal based on the coordinates of at least three wireless APs and the distances between the mobile terminal and those APs. To measure the distance between the mobile terminal and a wireless AP, the signal strength converted into distance information, the time of arrival of a signal (ToA), the time difference of arrival of a signal (TDoA), the angle of arrival of a signal (AoA), or the like may be used.
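For illustration, once three AP-to-terminal distances are known, the position calculation can be sketched as a 2D trilateration that linearizes the three circle equations and solves the resulting 2x2 system. The AP coordinates and distances below are hypothetical:

```python
def trilaterate(aps, dists):
    """Estimate (x, y) from three AP positions and measured distances.

    Subtracting the first circle equation from the other two gives a
    linear system A * [x, y]^T = b, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = aps
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Three APs at known coordinates; terminal actually at (1, 1).
aps = [(0, 0), (4, 0), (0, 4)]
dists = [2**0.5, 10**0.5, 10**0.5]  # exact distances from (1, 1)
print(trilaterate(aps, dists))  # approximately (1.0, 1.0)
```

With noisy distances or more than three APs, a least-squares solve of the same linearized system would replace the exact 2x2 solution.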
The landmark method is a method of measuring the position of a mobile terminal using a landmark transmitter that knows the location.
Various algorithms can be utilized as a method for extracting (or analyzing) the location information of the mobile terminal.
The extracted location information of the
The
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
FIG. 3 is a flowchart for explaining a control method of a mobile terminal according to the present invention, and FIGS. 4 to 11 are views for explaining various embodiments of a control method of a mobile terminal according to the present invention. Hereinafter, a glass-type mobile terminal will be described.
Referring to FIG. 3, the
Specifically, the
When the first mode is selected, the
The
The
When the second mode is selected, the
When the prism is made non-transmissive, the user can recognize only the projected image and can not recognize the image viewed through the eyes of the user in front. That is, the user can recognize only the virtual image projected on the
The
In addition, the
Specifically, when switching from the first mode to the second mode, the
When switching from the second mode to the first mode, the
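The switching behavior described in this document (transparent display with overlaid information in the first mode, opaque display with a virtual image in the second, and "second information" carried across a switch) can be modeled schematically. All class, method, and attribute names here are illustrative assumptions, not the patent's implementation:

```python
class ModeController:
    """Toy model of the AR/VR mode switch described in the text."""

    AR, VR = "first (AR)", "second (VR)"

    def __init__(self):
        self.mode = self.AR
        self.display_transparent = True   # AR: display kept transparent
        self.carried_info = None          # "second information" from the previous mode

    def switch(self, obtained_info):
        """Switch modes, carrying information obtained in the previous mode."""
        self.carried_info = obtained_info
        if self.mode == self.AR:
            self.mode, self.display_transparent = self.VR, False
        else:
            self.mode, self.display_transparent = self.AR, True

ctrl = ModeController()
ctrl.switch(obtained_info="reservation details")  # AR -> VR
print(ctrl.mode, ctrl.display_transparent, ctrl.carried_info)
# second (VR) False reservation details
```

In the embodiments that follow, the carried information corresponds to, for example, reservation details, sensing information, or progress through exercise or cooking steps.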
When the
Specifically, the
When receiving an image including a feature such as a map, a building, and a road through the
In addition, the
When receiving an image related to shopping through the
In addition, the
In the first mode, the
Referring to FIGS. 4 to 11, specific embodiments of a control method of a mobile terminal according to the present invention will be described. FIGS. 4A to 11A are not actually displayed on the
Referring to FIG. 4, when the
Specifically, the
When a virtual image is registered corresponding to the first feature through the fourth feature, the
When the
The
The
Referring to FIG. 5, the
Specifically, the
If a virtual image corresponding to each of the first information to the fourth information is registered, the
The
The
When receiving the input for the icon VR, the
The
When receiving the image of the amusement park through the
Referring to FIG. 6, while receiving the distance view through the
Specifically, the
When a virtual image is registered corresponding to the first feature through the fourth feature, the
When the
The
The
Referring to FIG. 7, while receiving a shopping object through the
Specifically, the
When a virtual image is registered corresponding to the first shopping object to the fourth shopping object, the
When the
The
The
Referring to FIG. 8, when the
Specifically, the
The
The
The
The
Referring to FIG. 9, while the user recognizes the cooking operation through the
Specifically, the
The
The
The
Referring to FIG. 10, when the
Specifically, the
The
The
The
The
Referring to FIG. 11, while the glass-type
Specifically, the
When the
The
According to the present invention, the first mode for providing augmented reality and the second mode for providing virtual reality can be displayed using the same display unit, and information from the previous mode can be used in the mode switched to, so that the user's experience is carried across modes. Beyond the specific embodiments above, the present invention can be applied to various embodiments in which information is continuously shared between the augmented reality and the virtual reality.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the code may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the control unit of the terminal.
400: glass-type mobile terminal
403: lens 421: camera
451: display unit 452: sound output unit
480: control module
Claims (20)
A display unit; and
A control unit that keeps the display unit in a transparent state when a first mode is selected and displays first information related to an image input through a camera on the display unit so that the first information overlaps the image input through the user's eyes, and that changes the display unit to an opaque state and displays a virtual image on the display unit when a second mode is selected,
Wherein the first mode and the second mode are mutually switchable, and the control unit controls the display unit to display second information obtained in the previous mode in the current mode when switching from the first mode to the second mode or from the second mode to the first mode.
Wherein the mobile terminal is formed in a glass type.
Wherein the virtual image is related to an image input through the camera in the first mode,
Wherein the control unit switches to the second mode and displays the virtual image on the display unit when receiving an input for switching from the first mode to the second mode.
Wherein,
When receiving an input for switching from the second mode to the first mode, the control unit switches to the first mode and displays, on the display unit, at least one piece of second information obtained from the virtual image, or input or sensed while the virtual image was displayed, so that it is recognized overlapping the image input through the user's eyes in the first mode.
Wherein, in the first mode, the control unit displays an icon for switching to the second mode in an area of the display unit corresponding to a specific object for which a virtual image is registered, among the objects included in the image input through the camera.
Wherein the control unit switches from the first mode to the second mode when receiving an input to a first icon among the icons, and controls the display unit to display the registered virtual image corresponding to the area in which the first icon is displayed.
Wherein, when receiving an image including features through the camera in the first mode, the control unit displays feature information on the at least one feature included in the received image on the display unit so that it overlaps the received image,
And displays an icon for switching to the second mode in an area of the display unit corresponding to a specific feature among the at least one feature when a virtual image is registered corresponding to that specific feature.
Wherein the virtual image includes information related to the appearance of, a virtual experience of, or route guidance for the specific feature,
And the control unit switches from the first mode to the second mode when receiving an input to the icon and displays the virtual image corresponding to the specific feature on the display unit.
Wherein the control unit further displays a reservation menu for the specific feature and stores, as second information, reservation information for the specific feature obtained through the reservation menu.
Wherein, after switching from the second mode to the first mode, when receiving an image including the specific feature through the camera, the control unit displays the stored reservation information or route guidance information on an area of the display unit corresponding to the specific feature.
Wherein, when receiving an image related to shopping through the camera in the first mode, the control unit causes the display unit to display information on at least one object included in the received image so that it overlaps the received image,
And displays an icon for switching to the second mode in an area of the display unit corresponding to a specific object among the at least one object when a virtual image is registered corresponding to that specific object.
Wherein the virtual image includes use or experience information of the specific object,
Wherein the control unit switches the first mode to the second mode when receiving an input to the icon and displays a virtual image corresponding to the specific object on the display unit.
Wherein the control unit stores sensing information sensed while displaying a virtual image corresponding to the specific object as second information on the specific object.
Wherein, after switching from the second mode to the first mode, when receiving an image containing the specific object through the camera, the control unit displays the stored second information on an area of the display unit corresponding to the specific object.
Wherein, when receiving an image related to exercise or cooking through the camera in the first mode, the control unit controls the display unit to display at least one piece of learning information related to the received image so that it overlaps the received image,
And displays an icon for switching to the second mode when a virtual image corresponding to a specific exercise motion during the exercise or to a specific cooking step is registered.
Wherein the virtual image includes the specific exercise motion or the specific cooking step and a detailed description of each,
And the control unit switches from the first mode to the second mode when receiving an input to the icon and displays the specific exercise motion or the specific cooking step, together with the detailed description of each, on the display unit.
Wherein the control unit stores, as second information, confirmation of the specific exercise motion or the specific cooking step and of each detailed description.
Wherein, after switching from the second mode to the first mode, when receiving an image including the specific exercise motion or the specific cooking step through the camera, the control unit displays the stored second information on an area of the display unit corresponding to the specific exercise motion or the specific cooking step.
Wherein, when receiving an image related to driving information through the camera in the first mode, the control unit displays at least one piece of navigation information related to the received image on the display unit so that it overlaps the received image,
And displays an icon for switching to the second mode when a virtual image corresponding to traffic information related to the driving information is registered, and, when receiving an input to the icon, displays the virtual image including the traffic information.
Displaying, in the first mode, first information related to an image input through the camera on the display unit so that the first information overlaps the image seen through the user's eyes, and, in the second mode, changing the display unit to an opaque state and displaying a virtual image on the display unit; and
Displaying, in the current mode, the second information obtained in the previous mode when switching from the first mode to the second mode or from the second mode to the first mode;
And transmitting the control information to the mobile terminal.
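The control method recited above amounts to a small mode state machine. The sketch below is illustrative only, under the assumption that display opacity distinguishes the two modes; `ModeController`, `switch`, and `render` are invented names, not the patent's API. The display stays transparent in the first mode (camera overlay) and is made opaque in the second mode (virtual image), and information obtained in the previous mode is re-displayed after each switch.

```python
# Illustrative sketch of the claimed two-mode control method (names are assumptions).

class ModeController:
    def __init__(self):
        self.mode = "first"
        self.opaque = False
        self.pending = None  # second information carried across a mode switch

    def switch(self, target, info_from_current_mode):
        """Switch modes, carrying the information obtained in the current mode."""
        self.pending = info_from_current_mode
        self.mode = target
        self.opaque = (target == "second")  # opaque only in the second (VR) mode

    def render(self):
        """The current mode displays the information obtained in the previous one."""
        surface = "virtual image" if self.opaque else "camera overlay"
        return surface, self.pending

c = ModeController()
c.switch("second", info_from_current_mode="object recognized in AR")
print(c.render())  # ('virtual image', 'object recognized in AR')
c.switch("first", info_from_current_mode="steps confirmed in VR")
print(c.render())  # ('camera overlay', 'steps confirmed in VR')
```

The same `pending` slot serves both switch directions, mirroring the claim that the previous mode's second information is shown regardless of which mode is entered.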
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150123681A KR20170027135A (en) | 2015-09-01 | 2015-09-01 | Mobile terminal and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150123681A KR20170027135A (en) | 2015-09-01 | 2015-09-01 | Mobile terminal and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170027135A true KR20170027135A (en) | 2017-03-09 |
Family
ID=58402370
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150123681A KR20170027135A (en) | 2015-09-01 | 2015-09-01 | Mobile terminal and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170027135A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018182092A1 (en) * | 2017-03-31 | 2018-10-04 | LINKFLOW Co., Ltd. | Image-based transaction method and device for performing method |
US10845600B2 (en) | 2018-04-24 | 2020-11-24 | Samsung Electronics Co., Ltd. | Controllable modifiable shader layer for head mountable display |
Legal events: 2015-09-01, KR application KR1020150123681A filed (patent KR20170027135A); legal status unknown.
Similar Documents
Publication | Title |
---|---|
US10354404B2 (en) | Electronic device and control method therefor | |
KR20180024469A (en) | Mobile terminal and operating method thereof | |
KR20160141458A (en) | Mobile terminal | |
KR20180024429A (en) | Robot cleaner and a system inlduing the same | |
KR20180042777A (en) | Mobile terminal and operating method thereof | |
KR101893153B1 (en) | Mobile terminal and method for controlling the same | |
KR20170064342A (en) | Watch-type mobile terminal and method for controlling the same | |
US20160345124A1 (en) | Glasses-type terminal, and system including glasses-type terminal and signage | |
KR20170135267A (en) | Glass type mobile terminal | |
KR20180024576A (en) | Airport robot, recording medium recording program performing method of providing service thereof, and mobile terminal connecting same | |
KR20180002255A (en) | Glass type mobile terminal | |
KR101603114B1 (en) | Mobile terminal and method for controlling the same | |
KR101749393B1 (en) | Watch-type mobile terminal and dispaying method thereof | |
KR20170112527A (en) | Wearable device and method for controlling the same | |
US10271035B2 (en) | Glasses type terminal and system including glasses type terminal and signage | |
KR20170027135A (en) | Mobile terminal and method for controlling the same | |
KR20180095324A (en) | Glass type mobile device | |
KR20170023491A (en) | Camera and virtual reality system comorising thereof | |
KR20170029834A (en) | Mobile terminal and method for controlling the same | |
KR20160043266A (en) | Mobile device and method for controlling the same | |
KR20160022095A (en) | Mobile terminal for determining indoor position using local area communication, method for controlling the mobile terminal, and server therefore | |
KR20150123644A (en) | Mobile terminal and method for controlling the same | |
KR20180056035A (en) | Head mounted display and method for controlling the same | |
KR20180073959A (en) | Mobile terminal and method for controlling the same | |
KR20170040039A (en) | Mobile terminal and method for controlling the same |