KR20130053476A - Mobile terminal and method for controlling the same - Google Patents
- Publication number
- KR20130053476A (application number KR1020110118050A)
- Authority
- KR
- South Korea
- Prior art keywords
- proximity
- proximity sensor
- unit
- motion pattern
- module
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Description
The present invention relates to a mobile terminal and a control method thereof that make the terminal more convenient for the user to use.
A terminal such as a personal computer, a notebook computer, or a mobile phone can be configured to perform various functions. Examples of such functions include data and voice communication, capturing still images or video through a camera, voice recording, music file playback through a speaker system, and displaying images or video. Some terminals include additional functionality for playing games, and others are implemented as multimedia devices. Moreover, recent terminals can receive broadcast or multicast signals, allowing the user to watch video or television programs.
In general, terminals may be divided into mobile terminals and stationary terminals depending on whether they are portable, and mobile terminals may be further divided into handheld terminals and vehicle-mounted terminals.
Currently, due to the development of display technology, a number of terminals equipped with 3D functions and a touch screen have been released.
That is, the user may control the display operation of the 3D image by directly touching the touch screen provided in the terminal while watching the 3D image through the terminal.
However, when the user directly touches the touch screen while viewing the 3D image, the user may not be aware of the actual distance between the user and the touch screen due to the sense of space generated by the 3D image.
An object of the present invention is to provide a mobile terminal and a control method thereof, by which a user can control a display operation of a 3D object currently displayed on a screen by using a proximity depth and a motion pattern of a proximity touch.
According to an aspect of the present invention, there is provided a portable terminal including: a display configured to display a 3D (Dimensional) object; A proximity sensor unit including two or more proximity sensors for detecting a proximity depth and a motion pattern of an object to be approached; And a controller configured to control a display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor.
In addition, the control method of a mobile terminal according to the present invention comprises the steps of: displaying a 3D (Dimensional) object on the screen; Driving a proximity sensor unit including two or more proximity sensors for detecting a proximity depth and a motion pattern of an object approaching the screen; And controlling the display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor.
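The claimed control flow (display a 3D object, drive the proximity sensors, then control the display operation from the detected proximity depth and motion pattern) can be sketched in Python. The event fields and the specific depth-to-operation and pattern-to-operation mappings are illustrative assumptions; the patent leaves the concrete bindings to the implementation.

```python
from dataclasses import dataclass

@dataclass
class ProximityEvent:
    """One reading from the proximity sensor unit (hypothetical fields)."""
    depth: int    # quantized proximity depth level, e.g. d1 = 1, d2 = 2, d3 = 3
    pattern: str  # detected motion pattern: "up", "down", "left", "right", or "none"

def control_3d_object(event: ProximityEvent) -> str:
    """Choose a display operation for the 3D object from a proximity event.

    The bindings below are illustrative only: a directional motion pattern
    moves or rotates the object, otherwise the proximity depth scales it.
    """
    if event.pattern in ("up", "down", "left", "right"):
        return f"move/rotate 3D object {event.pattern}"
    return f"scale 3D object for depth level {event.depth}"
```

A touch-free gesture thus reduces to a small dispatch over the two sensed quantities, which is what the claims recite.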
The mobile terminal and its control method according to the present invention control the display operation of the 3D object currently displayed on the screen using the proximity depth of the user's proximity touch and the motion pattern, so that the touch screen does not need to be touched directly. It also provides the effect of providing a new type of user interface for the manipulation of 3D objects.
FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.
FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
FIG. 3 is a conceptual diagram for explaining the principle of binocular disparity.
FIG. 4 is a conceptual diagram for explaining the sense of distance and 3D depth due to binocular disparity.
FIG. 5 is a conceptual diagram illustrating a method of implementing 3D stereoscopic images in a view barrier type display unit applicable to embodiments of the present invention.
FIG. 6 is a flowchart illustrating a process of controlling a display operation of a 3D object through a proximity touch of a mobile terminal according to the present invention.
FIGS. 7 to 13 are explanatory views illustrating a process of controlling a display operation of a 3D object through a proximity touch of a mobile terminal according to the present invention.
Hereinafter, a portable terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably merely for ease of writing the specification and do not themselves carry distinct meanings or roles.
The portable terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), and a navigation device. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to portable terminals.
FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.
The
Hereinafter, the components will be described in order.
The
The
The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server for generating and transmitting broadcast signals and / or broadcast related information, or a server for receiving broadcast signals and / or broadcast related information generated by the broadcast management server and transmitting the generated broadcast signals and / or broadcast related information. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.
The broadcast-related information may refer to a broadcast channel, a broadcast program, or information related to a broadcast service provider. The broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the
The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
For example, the
The broadcast signal and / or broadcast related information received through the
The
The
The short
The
Referring to FIG. 1, an A / V (Audio / Video)
The image frame processed by the
At this time, two or
For example, the
In this case, the
The
The
The
The
The
The
The
The
In addition, the
That is, the
That is, under the control of the
The
The 3D display process of the
The
Some of these displays may be transparent or light-transmissive so that the outside can be seen through them. Such a display may be referred to as a transparent display; a typical example is the transparent OLED (TOLED). The rear structure of the
There may be two or
When the
The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller (not shown). The touch controller processes the signal (s) and transmits the corresponding data to the
The
Two or
Preferably, the
Examples of the
Hereinafter, for convenience of explanation, positioning the pointer above the touch screen so that it is recognized without contacting the screen is referred to as a "proximity touch," while bringing the pointer into actual contact with the screen is referred to as a "contact touch." The proximity-touch position of the pointer on the touch screen is the position on the screen vertically below the pointer when the pointer makes a proximity touch.
The proximity sensor detects a proximity touch, a proximity depth, and a proximity touch pattern (for example, proximity touch distance, direction, speed, time, position, and movement state). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.
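A proximity depth of the kind described here is typically obtained by quantizing the measured pointer-to-screen distance into discrete bands. The sketch below assumes three bands (d1 to d3) with hypothetical millimeter thresholds; the patent only requires that distinct proximity depths be distinguishable.

```python
from typing import Optional

def classify_proximity_depth(distance_mm: float,
                             thresholds=(10.0, 25.0, 40.0)) -> Optional[int]:
    """Quantize a measured pointer-to-screen distance into depth levels.

    Returns 1 for the closest band (d1), 2 for d2, 3 for d3, and None when
    the pointer is beyond sensing range. The millimeter thresholds are
    assumptions, not values taken from the patent.
    """
    for level, limit in enumerate(thresholds, start=1):
        if distance_mm <= limit:
            return level
    return None
```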
The
The
The
In addition to vibration, the
The
The
Specifically, the
The
Preferably, the
The
The
The
The identification module is a chip that stores various types of information for authenticating the use authority of the
When the
The
The
The
The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein include application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), and the like. It may be implemented using at least one of processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. The described embodiments may be implemented by the
According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code is stored in the
FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.
The disclosed
The body includes a case (a casing, a housing, a cover, and the like) which forms an appearance. In this embodiment, the case may be divided into a
The cases may be formed by injection-molding synthetic resin, or may be formed of a metal such as stainless steel (STS) or titanium (Ti).
The mobile terminal body, mainly the
The
The
The content input by the first or
FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
Referring to FIG. 2B, a
The flash 123 and the mirror 124 may be further disposed adjacent to the
The sound output module 152 'may be further disposed on the rear side of the portable terminal body. The
In addition to the antenna for a call or the like, an
The terminal body is equipped with a
The
Meanwhile, the display for exclusive use of the
The
Hereinafter, a 3D image control process of a mobile terminal that can be applied in embodiments of the present invention will be described.
Stereoscopic images that can be implemented on the
First, the first stereoscopic image category will be described.
The first category is a monoscopic method, in which the same image is provided to both eyes; it can be implemented with a general display unit. More specifically, the
The second category is a stereoscopic method, in which a different image is provided to each eye, exploiting the principle by which a person perceives depth when viewing an object with the naked eye. Because of the distance between them, a person's two eyes see slightly different planar images of the same object. These planar images are transmitted to the brain through the retinas, and the brain fuses them to perceive the depth and realism of a stereoscopic image. Binocular disparity, caused by the separation of the two eyes, thus creates the stereoscopic effect and is the most important element of the second category. This binocular disparity is described in more detail with reference to FIG. 3.
FIG. 3 is a conceptual diagram for explaining the principle of binocular disparity.
In FIG. 3, it is assumed that the
If the left
As a result, to implement a second-category stereoscopic image in the mobile terminal, the left-eye image and the right-eye image of the same object, seen at the same time, must arrive at the respective eyes separately through the display unit. Next, the 3D depth due to binocular disparity will be described with reference to FIG. 4.
FIG. 4 is a conceptual diagram for explaining the sense of distance and 3D depth due to binocular disparity.
Referring to FIG. 4, when the
This difference in perceived depth can be quantified as a 3D depth or 3D level.
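The quantification of 3D depth can be grounded in the viewing geometry: for eye separation e, viewing distance D, and crossed on-screen disparity p, similar triangles give a perceived distance d = D·e / (e + p). A small sketch, with units and variable names as assumptions:

```python
def perceived_distance(view_dist_mm: float, eye_sep_mm: float,
                       disparity_mm: float) -> float:
    """Distance from viewer to the fused 3D point, by similar triangles.

    A crossed on-screen disparity p between the left- and right-eye image
    points makes the object appear in front of the screen:
        p / e = (D - d) / d  =>  d = D * e / (e + p)
    Zero disparity leaves the point on the screen plane; negative
    (uncrossed) disparity pushes it behind the screen.
    """
    return view_dist_mm * eye_sep_mm / (eye_sep_mm + disparity_mm)
```

For example, at a 600 mm viewing distance and 65 mm eye separation, a 65 mm crossed disparity halves the perceived distance to 300 mm.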
Next, an implementation method of the 3D stereoscopic image will be described.
As described above, to realize a 3D stereoscopic image, the right-eye image and the left-eye image must be separated and delivered to the two eyes. Various methods for achieving this are described below.
1) Parallax barrier method
The parallax barrier method electronically drives a blocking device provided between a general display unit and the viewer's eyes to control the propagation direction of light, so that a different image reaches each eye.
This will be described with reference to FIG. 5.
FIG. 5 is a conceptual diagram illustrating a method of implementing 3D stereoscopic images in a view barrier type display unit applicable to embodiments of the present invention.
The structure of the display barrier
In addition, as shown in FIG. 5B, the
FIG. 5 illustrates that the parallax barrier moves in parallel in one axial direction, but the present invention is not limited thereto, and a parallax barrier capable of moving in parallel in two or more axial directions according to a control signal of the
2) Lens refraction method
The lenticular (lens refraction) method uses a lenticular screen provided between the display unit and the viewer's eyes: the lenses on the lenticular screen refract the traveling direction of light so that a different image reaches each eye.
3) Polarized glasses method
The polarized glasses method provides a different image to each eye through glasses whose lenses are polarized orthogonally to each other (in the case of linear polarization) or with opposite rotation directions (in the case of circular polarization).
4) Active shutter method
In the active shutter method, the left-eye and right-eye images are alternately displayed on the display unit at a predetermined cycle, and the user's glasses close the shutter over the opposite eye whenever the image for one eye is displayed, so that each image reaches only the intended eye. That is, while the left-eye image is displayed, the right-eye shutter is closed so that the image reaches only the left eye, and while the right-eye image is displayed, the left-eye shutter is closed.
It is assumed that the portable terminal according to an embodiment of the present invention described below can provide a 3D stereoscopic image to a user through the
However, since the 3D image principle described above with reference to FIGS. 4 and 5 assumes a three-dimensional object, the shape of the object differs between the left-eye image and the right-eye image. For a planar object, by contrast, the shape is the same in both images; yet if the position at which the object is placed differs between the two images, the user still perceives it at a depth. In this specification, for ease of understanding, the stereoscopic images that follow are assumed to contain planar objects; the invention can, of course, also be applied to three-dimensional objects.
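Conversely, the positional offset that gives a planar object its perceived depth follows from the same similar-triangle geometry: the on-screen disparity needed to place an object at a target distance d is p = e·(D − d) / d. A sketch, with viewing distance and eye separation as assumed parameters:

```python
def disparity_for_depth(view_dist_mm: float, eye_sep_mm: float,
                        target_dist_mm: float) -> float:
    """On-screen horizontal offset that places a planar object at a depth.

    Inverting the similar-triangle relation d = D * e / (e + p) gives
        p = e * (D - d) / d
    A positive result (crossed disparity) pops the object out of the
    screen; a negative result recedes it behind the screen plane.
    """
    return eye_sep_mm * (view_dist_mm - target_dist_mm) / target_dist_mm
```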
Hereinafter, exemplary embodiments of an operation control process of a 3D object through a proximity touch of the
FIG. 6 is a flowchart illustrating a process of controlling a display operation of a 3D object through a proximity touch of a mobile terminal according to the present invention.
Referring to FIG. 6, the
In this case, the 3D object may be an object included in the execution screen of the 3D content, the 3D content is previously stored in the
Preferably, the 3D content may be, for example, any of the following, with the 3D object being an element it contains:
- a 3D video: a person, object, or building appearing in the video;
- a 3D image in a gallery: an object in the image;
- a 3D document: a word, icon, character, or image in the document;
- a 3D preview image: a person, object, or building in the preview image;
- a 3D menu screen: a menu item;
- a 3D list: a list item;
- a 3D idle screen: an indicator icon, clock, widget, current time, or function icon;
- a 3D home screen: an application on the home screen;
- a 3D phonebook: a contact entry;
- a 3D call log list: a call log entry;
- a 3D message transmission/reception history list: a message history entry;
- 3D email: a sent or received email;
- a 3D game: an in-game object such as a unit;
- a 3D webpage: an object on the page.
As a result, the 3D content including the 3D object includes all data executable in the
Next, when the 3D object is displayed, the
In addition, the
Hereinafter, the process of FIG. 6 will be described in detail with reference to FIGS. 7 to 13.
FIGS. 7 to 9 are views illustrating a process of detecting a proximity depth and a motion pattern through the proximity sensor unit according to the present invention.
Referring to FIG. 7, the
In this case, the
Referring to FIG. 8A, the
Next, referring to FIG. 8B, the
Next, referring to FIG. 8C, the
Next, referring to FIG. 8D, the
Next, referring to FIG. 9A, the
In addition, the
In addition, the
In addition, the
Next, referring to FIG. 9B, the
In addition, when the
In addition, when the
In addition, when the
In addition, when the
In addition, when the
In addition, when the
In addition, when the
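The direction-detection behavior described around FIGS. 8 and 9, in which a motion pattern is inferred from the order in which the individual proximity sensors respond, can be sketched as follows. The sensor layout, position names, and sweep heuristic are assumptions for illustration only:

```python
def detect_motion_pattern(activations: list) -> str:
    """Infer a swipe direction from timestamped proximity-sensor hits.

    `activations` holds (sensor_position, timestamp) pairs, where the
    position is "center" for the depth sensor or "up"/"down"/"left"/
    "right" for the surrounding sensors. A pointer sweeping from one
    peripheral sensor to its opposite yields that direction; otherwise
    no pattern is reported.
    """
    opposite = {"left": "right", "right": "left", "up": "down", "down": "up"}
    peripheral = sorted((t, pos) for pos, t in activations if pos != "center")
    if len(peripheral) < 2:
        return "none"
    first, last = peripheral[0][1], peripheral[-1][1]
    return last if opposite.get(first) == last else "none"
```

For example, a hand crossing the left sensor, then the center, then the right sensor registers a rightward motion pattern.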
FIGS. 10 to 12 illustrate, as an example, the case where the 3D objects are 3D menus.
First, referring to FIG. 10A, when the
In addition, the
If the determined proximity depth is a preset d1 depth, the
In addition, the
That is, the
Next, referring to FIG. 11A, when the
For example, when the proximity motion pattern in the right direction is detected through the
In addition, the
In addition, although not shown in the drawing, the
That is, the
Next, referring to FIG. 12A, when a plurality of
In addition, the
If the determined proximity depth is a preset d1 depth, the
In addition, the
Next, FIG. 13A illustrates that the
In this case, as described above, the 3D content may be any of the content types enumerated earlier (3D video, gallery image, document, preview image, menu screen, list, idle screen, home screen, phonebook, call log list, message history list, email, game, or webpage), and the 3D object may be any object contained in that content.
In this case, when the input of the proximity motion pattern in the first direction is sensed through the
In addition, when an input of a proximity motion pattern in a second direction opposite to the first direction is detected through the
In addition, although not shown in the drawing, the
In addition, although not shown in the drawing, the
It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. Computer-readable media include all kinds of recording devices in which data readable by a computer system is stored; examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of carrier waves (e.g., transmission over the Internet). Also, the computer may include a
Accordingly, the above detailed description should not be construed as restrictive in any respect and should be considered illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
The above-described mobile terminal and control method are not limited to the configurations and methods of the embodiments described above; all or parts of the embodiments may be selectively combined so that various modifications can be made.
100: mobile terminal 110: wireless communication unit
111: broadcast receiver 112: mobile communication module
113
115: Position information module 120: A / V input section
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor 150: output section
151: Display unit 152: Acoustic output module
153: Alarm module 154: Haptic module
155: Projector Module 160: Memory
170: interface unit 180: control unit
181: Multimedia module 190: Power supply
Claims (10)
A proximity sensor unit including two or more proximity sensors for detecting a proximity depth and a motion pattern of an object to be approached; And
And a controller configured to control a display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor.
The controller is configured to drive the proximity sensors when it is detected that the 3D object is displayed on the display unit.
The proximity sensor unit detects a proximity depth of the object and a motion pattern of the object in at least one of the up, down, left, and right directions.
The proximity sensor unit may include a first proximity sensor that detects a proximity depth of the object, and at least one of second to fifth proximity sensors provided in the up, down, left, and right directions with respect to the first proximity sensor.
The controller may vary the shape of the 3D object according to the proximity depth of the object detected by the proximity sensor.
The 3D object is a 3D image,
The controller may be configured to gradually enlarge or reduce the 3D image according to the proximity depth of the object.
The 3D object is a 3D menu,
The controller may display a previous or next menu of the menu or an upper or lower menu of the menu according to the proximity depth of the object.
The controller may be configured to move the 3D object in the same direction as the motion pattern of the object detected by the proximity sensor.
The controller may be configured to rotate the 3D object according to a motion pattern of the object detected by the proximity sensor.
Driving a proximity sensor unit including two or more proximity sensors for sensing a proximity depth and a motion pattern of an object approaching the screen; And
And controlling the display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110118050A KR20130053476A (en) | 2011-11-14 | 2011-11-14 | Mobile terminal and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020110118050A KR20130053476A (en) | 2011-11-14 | 2011-11-14 | Mobile terminal and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20130053476A true KR20130053476A (en) | 2013-05-24 |
Family
ID=48662669
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020110118050A KR20130053476A (en) | 2011-11-14 | 2011-11-14 | Mobile terminal and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20130053476A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150129370A (en) * | 2014-05-12 | 2015-11-20 | 서순석 | Apparatus for control object in cad application and computer recordable medium storing program performing the method thereof |
US9552644B2 (en) | 2014-11-17 | 2017-01-24 | Samsung Electronics Co., Ltd. | Motion analysis method and apparatus |
- 2011-11-14: application KR1020110118050A filed, published as KR20130053476A (status: not active, application discontinued)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |