KR20130053476A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same Download PDF

Info

Publication number
KR20130053476A
Authority
KR
South Korea
Prior art keywords
proximity
proximity sensor
unit
motion pattern
module
Prior art date
Application number
KR1020110118050A
Other languages
Korean (ko)
Inventor
하상우
박재환
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020110118050A
Publication of KR20130053476A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 - Selection of displayed objects or displayed text elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

PURPOSE: A portable terminal and a control method thereof are provided to control the display operation of a three-dimensional object displayed on the current screen by using the proximity depth and motion pattern of a proximity touch. CONSTITUTION: A display unit(151) displays a three-dimensional object. A proximity sensor unit includes two or more proximity sensors, which sense the proximity depth and motion pattern of an approaching object. A control unit(180) controls the display operation of the three-dimensional object according to the proximity depth and the motion pattern sensed by the proximity sensor unit. [Reference numerals] (110) Wireless communication unit; (111) Broadcasting reception module; (112) Mobile communication module; (113) Wireless Internet module; (114) Local area communication module; (115) Location information module; (120) A/V input unit; (121) Camera; (122) Microphone; (130) User input unit; (140) Sensing unit; (141) Proximity sensor; (150) Output unit; (151) Display unit; (152) Sound output module; (153) Alarm unit; (154) Haptic module; (155) Projector module; (160) Memory; (170) Interface unit; (180) Control unit; (181) Multimedia module; (190) Power supplying unit

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a portable terminal and a control method thereof that allow the terminal to be used with greater consideration for the convenience of the user.

A terminal such as a personal computer, a notebook computer, or a mobile phone can be configured to perform various functions. Examples of such functions include data and voice communication, capturing still images or video through a camera, voice recording, music file playback through a speaker system, and image or video display. Some terminals include additional functionality for playing games, and other terminals are implemented as multimedia devices. Moreover, recent terminals can receive broadcast or multicast signals to allow the user to view video or television programs.

In general, terminals can be divided into mobile terminals and stationary terminals depending on whether they are portable. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals.

Currently, owing to advances in display technology, many terminals equipped with 3D functions and a touch screen have been released.

That is, the user may control the display operation of the 3D image by directly touching the touch screen provided in the terminal while watching the 3D image through the terminal.

However, when the user directly touches the touch screen while viewing a 3D image, the user may misjudge the actual distance between the finger and the touch screen because of the sense of depth created by the 3D image.

An object of the present invention is to provide a mobile terminal and a control method thereof, by which a user can control a display operation of a 3D object currently displayed on a screen by using a proximity depth and a motion pattern of a proximity touch.

According to an aspect of the present invention, there is provided a portable terminal including: a display unit configured to display a 3D (three-dimensional) object; a proximity sensor unit including two or more proximity sensors for detecting a proximity depth and a motion pattern of an approaching object; and a controller configured to control a display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor unit.

In addition, a control method of a mobile terminal according to the present invention comprises the steps of: displaying a 3D (three-dimensional) object on a screen; driving a proximity sensor unit including two or more proximity sensors for detecting a proximity depth and a motion pattern of an object approaching the screen; and controlling the display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor unit.

The mobile terminal and its control method according to the present invention control the display operation of the 3D object currently displayed on the screen using the proximity depth and motion pattern of the user's proximity touch, so that the touch screen does not need to be touched directly. They also provide a new type of user interface for manipulating 3D objects.

FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.
FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.
FIG. 3 is a conceptual diagram for explaining the principle of binocular disparity.
FIG. 4 is a conceptual diagram for explaining a sense of distance and 3D depth due to binocular disparity.
FIG. 5 is a conceptual diagram illustrating a method of implementing a 3D stereoscopic image in a parallax barrier type display unit applicable to embodiments of the present invention.
FIG. 6 is a flowchart illustrating a process of controlling a display operation of a 3D object through a proximity touch of a mobile terminal according to the present invention.
FIGS. 7 to 13 are explanatory views illustrating a process of controlling a display operation of a 3D object through a proximity touch of a mobile terminal according to the present invention.

Hereinafter, a portable terminal related to the present invention will be described in detail with reference to the drawings. The suffixes "module" and "unit" used for components in the following description are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles.

The portable terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV or a desktop computer, except where the configuration is applicable only to a portable terminal.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, and a portable terminal having more or fewer components may be implemented.

Hereinafter, the components will be described in order.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the portable terminal 100 and a wireless communication system, or between the portable terminal 100 and a network in which the portable terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.

The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may refer to a server that generates and transmits a broadcast signal and/or broadcast related information, or a server that receives a previously generated broadcast signal and/or broadcast related information and transmits it to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and also a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.

The broadcast related information may refer to information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module 112.

The broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), DVB-CB, OMA-BCAST, and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may be adapted not only to the digital broadcasting systems described above but also to other broadcasting systems.

The broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include voice call signals, video call signals, or various types of data according to text/multimedia message transmission and reception.

The wireless Internet module 113 refers to a module for wireless Internet access, and may be built into or externally mounted on the mobile terminal 100. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short range communication module 114 refers to a module for short range communication. Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used as a short range communication technology.

The location information module 115 is a module for acquiring the location of the portable terminal, and a representative example thereof is a Global Positioning System (GPS) module. According to current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites, and then applies triangulation to the calculated information to accurately calculate three-dimensional location information in terms of latitude, longitude, and altitude. At present, a method of calculating position and time information using three satellites and correcting errors in the calculated position and time information using one additional satellite is widely used. In addition, the GPS module 115 can calculate speed information by continuously calculating the current position in real time.
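The position fix described above can be illustrated with a small numerical sketch. The following Python snippet is not part of the patent; it is a hedged example, with hypothetical satellite coordinates and ranges, of how a receiver position can be estimated from distances to several satellites by iterative least squares.

```python
import numpy as np

def estimate_position(sat_positions, ranges, iterations=10):
    """Estimate a 3D receiver position from satellite positions and measured ranges
    using Gauss-Newton least squares (illustrative only)."""
    pos = np.zeros(3)  # start from an arbitrary initial guess
    for _ in range(iterations):
        dists = np.linalg.norm(sat_positions - pos, axis=1)   # predicted ranges
        residuals = ranges - dists                            # measurement error
        jacobian = (pos - sat_positions) / dists[:, None]     # d(range)/d(position)
        delta, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        pos = pos + delta                                     # refine the estimate
    return pos

# Hypothetical satellite positions (metres) and measured distances to the receiver.
sats = np.array([[15e6, 0, 20e6], [-15e6, 5e6, 20e6], [0, -15e6, 22e6], [5e6, 15e6, 21e6]])
true_pos = np.array([1.0e6, 2.0e6, 0.5e6])
measured = np.linalg.norm(sats - true_pos, axis=1)
print(estimate_position(sats, measured))  # should approach true_pos
```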

Referring to FIG. 1, an A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode. The processed image frame can be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110.

At this time, two or more cameras 121 may be provided depending on the use environment.

For example, the camera 121 may include first and second cameras 121a and 121b for capturing 3D images, provided on the surface opposite to the surface on which the display unit 151 of the mobile terminal 100 is provided, and a third camera 121c for the user's self-photographing, provided on a portion of the surface on which the display unit 151 is provided.

In this case, the first camera 121a is for capturing the left-eye image, which is a source image of the 3D image, and the second camera 121b is for capturing the right-eye image.

The microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and then output. Various noise removal algorithms may be implemented in the microphone 122 to remove noise generated while receiving the external sound signal.

The user input unit 130 generates input data for a user to control the operation of the terminal.

The user input unit 130 may receive from the user a signal designating two or more contents among the displayed contents according to the present invention. A signal for designating two or more contents may be received via the touch input, or may be received via the hard key and soft key input.

The user input unit 130 may receive an input from the user for selecting the one or more contents. It may also receive an input from the user for generating an icon related to a function that the portable terminal 100 can perform.

The user input unit 130 may include a direction key pad, a keypad, a dome switch, a touch pad (static pressure/capacitance), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the portable terminal 100, such as the open/closed state of the portable terminal 100, the position of the portable terminal 100, whether the user is in contact with it, and the orientation of the portable terminal, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the portable terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is opened or closed. It may also sense whether the power supply unit 190 supplies power, whether the interface unit 170 is coupled to an external device, and the like. The sensing unit 140 may include a proximity sensor unit 141, which will be described later in relation to the touch screen.

The output unit 150 is for generating output related to the senses of sight, hearing, or touch, and may include a display unit 151, a sound output module 152, an alarm unit 153, a haptic module 154, a projector module 155, and the like.

The display unit 151 displays (outputs) the information processed in the portable terminal 100. For example, when the portable terminal is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed and / or received image, UI, or GUI is displayed.

In addition, the display unit 151 according to the present invention supports 2D and 3D display modes.

That is, the display unit 151 according to the present invention may have a configuration in which a switch liquid crystal 151b is combined with a general display device 151a, as shown in FIG. 5 described below. As shown in FIG. 5(a), the optical parallax barrier 50 is operated by using the switch liquid crystal 151b to control the traveling direction of light, so that different light reaches the left and right eyes. Therefore, when an image in which a right-eye image and a left-eye image are combined is displayed on the display device 151a, the user sees the image corresponding to each eye and feels as if the image is displayed in three dimensions.

That is, under the control of the controller 180, in the 2D display mode the display unit 151 drives only the display device 151a, without driving the switch liquid crystal 151b and the optical parallax barrier 50, and thus performs a normal 2D display operation.

In the 3D display mode, the display unit 151 drives the switch liquid crystal 151b, the optical parallax barrier 50, and the display device 151a under the control of the controller 180 to perform the 3D display operation.

The 3D display process of the display unit 151 will be described later in detail with reference to FIGS. 3 to 5.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional display (3D display).

Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and a typical example of the transparent display is TOLED (Transparent OLED) and the like. The rear structure of the display unit 151 may also be of a light transmission type. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.

There may be two or more display units 151 depending on the embodiment of the portable terminal 100. For example, in the portable terminal 100, a plurality of display units may be spaced apart from one another or disposed integrally on one surface, or may be disposed on different surfaces.

When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 151 may also be used as an input device in addition to an output device. The touch sensor may have the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in a pressure applied to a specific portion of the display unit 151 or a capacitance generated in a specific portion of the display unit 151 into an electrical input signal. The touch sensor can be configured to detect not only the position and area to be touched but also the pressure at the time of touch.

If there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller (not shown). The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
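As a rough sketch only (the class and function names below are assumptions, not part of the patent), the signal path just described can be summarized as follows: the touch sensor converts a change in pressure or capacitance into an input signal, the touch controller processes it, and the controller 180 receives the touched coordinates.

```python
# Illustrative sketch of the touch signal path; all names are hypothetical.
class TouchController:
    def process(self, raw_change):
        """Convert a raw pressure/capacitance change into position and area data."""
        return {"position": raw_change["position"], "area": raw_change["area"]}

class MainController:                       # stands in for the controller 180
    def handle_touch(self, position):
        print(f"touched area of display unit 151 at {position}")

def on_touch_event(raw_change, touch_controller, main_controller):
    data = touch_controller.process(raw_change)     # touch controller processes the signal(s)
    main_controller.handle_touch(data["position"])  # controller learns which area was touched

on_touch_event({"position": (120, 45), "area": 9}, TouchController(), MainController())
```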

The proximity sensor unit 141 refers to a sensor that detects, without mechanical contact, the presence or absence of an object approaching a predetermined detection surface or an object existing nearby, by using electromagnetic force or infrared rays. A proximity sensor has a longer life span than a contact sensor, and its utility is also higher.

Two or more proximity sensors 141 may be disposed in an inner region of the mobile terminal 100 surrounded by the touch screen 151 or near the touch screen 151. That is, the proximity sensor unit 141 is composed of two or more proximity sensors and detects a proximity depth and a motion pattern of an object to be approached.

Preferably, the proximity sensor unit 141 may be provided in the form of a direction key of the user input unit 130. It detects a motion pattern of an approaching object in at least one of the upward, downward, left, right, diagonal, and rotational directions, and outputs a signal corresponding to the detection result to the controller 180. The operation of the proximity sensor unit 141 will be described later in detail with reference to FIGS. 7 to 12.

Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, it is configured to detect the proximity of the pointer by a change in the electric field according to the proximity of the pointer. In this case, the touch screen (touch sensor) may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of positioning the pointer over the touch screen without contact so that it is recognized as being located on the touch screen is referred to as a "proximity touch," and the act of actually bringing the pointer into contact with the touch screen is referred to as a "contact touch." The position at which a proximity touch is made on the touch screen means the position at which the pointer vertically corresponds to the touch screen when the pointer makes the proximity touch.

The proximity sensor detects a proximity touch, a proximity depth, and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state). Information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.

The sound output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 152 also outputs sound signals related to functions (e.g., call signal reception tones, message reception tones, etc.) performed in the portable terminal 100. The audio output module 152 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 outputs a signal for notifying the occurrence of an event of the portable terminal 100. Examples of events occurring in the portable terminal include reception of a call signal, reception of a message, input of a key signal, and touch input. The alarm unit 153 may output a signal for notifying the occurrence of an event in a form other than a video signal or an audio signal, for example, as vibration. Since the video signal or the audio signal may also be output through the display unit 151 or the sound output module 152, the display unit 151 and the sound output module 152 may be regarded as a part of the alarm unit 153.

The haptic module 154 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of vibration generated by the haptic module 154 can be controlled. For example, different vibrations may be synthesized and output or sequentially output.

In addition to vibration, the haptic module 154 may generate various other tactile effects, such as the effect of a pin array moving vertically against the skin surface in contact, the jetting or suction force of air through a jet or inlet port, a brush against the skin surface, contact of an electrode, electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.

The haptic module 154 can be implemented not only to transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sensation of the finger or arm. The haptic module 154 may include two or more haptic modules 154 according to the configuration of the portable terminal 100.

The projector module 155 is a component for performing an image projection function using the portable terminal 100. According to a control signal of the controller 180, it displays, on an external screen or wall, an image that is identical to or at least partially different from the image displayed on the display unit 151.

Specifically, the projector module 155 may include a light source (not shown) that generates light (for example, laser light) for outputting an image to the outside, an image producing means (not shown) for producing the image to be output to the outside using the light generated by the light source, and a lens (not shown) for enlarging and outputting the image at a predetermined focal distance to the outside. Further, the projector module 155 may include a device (not shown) capable of adjusting the image projection direction by mechanically moving the lens or the entire module.

The projector module 155 may be classified into a CRT (Cathode Ray Tube) module, an LCD (Liquid Crystal Display) module, and a DLP (Digital Light Processing) module according to the type of display means. In particular, the DLP module enlarges and projects an image generated by reflecting the light generated from the light source onto a DMD (Digital Micromirror Device) chip, which may be advantageous for miniaturization of the projector module 155.

Preferably, the projector module 155 may be provided in the longitudinal direction on the side, front, or back of the portable terminal 100. Of course, the projector module 155 may be provided at any position of the mobile terminal 100 as necessary.

The memory 160 may store programs for the processing and control of the controller 180, and may temporarily store input/output data (for example, a phone book, messages, audio, still images, electronic books, history, and the like). The memory 160 may also store the frequency of use of each of these data (for example, the frequency of use of each phone number, each message, and each multimedia item). In addition, the memory 160 may store data on vibrations and sounds of various patterns that are output when a touch is input on the touch screen.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk. The portable terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as a path to all external devices connected to the portable terminal 100. The interface unit 170 receives data or power from an external device and delivers it to each component inside the portable terminal 100, or allows data inside the portable terminal 100 to be transmitted to an external device. For example, a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.

The identification module is a chip that stores various kinds of information for authenticating the use authority of the portable terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the portable terminal 100 is connected to an external cradle, the interface unit may serve as a path through which power from the cradle is supplied to the portable terminal 100, or as a path through which various command signals input by the user through the cradle are transmitted to the portable terminal. The various command signals or power input from the cradle may operate as signals for recognizing that the portable terminal has been correctly mounted on the cradle.

The controller 180 typically controls the overall operation of the portable terminal, for example, performing control and processing related to voice calls, data communication, video calls, and the like. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180, or may be implemented separately from the controller 180.

The controller 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein include application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), and the like. It may be implemented using at least one of processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. The described embodiments may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code can be implemented in a software application written in a suitable programming language. The software code is stored in the memory 160 and can be executed by the controller 180.

FIG. 2A is a front perspective view of an example of a mobile terminal according to the present invention.

The disclosed portable terminal 100 has a bar-shaped main body. However, the present invention is not limited thereto, and can be applied to various structures such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are relatively movably coupled.

The body includes a case (a casing, a housing, a cover, and the like) which forms an appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. A variety of electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injecting synthetic resin or may be formed of a metal material, for example, a metal material such as stainless steel (STS) or titanium (Ti).

A display unit 151, a sound output unit 152, a third camera 121c, user input units 130 (131 and 132), a microphone 122, an interface 170, and the like may be arranged in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are disposed in an area adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed in an area adjacent to the other end. The user input unit 132, the interface 170, and the like may be disposed on the side surfaces of the front case 101 and the rear case 102.

The user input unit 130 is operated to receive a command for controlling the operation of the portable terminal 100 and may include a plurality of operation units 131 and 132. The operation units 131 and 132 may be collectively referred to as a manipulating portion.

The content input by the first or second operation units 131 and 132 may be variously set. For example, the first operation unit 131 may receive commands such as start, end, and scroll, and the second operation unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching the display unit 151 to a touch recognition mode.

FIG. 2B is a rear perspective view of the portable terminal shown in FIG. 2A.

Referring to FIG. 2B, a third camera 121c may be additionally mounted on the rear of the portable terminal body, that is, on the rear case 102. The third camera 121c may have a photographing direction substantially opposite to that of the first and second cameras 121a and 121b, and may have the same or a different number of pixels.

The flash 123 and the mirror 124 may be further disposed adjacent to the third camera 121c. The flash 123 shines light toward the subject when the subject is photographed by the third camera 121c. The mirror 124 allows the user to see his / her own face or the like when photographing (self-photographing) the user using the third camera 121c.

The sound output module 152 'may be further disposed on the rear side of the portable terminal body. The sound output unit 152 ′ may implement a stereo function together with the sound output module 152 (see FIG. 2A), and may be used to implement a speakerphone mode during a call.

In addition to an antenna for calls and the like, an antenna 116 for receiving broadcast signals may be additionally disposed on the side of the portable terminal body. The antenna 116, which constitutes a part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be pulled out from the terminal body.

The terminal body is equipped with a power supply unit 190 for supplying power to the portable terminal 100. The power supply unit 190 may be built in the terminal body or may be directly detachable from the outside of the terminal body.

The rear case 102 may further include a touch pad 135 for sensing touch. Like the display unit 151, the touch pad 135 may be of a light-transmitting type. In this case, if the display unit 151 is configured to output visual information on both sides (that is, toward both the front and rear of the portable terminal), the visual information can also be recognized through the touch pad 135. The information output to both sides may all be controlled by the touch pad 135.

Meanwhile, a display dedicated to the touch pad 135 may be separately installed, so that a touch screen may also be disposed in the rear case 102.

The touch pad 135 operates in association with the display unit 151 of the front case 101. The touch pad 135 may be disposed parallel to the rear of the display unit 151. The touch pad 135 may have a size equal to or smaller than that of the display unit 151.

Hereinafter, a 3D image control process of a mobile terminal that can be applied in embodiments of the present invention will be described.

Stereoscopic images that can be implemented on the display unit 151 of the portable terminal can be largely classified into two categories. The criterion for this classification is whether or not different images are provided in both eyes.

First, the first stereoscopic image category will be described.

The first category is a monoscopic method in which the same image is provided to both eyes, which can be implemented with a general display unit. More specifically, the controller 180 arranges a polyhedron generated from one or more points, lines, planes, or a combination thereof in a virtual three-dimensional space, and displays on the display unit 151 an image of the polyhedron viewed from a specific viewpoint. Therefore, the substantial part of such a stereoscopic image can be said to be a planar image.

The second category is a stereoscopic method in which different images are provided to the two eyes. It uses the principle by which a person perceives a three-dimensional feeling when looking at an object with the naked eye. That is, because of the distance between them, a person's two eyes see different planar images when looking at the same object. These different planar images are transmitted to the brain through the retinas, and the brain fuses them so that the depth and realism of a stereoscopic image are felt. Therefore, binocular disparity, caused by the distance between the two eyes, produces the sense of stereoscopic depth, and this binocular disparity is the most important element of the second category. This binocular disparity will be described in more detail with reference to FIG. 3.

FIG. 3 is a conceptual diagram for explaining the principle of binocular disparity.

In FIG. 3, it is assumed that the hexahedron 31, positioned in front of and below eye level, is viewed with the naked eye. In this case, the left eye sees a left-eye planar image 32 in which only three surfaces of the hexahedron 31, namely the top, front, and left sides, are visible. The right eye sees a right-eye planar image 33 in which only three surfaces of the hexahedron 31, namely the top, front, and right sides, are visible.

If the left-eye planar image 32 reaches the left eye and the right-eye planar image 33 reaches the right eye, a person can feel as if he or she is actually seeing the hexahedron 31, even when there is no actual object in front of the eyes.

As a result, in order to implement the stereoscopic image of the second category in the mobile terminal, the left-eye image and the right-eye image of the same object, which differ by binocular disparity, must be separated so that each reaches the corresponding eye through the display unit. Next, the 3D depth due to binocular disparity will be described with reference to FIG. 4.

FIG. 4 is a conceptual diagram for explaining a sense of distance and 3D depth due to binocular disparity.

Referring to FIG. 4, when the hexahedron 40 is viewed at a distance d1 through both eyes, the proportion of the side surfaces in the image entering each eye is relatively higher than when the hexahedron 40 is viewed at a distance d2, and the difference between the images seen by the two eyes is also greater. In addition, the degree of three-dimensional feeling that a person experiences when viewing the hexahedron 40 at the distance d1 is greater than when viewing it at the distance d2. In other words, when a person looks at an object through both eyes, the closer the object, the greater the three-dimensional feeling, and the farther the object, the weaker the three-dimensional feeling.

This difference in the degree of three-dimensional feeling can be quantified as a 3D depth or a 3D level.
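As a rough numerical illustration (not part of the patent, and with an assumed inter-ocular distance), the relationship between viewing distance and binocular disparity can be sketched as follows: the disparity angle subtended by the two eyes shrinks as the object moves away, which matches the weaker three-dimensional feeling for distant objects described above.

```python
import math

def disparity_angle(distance_m, eye_separation_m=0.065):
    """Angle (radians) between the two eyes' lines of sight to a point object
    straight ahead at the given distance; a larger angle means a stronger depth cue."""
    return 2.0 * math.atan((eye_separation_m / 2.0) / distance_m)

# A near object (d1 = 0.5 m) yields a larger disparity than a far one (d2 = 2.0 m).
print(disparity_angle(0.5), disparity_angle(2.0))
```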

Next, an implementation method of the 3D stereoscopic image will be described.

As described above, in order to realize a 3D stereoscopic image, the right-eye image and the left-eye image must be separated and delivered to the two eyes respectively. Various methods for this are described below.

1) Parallax barrier method

The parallax barrier method is a method of electronically driving a blocking device provided between a general display unit and both eyes to control the propagation direction of light, so that different images reach the two eyes.

This will be described with reference to FIG.

FIG. 5 is a conceptual diagram illustrating a method of implementing a 3D stereoscopic image in a parallax barrier type display unit applicable to embodiments of the present invention.

The parallax barrier type display unit 151 for displaying a 3D stereoscopic image may have a configuration in which a switch liquid crystal 151b is combined with a general display device 151a. Using the switch liquid crystal 151b, the optical parallax barrier 50 may be operated as shown in FIG. 5(a) to control the traveling direction of light, so that different light reaches the left and right eyes. Therefore, when an image in which a right-eye image and a left-eye image are combined is displayed on the display device 151a, the user sees the image corresponding to each eye and perceives it as if it were displayed in three dimensions.

In addition, as shown in FIG. 5(b), the parallax barrier 50 formed by the switch liquid crystal may be electrically controlled so that all of the light is transmitted, eliminating the separation of light by the parallax barrier so that the same image is seen by the left and right eyes. In this case, the same function as a general display unit is performed.

FIG. 5 illustrates a parallax barrier that moves in parallel along one axial direction, but the present invention is not limited thereto; a parallax barrier capable of moving in parallel along two or more axial directions according to a control signal of the controller 180 may also be used.
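As a minimal sketch only (the class and attribute names are assumptions, not the device driver described in the patent), the 2D/3D switching behaviour of the parallax barrier display can be summarized as follows: in the 2D mode only the display device 151a is driven, while in the 3D mode the switch liquid crystal 151b also forms the barrier that separates the left-eye and right-eye light paths.

```python
# Illustrative model of the 2D/3D switch of the parallax barrier display unit.
class ParallaxBarrierDisplay:
    def __init__(self):
        self.barrier_enabled = False   # switch liquid crystal 151b not driven

    def set_mode(self, mode):
        if mode == "2D":
            # Only the display device is driven; light is not separated,
            # so both eyes see the same image (normal 2D operation).
            self.barrier_enabled = False
        elif mode == "3D":
            # The switch liquid crystal forms the optical parallax barrier,
            # so the combined left/right image is split between the eyes.
            self.barrier_enabled = True
        else:
            raise ValueError(f"unknown display mode: {mode}")

display = ParallaxBarrierDisplay()
display.set_mode("3D")
```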

2) Lens refraction method

The lens refraction (lenticular) method uses a lenticular screen provided between the display unit and both eyes; the traveling direction of light is refracted by the lenses on the lenticular screen so that different images reach the two eyes.

3) Polarized glasses method

In the polarized glasses method, the two images are polarized so that their polarization directions are orthogonal to each other in order to provide different images to the two eyes, or, in the case of circular polarization, so that their rotation directions differ from each other.

4) Active shutter method

In the active shutter method, the left-eye image and the right-eye image are alternately displayed at a predetermined cycle through the display unit, and the user's glasses close the shutter of the opposite eye whenever the image for a given eye is displayed, so that each image reaches only the corresponding eye. That is, while the left-eye image is displayed, the shutter of the right eye is closed so that the left-eye image reaches only the left eye, and while the right-eye image is displayed, the shutter of the left eye is closed so that the right-eye image reaches only the right eye.
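The alternation can be sketched as follows; this is an illustrative example only, with assumed names, and not an implementation from the patent.

```python
def shutter_schedule(frame_index):
    """For a given display cycle, return which eye image is shown and which
    shutter of the glasses is closed, per the active shutter method."""
    if frame_index % 2 == 0:
        return {"image": "left_eye", "closed_shutter": "right"}
    return {"image": "right_eye", "closed_shutter": "left"}

# Frames alternate: even frames show the left-eye image with the right shutter closed.
print([shutter_schedule(i) for i in range(4)])
```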

It is assumed that the portable terminal according to an embodiment of the present invention described below can provide a 3D stereoscopic image to the user through the display unit 151 by any one of the methods described above.

However, since the principle of the 3D image described above with reference to FIGS. 4 and 5 assumes a three-dimensional object, the shape of the object differs between the left-eye image and the right-eye image. In the case of a planar object rather than a three-dimensional object, the shape of the object is the same in the left-eye image and the right-eye image; however, if the position at which the object is arranged differs between the two images, the user can still perceive perspective for that object. In this specification, for ease of understanding, it is assumed that the stereoscopic images described below concern planar objects, but it is apparent that the present invention can also be applied to three-dimensional objects.

Hereinafter, exemplary embodiments of an operation control process of a 3D object through a proximity touch of the mobile terminal 100 according to the present invention will be described in detail with reference to FIGS. 6 to 13.

FIG. 6 is a flowchart illustrating a process of controlling a display operation of a 3D object through a proximity touch of a mobile terminal according to the present invention.

Referring to FIG. 6, the controller 180 displays a 3D object selected by the user through the touch screen 151 or the user input unit 130 [S110].

In this case, the 3D object may be an object included in the execution screen of 3D content that is stored in advance in the memory 160 or received from the outside through the wireless communication unit 110, or it may be an object included in a 3D preview image generated based on the left-eye and right-eye images input through the first and second cameras 121a and 121b.

Preferably, the 3D content may be a 3D video, and the 3D object may be an object such as a person, an object, or a building included in the 3D video. Also, the 3D content may be a 3D image in a gallery, and the 3D object may be an object in the 3D image. Also, the 3D content may be a 3D document, and the 3D object may be an object such as a word, an icon, a character, or an image included in the 3D document. The 3D content may be a 3D preview image, and the 3D object may be an object such as a person, an object, or a building included in the 3D preview image. The 3D content may be a 3D menu screen, and the 3D object may be an object such as a menu item included in the 3D menu screen. The 3D content may be a 3D list, and the 3D object may be an object such as an item included in the 3D list. Also, the 3D content may be a 3D idle screen, and the 3D object may be an object such as an indicator icon, a clock, a widget, a current time, a function icon, and the like included in the 3D idle screen. In addition, the 3D content may be a 3D home screen, and the 3D object may be applications included in the 3D home screen. In addition, the 3D content may be a 3D phonebook, and the 3D object may be contact information included in the 3D phonebook. Also, the 3D content may be a 3D call log list, and the 3D object may be call log information included in the 3D call log list. The 3D content may be a 3D message transmission and reception history list, and the 3D object may be message transmission and reception history information included in the 3D message transmission and reception history list. Also, the 3D content may be 3D email, and the 3D object may be transmitted and received emails included in the 3D email. Also, the 3D content may be a 3D game, and the 3D object may be objects such as a unit included in the 3D game. Also, the 3D content may be a 3D webpage, and the 3D object may be objects included in the 3D webpage.

In short, the 3D content containing the 3D object encompasses all data that can be executed in the mobile terminal 100 and displayed in 3D.

Next, when the 3D object is displayed, the controller 180 drives the proximity sensor unit 141 and detects the proximity depth and motion pattern of an object approaching the driven proximity sensor unit 141 [S120].

In addition, the controller 180 controls the display operation of the 3D object according to the detected proximity depth and the motion pattern of the proximity touch [S130].
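Taken together, steps S110 to S130 can be sketched as the following control loop. This is only a schematic illustration; the object and method names are assumptions, not identifiers from the patent.

```python
def control_3d_object(display, proximity_sensor_unit, controller, selected_object):
    """Schematic of FIG. 6: display a 3D object, read the proximity sensors,
    and let the controller adjust the object's display operation."""
    display.show(selected_object)                      # S110: display the 3D object
    proximity_sensor_unit.start()                      # S120: drive the proximity sensor unit
    depth, pattern = proximity_sensor_unit.read()      #       detect proximity depth and motion pattern
    controller.apply(selected_object, depth, pattern)  # S130: control the display operation accordingly
```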

Hereinafter, the process of FIG. 6 will be described in detail with reference to FIGS. 7 to 13.

FIGS. 7 to 9 are views illustrating the proximity sensor unit according to the present invention and the process of detecting a proximity depth and a motion pattern through the proximity sensor unit.

Referring to FIG. 7, the proximity sensor unit 141 detects the proximity depth of an approaching object and a motion pattern of the approaching object in at least one of the up, down, left, and right directions, and outputs the result to the controller 180.

In this case, the proximity sensor unit 141 may include a first proximity sensor 141A for detecting the proximity depth of an approaching object, a second proximity sensor 141B located above the first proximity sensor 141A, a third proximity sensor 141C located below the first proximity sensor 141A, a fourth proximity sensor 141D located to the left of the first proximity sensor 141A, and a fifth proximity sensor 141E located to the right of the first proximity sensor 141A.
Referring to FIG. 8(a), when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the first proximity sensor 141A to the second proximity sensor 141B, or in the direction from the third proximity sensor 141C to the first proximity sensor 141A, it recognizes the detected proximity motion pattern as an upward proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen upward according to the upward proximity motion pattern recognized by the proximity sensor unit 141.

Next, referring to FIG. 8(b), when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the first proximity sensor 141A to the third proximity sensor 141C, or in the direction from the second proximity sensor 141B to the first proximity sensor 141A, it recognizes the detected proximity motion pattern as a downward proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen downward according to the downward proximity motion pattern recognized by the proximity sensor unit 141.

Next, referring to FIG. 8(c), when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the first proximity sensor 141A to the fifth proximity sensor 141E, or in the direction from the fourth proximity sensor 141D to the first proximity sensor 141A, it recognizes the detected proximity motion pattern as a rightward proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen to the right according to the rightward proximity motion pattern recognized by the proximity sensor unit 141.

Next, referring to FIG. 8(d), when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the first proximity sensor 141A to the fourth proximity sensor 141D, or in the direction from the fifth proximity sensor 141E to the first proximity sensor 141A, it recognizes the detected proximity motion pattern as a leftward proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen to the left according to the leftward proximity motion pattern recognized by the proximity sensor unit 141.

Next, referring to FIG. 9(a), when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the second proximity sensor 141B to the fourth proximity sensor 141D, or in the direction from the fourth proximity sensor 141D to the second proximity sensor 141B, it recognizes the detected proximity motion pattern as an up/left diagonal proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen diagonally up and to the left according to the up/left diagonal proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the fourth proximity sensor 141D to the third proximity sensor 141C, or in the direction from the third proximity sensor 141C to the fourth proximity sensor 141D, it recognizes the detected proximity motion pattern as a down/left diagonal proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen diagonally down and to the left according to the down/left diagonal proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the third proximity sensor 141C to the fifth proximity sensor 141E, or in the direction from the fifth proximity sensor 141E to the third proximity sensor 141C, it recognizes the detected proximity motion pattern as a down/right diagonal proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen diagonally down and to the right according to the down/right diagonal proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction from the fifth proximity sensor 141E to the second proximity sensor 141B, or in the direction from the second proximity sensor 141B to the fifth proximity sensor 141E, it recognizes the detected proximity motion pattern as an up/right diagonal proximity motion pattern and outputs it to the controller 180. The controller 180 then moves the 3D object currently displayed on the screen diagonally up and to the right according to the up/right diagonal proximity motion pattern recognized by the proximity sensor unit 141.
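The sensor-sequence-to-direction logic of FIGS. 8 and 9(a) can be summarized with a small lookup table. This is an illustrative sketch only; the data structure and function below are assumptions, not the patent's implementation.

```python
# Each key is a pair of consecutively triggered proximity sensors; the value is
# the recognized proximity motion pattern used to move the 3D object.
DIRECTION_PATTERNS = {
    ("141A", "141B"): "up",         ("141C", "141A"): "up",
    ("141A", "141C"): "down",       ("141B", "141A"): "down",
    ("141A", "141E"): "right",      ("141D", "141A"): "right",
    ("141A", "141D"): "left",       ("141E", "141A"): "left",
    ("141B", "141D"): "up-left",    ("141D", "141B"): "up-left",
    ("141D", "141C"): "down-left",  ("141C", "141D"): "down-left",
    ("141C", "141E"): "down-right", ("141E", "141C"): "down-right",
    ("141E", "141B"): "up-right",   ("141B", "141E"): "up-right",
}

def classify_direction(sensor_sequence):
    """Map two consecutively triggered sensors to a movement direction, or None."""
    return DIRECTION_PATTERNS.get(tuple(sensor_sequence))

print(classify_direction(["141A", "141B"]))  # -> "up"
```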

Next, referring to FIG. 9B, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the fourth proximity sensor 141D → the second proximity sensor 141B → the fifth proximity sensor 141E, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree clockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees clockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the fifth proximity sensor 141E → the second proximity sensor 141B → the fourth proximity sensor 141D, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree counterclockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees counterclockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the second proximity sensor 141B → the fifth proximity sensor 141E → the third proximity sensor 141C, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree clockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees clockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the third proximity sensor 141C → the fifth proximity sensor 141E → the second proximity sensor 141B, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree counterclockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees counterclockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the fifth proximity sensor 141E → the third proximity sensor 141C → the fourth proximity sensor 141D, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree clockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees clockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the fourth proximity sensor 141D → the third proximity sensor 141C → the fifth proximity sensor 141E, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree counterclockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees counterclockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the third proximity sensor 141C → the fourth proximity sensor 141D → the second proximity sensor 141B, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree clockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees clockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.

In addition, when the proximity sensor unit 141 detects a proximity motion pattern in the direction of the second proximity sensor 141B → the fourth proximity sensor 141D → the third proximity sensor 141C, it recognizes the detected proximity motion pattern as a proximity motion pattern corresponding to a 90-degree counterclockwise rotation and outputs it to the controller 180. Then, the controller 180 rotates the 3D object displayed on the current screen 90 degrees counterclockwise according to the proximity motion pattern recognized by the proximity sensor unit 141.
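
As a purely illustrative, non-limiting example, the following Kotlin sketch shows one possible way to recognize the clockwise and counterclockwise sweeps of FIG. 9B from a three-sensor firing sequence, which the controller then maps to a 90-degree rotation of the 3D object. The sensor ordering and all names are assumptions introduced for illustration only.

```kotlin
// A minimal, hypothetical sketch (not the patented implementation): detecting
// whether three edge sensors fired in clockwise or counterclockwise order.
// Assumed layout around the center sensor: B = up, E = right, C = down, D = left,
// so the clockwise order is B -> E -> C -> D.
enum class EdgeSensor { B, E, C, D }

fun recognizeRotation(seq: List<EdgeSensor>): String {
    if (seq.size != 3) return "unknown"
    val cw = listOf(EdgeSensor.B, EdgeSensor.E, EdgeSensor.C, EdgeSensor.D)
    val i = cw.indexOf(seq[0])
    return when {
        // e.g. D -> B -> E or B -> E -> C: three sensors in clockwise order.
        seq[1] == cw[(i + 1) % 4] && seq[2] == cw[(i + 2) % 4] -> "rotate 3D object 90 degrees clockwise"
        // e.g. E -> B -> D or C -> E -> B: three sensors in counterclockwise order.
        seq[1] == cw[(i + 3) % 4] && seq[2] == cw[(i + 2) % 4] -> "rotate 3D object 90 degrees counterclockwise"
        else -> "unknown"
    }
}

fun main() {
    println(recognizeRotation(listOf(EdgeSensor.D, EdgeSensor.B, EdgeSensor.E))) // clockwise
    println(recognizeRotation(listOf(EdgeSensor.E, EdgeSensor.B, EdgeSensor.D))) // counterclockwise
}
```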

FIGS. 10 to 12 illustrate, as an example, a case where the 3D object is a 3D menu.

First, referring to FIG. 10A, when the first 3D menu 311 among the plurality of 3D menus 311, 312, and 313 is displayed on the screen 310, the controller 180 drives the proximity sensor unit 141. In this case, the 3D menus 311, 312, and 313 may be upper- or lower-level menus of one another, or lower-level menus included in a specific menu.

In addition, the controller 180 determines the proximity depth of the user through the first proximity sensor 141A, which detects the proximity depth.

If the determined proximity depth is a preset d1 depth, the controller 180 switches the first 3D menu 311 to the second 3D menu 312 and displays it, as illustrated in FIG. 10B.

In addition, if the proximity depth of the user detected by the first proximity sensor 141A becomes a preset d2 depth (d1 < d2 or d1 > d2) while the second 3D menu 312 is displayed, the controller 180 switches the second 3D menu 312 to the third 3D menu 313 and displays it, as shown in FIG. 10C.

That is, as the proximity depth of the user approaches the first proximity sensor 141A, the controller 180 gradually switches the display from the first 3D menu 311 toward the third 3D menu 313, and as the user moves away from the first proximity sensor 141A, the controller 180 may gradually switch the display back from the third 3D menu 313 toward the first 3D menu 311.
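
As a purely illustrative, non-limiting example, the following Kotlin sketch shows one possible depth-to-menu mapping for FIGS. 10A to 10C. It assumes the preset d2 depth is closer to the first proximity sensor than d1; the threshold values, units, and all names are assumptions introduced for illustration only.

```kotlin
// A minimal, hypothetical sketch (not the patented implementation): selecting
// which 3D menu to display from the proximity depth measured by the first
// proximity sensor. Assumes d2 < d1, i.e. d2 is the closer threshold.
fun menuForProximityDepth(depthMm: Float, d1: Float = 60f, d2: Float = 30f): Int =
    when {
        depthMm <= d2 -> 3  // at or inside d2: third 3D menu (313)
        depthMm <= d1 -> 2  // between d2 and d1: second 3D menu (312)
        else -> 1           // farther than d1 (or no approach): first 3D menu (311)
    }

fun main() {
    println(menuForProximityDepth(80f)) // 1
    println(menuForProximityDepth(50f)) // 2
    println(menuForProximityDepth(20f)) // 3
}
```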

Next, referring to FIG. 11A, when the first 3D menu 311 among the plurality of 3D menus 311, 312, and 313 is displayed on the screen 310, the controller 180 drives the proximity sensor unit 141.

For example, when a proximity motion pattern in the right direction is detected through the first proximity sensor 141A to the fifth proximity sensor 141E, the controller 180 switches the first 3D menu 311 to the second 3D menu 312 and displays it, as illustrated in (b) of FIG. 11.

In addition, when the controller 180 detects a proximity motion pattern in the right direction through the first proximity sensor 141A to the fifth proximity sensor 141E while the second 3D menu 312 is displayed, the controller 180 switches the second 3D menu 312 to the third 3D menu 313 and displays it, as shown in (c) of FIG. 11.

In addition, although not shown in the drawing, when the controller 180 detects a proximity motion pattern in the left direction through the first proximity sensor 141A to the fourth proximity sensor 141D while the second 3D menu 312 is displayed, the controller 180 may switch the second 3D menu 312 back to the first 3D menu 311 and display it.

That is, the controller 180 gradually switches the display from the first 3D menu 311 toward the third 3D menu 313 according to the input of a proximity motion pattern of the user in a first direction, and may gradually switch the display back from the third 3D menu 313 toward the first 3D menu 311 according to the input of a proximity motion pattern of the user in a second direction opposite to the first direction.
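
As a purely illustrative, non-limiting example, the following Kotlin sketch shows one possible way to step through the 3D menus of FIG. 11 in response to directional proximity motion patterns. The class, enum, and function names are assumptions introduced for illustration only.

```kotlin
// A minimal, hypothetical sketch (not the patented implementation): a rightward
// proximity motion pattern steps forward through the 3D menus and a leftward
// pattern steps back, clamped to the first and last menus.
enum class PatternDirection { RIGHT, LEFT }

class MenuSwitcher(private val menuCount: Int = 3) {
    var currentMenu = 1   // 1 = first 3D menu (311)
        private set

    fun onProximityPattern(direction: PatternDirection) {
        currentMenu = when (direction) {
            PatternDirection.RIGHT -> minOf(currentMenu + 1, menuCount) // 311 -> 312 -> 313
            PatternDirection.LEFT -> maxOf(currentMenu - 1, 1)          // 313 -> 312 -> 311
        }
    }
}

fun main() {
    val switcher = MenuSwitcher()
    switcher.onProximityPattern(PatternDirection.RIGHT)
    switcher.onProximityPattern(PatternDirection.RIGHT)
    println(switcher.currentMenu) // 3
    switcher.onProximityPattern(PatternDirection.LEFT)
    println(switcher.currentMenu) // 2
}
```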

Next, referring to FIG. 12A, when the plurality of 3D menus 311, 312, and 313 are displayed on the screen 310, the controller 180 drives the proximity sensor unit 141. In this case, the 3D menus 311, 312, and 313 have different 3D depths; for example, the first 3D menu 311 among the 3D menus 311, 312, and 313 is given the greatest 3D depth so that it appears closest to the user, and the third 3D menu 313 is given the least 3D depth so that it appears farthest from the user.

In addition, the controller 180 determines the proximity depth of the user through the first proximity sensor 141A, which detects the proximity depth.

If the determined proximity depth is a preset d1 depth, the controller 180 gives the second 3D menu 312 the greatest 3D depth so that the second 3D menu 312 appears closest to the user, and gives the first 3D menu 311 the least 3D depth so that the first 3D menu 311 appears farthest, as shown in FIG. 12B.

In addition, if the proximity depth of the user detected through the first proximity sensor 141A becomes a preset d2 depth (d1 < d2 or d1 > d2) while the greatest 3D depth is given to the second 3D menu 312, the controller 180 gives the third 3D menu 313 the greatest 3D depth so that the third 3D menu 313 appears closest to the user, and gives the second 3D menu 312 the least 3D depth so that the existing second 3D menu 312 appears farthest, as shown in (c) of FIG. 12.
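
As a purely illustrative, non-limiting example, the following Kotlin sketch shows one possible way to reassign 3D depths among the menus of FIG. 12: all three menus stay on screen, and the detected proximity depth only decides which menu receives the greatest 3D depth (appears closest) while the previously front menu receives the least (appears farthest). The depth scale and all names are assumptions introduced for illustration only.

```kotlin
// A minimal, hypothetical sketch (not the patented implementation): mapping each
// menu number to a 3D depth value, where a larger value means the menu appears
// closer to the user.
fun assignMenuDepths(frontMenu: Int, previousFront: Int, menuCount: Int = 3): Map<Int, Int> =
    (1..menuCount).associateWith { menu ->
        when (menu) {
            frontMenu -> menuCount   // greatest 3D depth: appears closest to the user
            previousFront -> 1       // least 3D depth: appears farthest from the user
            else -> 2                // intermediate 3D depth
        }
    }

fun main() {
    // At the preset d1 depth the second 3D menu (312) comes to the front.
    println(assignMenuDepths(frontMenu = 2, previousFront = 1)) // {1=1, 2=3, 3=2}
    // At the preset d2 depth the third 3D menu (313) comes to the front.
    println(assignMenuDepths(frontMenu = 3, previousFront = 2)) // {1=2, 2=1, 3=3}
}
```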

Next, FIG. 13A illustrates that the 3D object 321 is displayed on the 3D content screen 320.

In this case, as described above, the 3D content may be a 3D video, and the 3D object may be an object such as a person, an object, or a building included in the 3D video. Also, the 3D content may be a 3D image in a gallery, and the 3D object may be an object in the 3D image. Also, the 3D content may be a 3D document, and the 3D object may be an object such as a word, an icon, a character, or an image included in the 3D document. The 3D content may be a 3D preview image, and the 3D object may be an object such as a person, an object, or a building included in the 3D preview image. The 3D content may be a 3D menu screen, and the 3D object may be an object such as a menu item included in the 3D menu screen. The 3D content may be a 3D list, and the 3D object may be an object such as an item included in the 3D list. Also, the 3D content may be a 3D idle screen, and the 3D object may be an object such as an indicator icon, a clock, a widget, a current time, a function icon, and the like included in the 3D idle screen. In addition, the 3D content may be a 3D home screen, and the 3D object may be applications included in the 3D home screen. In addition, the 3D content may be a 3D phonebook, and the 3D object may be contact information included in the 3D phonebook. Also, the 3D content may be a 3D call log list, and the 3D object may be call log information included in the 3D call log list. The 3D content may be a 3D message transmission and reception history list, and the 3D object may be message transmission and reception history information included in the 3D message transmission and reception history list. Also, the 3D content may be 3D email, and the 3D object may be transmitted and received emails included in the 3D email. Also, the 3D content may be a 3D game, and the 3D object may be objects such as a unit included in the 3D game. Also, the 3D content may be a 3D webpage, and the 3D object may be objects included in the 3D webpage.

In this case, when an input of a proximity motion pattern in a first direction is sensed through the proximity sensor unit 141 as described above with reference to FIGS. 7 to 9, the controller 180 increases the 3D depth of the 3D object 321 so that the 3D object 321 appears closer to the user according to the input proximity motion pattern, as shown in FIG. 13B.

In addition, when an input of a proximity motion pattern in a second direction opposite to the first direction is detected through the proximity sensor unit 141, the controller 180 decreases the 3D depth of the 3D object 321 so that the 3D object 321 appears farther from the user according to the input proximity motion pattern, as shown in (c) of FIG. 13.

In addition, although not shown in the drawing, when the controller 180 detects an input of a proximity motion pattern in a first direction (up, down, left, right, or diagonal) through the proximity sensor unit 141, the controller 180 moves the 3D object 321 on the screen 320 in the same direction as the first direction of the input proximity motion pattern and displays it.

In addition, although not shown in the drawing, when the controller 180 detects an input of a clockwise or counterclockwise rotational proximity motion pattern through the proximity sensor unit 141, the controller 180 rotates the 3D object 321 on the screen 320 in the same direction as the rotation direction of the input proximity motion pattern and displays it.
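
As a purely illustrative, non-limiting example, the following Kotlin sketch shows one possible end-to-end mapping for FIG. 13: the recognized proximity motion pattern is applied to an on-screen 3D object by adjusting its 3D depth, position, or rotation. All class and field names are assumptions introduced for illustration only.

```kotlin
// A minimal, hypothetical sketch (not the patented implementation): applying a
// recognized proximity motion pattern to a 3D object's depth, position, or rotation.
data class Object3D(
    var depth: Int = 0,        // larger value = appears closer to the user
    var x: Int = 0,
    var y: Int = 0,
    var rotationDeg: Int = 0
)

sealed class ProximityPattern {
    object FirstDirection : ProximityPattern()                       // FIG. 13B: increase 3D depth
    object SecondDirection : ProximityPattern()                      // FIG. 13C: decrease 3D depth
    data class Move(val dx: Int, val dy: Int) : ProximityPattern()   // up/down/left/right/diagonal
    data class Rotate(val clockwise: Boolean) : ProximityPattern()   // clockwise or counterclockwise
}

fun applyPattern(obj: Object3D, pattern: ProximityPattern) {
    when (pattern) {
        is ProximityPattern.FirstDirection -> obj.depth += 1         // object appears closer
        is ProximityPattern.SecondDirection -> obj.depth -= 1        // object appears farther
        is ProximityPattern.Move -> { obj.x += pattern.dx; obj.y += pattern.dy }
        is ProximityPattern.Rotate -> obj.rotationDeg += if (pattern.clockwise) 90 else -90
    }
}

fun main() {
    val obj = Object3D()
    applyPattern(obj, ProximityPattern.FirstDirection)
    applyPattern(obj, ProximityPattern.Rotate(clockwise = true))
    println(obj) // Object3D(depth=1, x=0, y=0, rotationDeg=90)
}
```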

It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of carrier waves (e.g., transmission over the Internet). The computer may also include the controller 180 of the terminal.

Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

The above-described mobile terminal and its control method are not limited to the configurations and methods of the above-described embodiments; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.

100: mobile terminal 110: wireless communication unit
111: broadcast receiver 112: mobile communication module
113: wireless Internet module 114: short-range communication module
115: position information module 120: A/V input unit
121: camera 122: microphone
130: user input unit 140: sensing unit
141: proximity sensor unit 150: output unit
151: display unit 152: sound output module
153: alarm module 154: haptic module
155: projector module 160: memory
170: interface unit 180: controller
181: multimedia module 190: power supply unit

Claims (10)

A display unit displaying a 3D (Dimensional) object;
A proximity sensor unit including two or more proximity sensors for detecting a proximity depth and a motion pattern of an object to be approached; And
And a controller configured to control a display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor.
The mobile terminal according to claim 1,
The controller is configured to drive the proximity sensors when it is detected that the 3D object is displayed on the display unit.
The mobile terminal according to claim 1,
The proximity sensor unit detects a proximity depth of the object and a motion pattern of the object in at least one of up, down, left, and right directions.
The mobile terminal according to claim 3,
The proximity sensor unit includes a first proximity sensor that detects a proximity depth of the object, and at least one of second to fifth proximity sensors provided in up / down / left / right directions with respect to the first proximity sensor.
The mobile terminal according to claim 1,
The controller may vary the shape of the 3D object according to the proximity depth of the object detected by the proximity sensor.
The mobile terminal according to claim 5,
The 3D object is a 3D image,
The controller may be configured to gradually enlarge or reduce the 3D image according to the proximity depth of the object.
The mobile terminal according to claim 5,
The 3D object is a 3D menu,
The controller may display a previous or next menu of the menu or an upper or lower menu of the menu according to the proximity depth of the object.
The mobile terminal according to claim 1,
The controller may be configured to move the 3D object in the same direction as the motion pattern of the object detected by the proximity sensor.
The mobile terminal according to claim 1,
The controller may be configured to rotate the 3D object according to a motion pattern of the object detected by the proximity sensor.
Displaying a 3D (Dimensional) object on the screen;
Driving a proximity sensor unit including two or more proximity sensors for sensing a proximity depth and a motion pattern of an object approaching the screen; And
And controlling the display operation of the 3D object according to the proximity depth and the motion pattern detected by the proximity sensor.
KR1020110118050A 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same KR20130053476A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020110118050A KR20130053476A (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020110118050A KR20130053476A (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20130053476A true KR20130053476A (en) 2013-05-24

Family

ID=48662669

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020110118050A KR20130053476A (en) 2011-11-14 2011-11-14 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20130053476A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150129370A (en) * 2014-05-12 2015-11-20 서순석 Apparatus for control object in cad application and computer recordable medium storing program performing the method thereof
US9552644B2 (en) 2014-11-17 2017-01-24 Samsung Electronics Co., Ltd. Motion analysis method and apparatus


Similar Documents

Publication Publication Date Title
KR102014775B1 (en) Mobile terminal and method for controlling the same
KR101674957B1 (en) Mobile terminal and method for controlling thereof
KR101728728B1 (en) Mobile terminal and method for controlling thereof
KR101728725B1 (en) Mobile terminal and method for controlling thereof
KR101873747B1 (en) Mobile terminal and method for controlling thereof
KR101873759B1 (en) Display apparatus and method for controlling thereof
US20110246877A1 (en) Mobile terminal and image display controlling method thereof
KR20120021414A (en) Mobile terminal and method for controlling the same
KR20120079548A (en) Display device and method for controlling thereof
KR20120010764A (en) MOBILE TERMINAL AND METHOD FOR CONTROLLING A THREE DIMENSION IMAGE in thereof
KR20110054256A (en) Mobile terminal and method for controlling thereof
KR20120007195A (en) Mobile terminal and method for controlling thereof
KR20120048116A (en) Mobile terminal and method for controlling the same
KR101633336B1 (en) Mobile terminal and method for controlling thereof
KR101723413B1 (en) Mobile terminal and method for controlling thereof
KR101709500B1 (en) Mobile terminal and method for controlling thereof
KR20130071059A (en) Mobile terminal and method for controlling thereof
KR101740442B1 (en) Mobile terminal and method for controlling thereof
KR101629313B1 (en) Mobile terminal and method for controlling the same
KR20130053476A (en) Mobile terminal and method for controlling the same
KR20120093601A (en) Mobile terminal and method for controlling the same
KR20120081651A (en) Mobile terminal and method for controlling the same
KR101799269B1 (en) Mobile terminal and method for controlling thereof
KR101753033B1 (en) Mobile terminal and method for controlling thereof
KR20120130394A (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application