KR20100064840A - Mobile terminal and operation method thereof - Google Patents

Mobile terminal and operation method thereof

Info

Publication number
KR20100064840A
Authority
KR
South Korea
Prior art keywords
displayed
screen
displaying
proximity
objects
Prior art date
Application number
KR1020080123452A
Other languages
Korean (ko)
Inventor
조혜연
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020080123452A
Publication of KR20100064840A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1643 Details related to the display arrangement, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 Drag-and-drop
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)

Abstract

PURPOSE: A mobile terminal and an operating method thereof are provided to make it possible to predict the operational possibility of a specific object displayed on a screen. CONSTITUTION: An operation screen is displayed (S200). As a user's finger approaches the display unit, a proximity touch is detected (S205). A controller displays the clickable objects at the proximity position differently from the other objects (S210). One of the clickable objects is selected by a touch input (S215). The operation corresponding to the selected object is performed (S220). The above processes are repeated until another operation mode is selected (S225).
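The flow of steps S200–S225 described above can be sketched as a simple loop. This is an illustration only: the step names mirror the abstract, while the loop structure and function names are assumptions, not part of the disclosure.

```python
from enum import Enum, auto

class Step(Enum):
    DISPLAY_SCREEN = auto()     # S200: display the operation screen
    DETECT_PROXIMITY = auto()   # S205: finger approaches -> proximity touch
    MARK_CLICKABLE = auto()     # S210: show clickable objects differently
    SELECT_OBJECT = auto()      # S215: a clickable object is touch-selected
    PERFORM_OPERATION = auto()  # S220: run the selected object's operation
    CHECK_MODE = auto()         # S225: repeat unless another mode is chosen

def run(mode_changes_after):
    """Repeat steps S200-S225 until another operation mode is selected."""
    trace, cycle = [], 0
    while True:
        trace.extend(Step)               # one pass through S200-S225
        cycle += 1
        if cycle >= mode_changes_after:  # S225: another mode selected
            break
    return trace

trace = run(mode_changes_after=2)
print(len(trace))  # two full passes of six steps -> 12
```

Modeling the steps as an explicit sequence keeps the correspondence to the flowchart labels visible; a real implementation would of course be event-driven rather than a blocking loop.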

Description

Mobile terminal and operation method thereof

The present invention relates to a portable terminal and a method for controlling the operation thereof, and more particularly, to a portable terminal capable of indicating, according to a proximity touch or other input, whether an object displayed on a screen is clickable, and a method of operating the same.

A portable terminal is a device that is easy to carry and has one or more functions such as making voice and video calls, inputting and outputting information, and storing data. As the functions of portable terminals have diversified, they have come to support complex capabilities such as capturing photos and videos, playing music and video files, gaming, receiving broadcasts, and wireless Internet access, and have developed into comprehensive multimedia players.

In mobile terminals implemented as such multimedia devices, new approaches have been applied in various ways, in both hardware and software, to implement these complex functions. One example is a user interface environment that lets the user search for and select functions easily and conveniently. In addition, as the mobile terminal has come to be regarded as a personal item expressing the user's personality, various design changes, such as a double-sided LCD (Liquid Crystal Display) or a front touch screen, are also required.

However, since a mobile terminal must remain mobile and portable, the space that can be allocated to a user interface such as a display or keypad is limited, and the screen size is therefore constrained even when a front touch screen is adopted. Consequently, visually indicating whether an object such as an icon or text displayed on the limited-size touch screen can be operated makes the possibility of operation easy to predict, but it also complicates the screen configuration.

Therefore, there is a need for a method of visually indicating whether an object displayed on the screen is clickable, or what related functions it has, without complicating the screen configuration.

Accordingly, an object of the present invention is to provide a mobile terminal, and a method of operating the same, that can indicate, according to a proximity touch or other input, whether an object displayed on the screen is clickable or what related functions it has.

In accordance with an aspect of the present invention, a method of operating a mobile terminal includes displaying an operation screen on a display unit and, when a proximity touch is input on the operation screen, displaying a clickable object located at the proximity touch input position so as to be distinguishable from other objects.

In addition, a method of operating a mobile terminal according to the present invention includes displaying an operation screen on a touch screen and, when an input touching and dragging one of the objects displayed on the operation screen is detected, displaying on the touch screen a function associated with the dragged object.

Meanwhile, a portable terminal according to the present invention includes a display unit for displaying an operation screen, a proximity sensor for outputting a signal corresponding to a proximity touch input, and a control unit for displaying, when a proximity touch is input on the operation screen according to the signal output from the proximity sensor, the clickable object located at the proximity touch input position so as to be distinguishable from other objects.
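The control unit's role described above can be sketched, under stated assumptions, as a function that picks out the clickable objects near the proximity-touch position so they can be rendered differently. The object representation, the distance test, and the radius are illustrative assumptions, not part of the disclosure.

```python
def highlight_clickable(objects, proximity_pos, radius=1.0):
    """Return the objects near the proximity-touch position that are
    clickable and should therefore be displayed distinguishably."""
    def near(obj):
        dx = obj["x"] - proximity_pos[0]
        dy = obj["y"] - proximity_pos[1]
        return (dx * dx + dy * dy) ** 0.5 <= radius

    return [o for o in objects if o["clickable"] and near(o)]

objects = [
    {"name": "menu_icon", "x": 0.0, "y": 0.0, "clickable": True},
    {"name": "wallpaper", "x": 0.2, "y": 0.1, "clickable": False},
    {"name": "far_icon",  "x": 5.0, "y": 5.0, "clickable": True},
]
hits = highlight_clickable(objects, (0.1, 0.1))
print([o["name"] for o in hits])  # → ['menu_icon']
```

Only `menu_icon` qualifies here: `wallpaper` is nearby but not clickable, and `far_icon` is clickable but outside the proximity radius.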

In addition, a portable terminal according to the present invention includes a touch screen for displaying an operation screen, and a control unit for displaying, when an input touching and dragging one of the objects displayed on the operation screen is detected, a function associated with the dragged object on the touch screen.
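The drag behavior just described can likewise be sketched as a lookup from a dragged object to its associated functions. The object names and function lists below are hypothetical examples, not taken from the patent.

```python
# Hypothetical mapping from object types to their associated functions.
ASSOCIATED_FUNCTIONS = {
    "photo": ["view", "send", "delete"],
    "song":  ["play", "set as ringtone"],
}

def on_drag(obj_name):
    """When a touch-and-drag on an object is detected, return the
    associated functions to display on the touch screen."""
    return ASSOCIATED_FUNCTIONS.get(obj_name, [])

print(on_drag("photo"))  # → ['view', 'send', 'delete']
```

An object with no registered functions simply yields an empty list, so nothing extra is displayed for it.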

To achieve the above objects, the present invention also provides a processor-readable recording medium on which a program for executing the above methods on a processor is recorded.

According to the present invention, a clickable object such as an icon, text, or other image displayed on the screen can be displayed so as to be distinguishable from other objects according to a proximity touch input. In addition, a function associated with a specific object may be displayed according to a drag input. Since the possibility of operating a specific object displayed on the screen can thus be predicted, the portable terminal can be operated intuitively and operation errors can be prevented.

Hereinafter, the present invention will be described in more detail with reference to the drawings.

The portable terminals described in this specification include mobile phones, smart phones, notebook computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, and the like. In addition, the suffixes "module" and "unit" for the components used in the following description are given merely for convenience of description and carry no special significance or role in themselves. Therefore, "module" and "unit" may be used interchangeably.

FIG. 1 is a block diagram of a portable terminal according to an exemplary embodiment of the present invention. Referring to FIG. 1, the portable terminal according to an exemplary embodiment of the present invention will be described in terms of its functional components.

Referring to FIG. 1, the portable terminal 100 includes a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, and a power supply unit 190. When implemented in a practical application, two or more of these components may be combined into one component, or one component may be divided into two or more components as necessary.

The wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 113, a wireless Internet module 115, a short distance communication module 117, and a GPS module 119.

The broadcast receiving module 111 receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel. In this case, the broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may refer to a server for generating and transmitting at least one of a broadcast signal and broadcast related information and a server for receiving at least one of the generated broadcast signal and broadcast related information and transmitting the broadcast signal to the terminal.

The broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal. The broadcast related information may also be provided through a mobile communication network, and in this case, may be received by the mobile communication module 113. Broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

The broadcast receiving module 111 receives broadcast signals using various broadcast systems; in particular, it may receive digital broadcast signals using digital broadcasting systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), media forward link only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). In addition, the broadcast receiving module 111 may be configured to be suitable not only for such digital broadcast systems but for all broadcast systems providing broadcast signals. The broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 113 transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to voice call signal, video call signal, or text / multimedia message transmission and reception.

The wireless Internet module 115 is a module for wireless Internet access, and may be internal or external to the mobile terminal 100. Wireless Internet technologies include Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.

The short-range communication module 117 refers to a module for short-range communication. Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, and the like can be used as the short distance communication technology.

A GPS (Global Positioning System) module 119 receives position information from a plurality of GPS satellites.

The A / V (Audio / Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 123. The camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode. The processed image frame may be displayed on the display unit 151.

The image frame processed by the camera 121 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. The camera 121 may be equipped with two or more cameras according to the configuration of the terminal.

The microphone 123 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication module 113 and output. The microphone 123 may use a variety of noise reduction algorithms to eliminate noise generated while receiving the external sound signal.

The user input unit 130 generates key input data input by the user for controlling the operation of the terminal. The user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure / capacitance), a jog wheel, a jog switch, a finger mouse, and the like. In particular, when the touch pad has a mutual layer structure with the display unit 151 described later, this may be referred to as a touch screen.

The sensing unit 140 senses the current state of the portable terminal 100, such as its open/closed state and its position, and generates a sensing signal for controlling the operation of the portable terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may sense whether the slide phone is open or closed. It may also handle sensing functions related to whether the power supply unit 190 is supplying power, whether the interface unit 170 is coupled to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, and an acceleration sensor 145. The proximity sensor 141 may detect the presence or absence of an approaching object or an object present in the vicinity without mechanical contact. The proximity sensor 141 may detect a proximity object by using a change in an alternating magnetic field or a change in a static magnetic field, or by using a change rate of capacitance. Two or more proximity sensors 141 may be provided according to the configuration aspect.
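Detecting proximity from a change rate of capacitance, as the text notes the proximity sensor 141 may do, can be sketched as follows. The sample values, the sampling model, and the threshold are assumptions for illustration only.

```python
def detect_proximity(samples, rate_threshold=0.05):
    """Report proximity when the capacitance rises faster than the
    threshold between consecutive samples (an approaching finger or
    object raises the measured capacitance)."""
    for prev, cur in zip(samples, samples[1:]):
        if cur - prev > rate_threshold:
            return True
    return False

print(detect_proximity([1.00, 1.01, 1.02, 1.10]))  # → True
print(detect_proximity([1.00, 1.01, 1.02, 1.03]))  # → False
```

Keying on the rate of change rather than the absolute value makes the sketch robust to slow drift in the baseline capacitance.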

The pressure sensor 143 may detect whether pressure is applied to the portable terminal 100 and the magnitude of that pressure. The pressure sensor 143 may be installed wherever detection of pressure is required in the portable terminal 100, depending on the use environment. If the pressure sensor 143 is installed in the display unit 151, a touch input through the display unit 151 and a pressure touch input applied with greater pressure than the touch input can be distinguished according to the signal output from the pressure sensor 143. The magnitude of the pressure applied to the display unit 151 during a pressure touch input can also be determined from that signal.
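Distinguishing a normal touch from a pressure touch by the magnitude of the pressure sensor's signal, as described above, amounts to simple thresholding. The threshold values and signal scale below are assumptions for illustration.

```python
# Assumed normalized signal scale (0.0 = no contact, 1.0 = maximum).
TOUCH_THRESHOLD = 0.1      # minimum signal treated as contact at all
PRESSURE_THRESHOLD = 0.6   # above this, treat as a "pressure touch"

def classify(signal):
    """Classify the pressure sensor 143's output signal."""
    if signal < TOUCH_THRESHOLD:
        return "none"
    return "pressure_touch" if signal >= PRESSURE_THRESHOLD else "touch"

print(classify(0.3), classify(0.9))  # → touch pressure_touch
```

A practical implementation would add hysteresis around each threshold so that signal noise near a boundary does not make the classification flicker.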

The acceleration sensor 145 converts an acceleration in one direction into an electrical signal, and is widely used with the development of micro-electromechanical systems (MEMS) technology. Acceleration sensors come in many types, from sensors embedded in vehicle airbag systems that measure large accelerations to detect collisions, to sensors that measure minute accelerations. The acceleration sensor 145 is usually configured by mounting two or three axes in one package, and in some use environments only a single Z axis is required. Therefore, when an acceleration sensor in the X-axis or Y-axis direction is to be used instead of the Z-axis direction, the acceleration sensor may be mounted upright on the main substrate using a separate piece of substrate.

The output unit 150 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 150 may include a display unit 151, an audio output module 153, an alarm unit 155, and a haptic module 157.

The display unit 151 displays and outputs information processed by the mobile terminal 100. For example, when the portable terminal 100 is in the call mode, a UI (User Interface) or a GUI (Graphic User Interface) associated with a call is displayed. When the portable terminal 100 is in the video communication mode or the photographing mode, the photographed or received images can be displayed individually or simultaneously, and the UI and the GUI are displayed.

Meanwhile, as described above, when the display unit 151 and the touch pad form a mutual layer structure constituting a touch screen, the display unit 151 may be used as an input device in addition to an output device. If the display unit 151 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like. In this case, the touch screen panel is a transparent panel attached to the outside, and may be connected to the internal bus of the portable terminal 100. The touch screen panel monitors contact, and when there is a touch input, sends the corresponding signals to the touch screen panel controller. The touch screen panel controller processes the signals and transmits the corresponding data to the controller 180, so that the controller 180 can determine whether a touch input has occurred and which area of the touch screen has been touched.
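The signal path just described, panel to panel controller to controller 180, can be sketched as two stages: the panel controller converts raw panel signals into coordinates, and the controller decides which screen area was touched. The class names, the raw-signal format, and the region layout are illustrative assumptions.

```python
class TouchScreenPanelController:
    """Converts the panel's raw contact signals into screen coordinates."""
    def to_coordinates(self, raw_signal):
        # raw_signal: (row, col) cell indices from the panel's sensing matrix;
        # an assumed 10-unit cell pitch maps them to screen coordinates.
        return {"x": raw_signal[1] * 10, "y": raw_signal[0] * 10}

class Controller:
    """Stands in for controller 180: decides which area was touched."""
    REGIONS = {"menu_bar": (0, 0, 100, 20), "workspace": (0, 20, 100, 100)}

    def locate(self, coords):
        for name, (x0, y0, x1, y1) in self.REGIONS.items():
            if x0 <= coords["x"] <= x1 and y0 <= coords["y"] <= y1:
                return name
        return "outside"

panel_ctrl = TouchScreenPanelController()
main_ctrl = Controller()
coords = panel_ctrl.to_coordinates((1, 5))  # raw cell (row=1, col=5)
print(main_ctrl.locate(coords))  # coords {'x': 50, 'y': 10} → 'menu_bar'
```

Splitting coordinate conversion from region lookup mirrors the division of labor in the text: the panel controller handles the transducer-level data, while the controller interprets it against the screen layout.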

In addition, the display unit 151 may be configured of electronic paper (e-paper). Electronic paper is a kind of reflective display with excellent visual characteristics like those of conventional paper and ink: high resolution, a wide viewing angle, and a bright white background. Electronic paper can be implemented on any substrate such as plastic, metal, or paper, and since the image is retained even after the power is shut off, the battery life of the portable terminal 100 can be extended. As the electronic paper, hemispherical twist balls filled with electric charge may be used, or an electrophoresis method and microcapsules may be employed.

In addition, the display unit 151 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. Two or more display units 151 may also exist according to the implementation form of the mobile terminal 100. For example, an external display unit (not shown) and an internal display unit (not shown) may be provided simultaneously in the portable terminal 100.

The audio output module 153 outputs audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 153 also outputs sound signals related to functions performed in the portable terminal 100, for example a call signal reception tone and a message reception tone. The audio output module 153 may include a speaker, a buzzer, and the like.

The alarm unit 155 outputs a signal notifying the occurrence of an event in the portable terminal 100. Examples of events occurring in the mobile terminal 100 include call signal reception, message reception, and key signal input. The alarm unit 155 outputs the notification signal in a form other than an audio or video signal, for example as vibration. The alarm unit 155 can output a signal when a call signal or a message is received, and when a key signal is input it may output the signal as feedback to the key input. The user can recognize the occurrence of an event through the signal output from the alarm unit 155. A signal notifying event occurrence in the mobile terminal 100 may also be output through the display unit 151 or the audio output module 153.

The haptic module 157 generates various tactile effects that the user can feel, a representative example being vibration. When the haptic module 157 generates vibration as a haptic effect, the intensity and pattern of the vibration can be varied, and different vibrations may be synthesized and output together or output sequentially.

In addition to vibration, the haptic module 157 can generate other tactile effects: the effect of a pin arrangement moving vertically against the contacted skin surface, the effect of air injected or sucked through an injection port or suction port, the effect of stimulation through contact with an electrode, the effect of stimulation by electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat. The haptic module 157 can be implemented not only to transmit tactile effects through direct contact but also so that the tactile effect is felt through the muscular sense of a user's finger or arm. Two or more haptic modules 157 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store programs for the processing and control performed by the controller 180, and may also temporarily store input or output data (for example, a phonebook, messages, still images, and videos).

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), RAM, and ROM. In addition, the mobile terminal 100 may operate a web storage that performs the storage function of the memory 160 on the Internet.

The interface unit 170 serves as an interface with all external devices connected to the portable terminal 100. Examples of external devices connected to the mobile terminal 100 include wired/wireless headsets, external chargers, wired/wireless data ports, card sockets for a memory card or a subscriber identification module (SIM) or user identity module (UIM) card, audio input/output (I/O) terminals, video I/O terminals, and earphones. The interface unit 170 may receive data or power from such an external device and transmit it to each component in the portable terminal 100, and may transmit data in the portable terminal 100 to the external device.

The interface unit 170 may serve as a path through which power from an external cradle is supplied to the portable terminal 100 when the portable terminal 100 is connected to the cradle, or as a path through which various command signals input by the user through the cradle are transmitted to the portable terminal 100.

The controller 180 typically controls the operation of each of the above units and thereby the overall operation of the mobile terminal 100. For example, it performs the control and processing related to voice calls, data communication, and video calls. The controller 180 may also include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be implemented as hardware within the controller 180 or as software separate from the controller 180.

In addition, the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

The mobile terminal 100 having such a configuration can be configured to operate in communication systems capable of transmitting data through frames or packets, including wired/wireless communication systems and satellite-based communication systems.

The portable terminal related to the present invention has been described above in terms of components according to functions. Hereinafter, referring to FIGS. 2 and 3, the portable terminal related to the present invention will be further described in terms of components according to appearance. Hereinafter, for convenience of description, a slider type portable terminal having a touch screen among various types of portable terminals such as a folder type, a bar type, a swing type, a slider type, etc. will be described as an example. However, the present invention is not limited to the slider type portable terminal, but can be applied to all types of portable terminals including the aforementioned type.

FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention. Referring to FIG. 2, the portable terminal includes a first body 100A and a second body 100B configured to slide in at least one direction on the first body 100A.

A state in which the first body 100A is disposed to overlap the second body 100B may be referred to as a closed configuration and, as shown in FIG. 2, a state in which at least a portion of the second body 100B is exposed may be referred to as an open configuration.

The mobile terminal 100 operates mainly in a standby mode in the closed configuration, but the standby mode may be released by the user's operation. The mobile terminal 100 operates mainly in a call mode or the like in the open configuration, but may switch to the standby mode upon the user's manipulation or after a predetermined time elapses.

The case forming the outer appearance of the first body 100A is formed by a first front case 100A-1 and a first rear case 100A-2. Various electronic components are embedded in the space formed between the first front case 100A-1 and the first rear case 100A-2, and at least one intermediate case may be further disposed between them. Such cases may be formed by injection-molding synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

In the first body 100A, specifically in the first front case 100A-1, the display unit 151, a first sound output module 153a, a first camera 121a, and a first user input unit 130a may be disposed.

The display unit 151 includes a liquid crystal display (LCD), organic light emitting diodes (OLED), and the like, which visually express information. The display unit 151 may be configured such that the touch pad is overlapped in a layer structure so that the display unit 151 operates as a touch screen to input information by a user's touch. The first sound output module 153a may be implemented in the form of a receiver or a speaker. The first camera 121a may be implemented to be suitable for capturing an image or a video of a user or the like.

Like the first body 100A, the case forming the external appearance of the second body 100B is formed by a second front case 100B-1 and a second rear case 100B-2. The second user input unit 130b may be disposed on the front surface of the second body 100B, specifically, on the second front case 100B-1. The third and fourth user input units 130c and 130d, the microphone 123, and the interface unit 170 may be disposed on at least one of the second front case 100B-1 and the second rear case 100B-2.

The first to fourth user input units 130a, 130b, 130c, and 130d may be collectively referred to as the user input unit 130, and any mechanism may be adopted as long as it is operated in a tactile manner.

For example, the user input unit 130 may be implemented as a dome switch or a touch pad that receives a command or information by a user's push or touch operation, or as a wheel or jog mechanism that rotates a key, a joystick, or the like.

In functional terms, the first user input unit 130a is used to input commands such as start, end, and scroll, and the second user input unit 130b is used to input numbers, letters, symbols, and the like. The third and fourth user input units 130c and 130d may operate as hot keys for activating special functions of the mobile terminal 100.

The microphone 123 may be implemented in a form suitable for receiving a user's voice, other sounds, and the like.

FIG. 3 is a rear perspective view of the mobile terminal shown in FIG. 2. Referring to FIG. 3, a wheel-type fifth user input unit 130e and a second camera 121b may be additionally mounted on the rear surface of the second rear case 100B-2 of the second body 100B, and a sixth user input unit 130f may be disposed on a side surface of the second body 100B.

The second camera 121b may have a photographing direction substantially opposite to that of the first camera 121a and may differ from the first camera 121a in resolution. For example, in a video call, it is often desirable that the first camera 121a have a low resolution, since the user's face is photographed and transmitted to the other party in real time, while the second camera 121b have a high resolution, since it photographs general subjects that are usually not transmitted immediately.

The flash 125 and the mirror 126 may be additionally disposed adjacent to the second camera 121b. The flash 125 illuminates the subject when the subject is photographed by the second camera 121b. The mirror 126 allows the user to see his or her own face when taking a self-portrait with the second camera 121b.

A second sound output module (not shown) may be further disposed on the second rear case 100B-2. The second sound output module may implement a stereo function together with the first sound output module 153a, and may be used for calls in a speakerphone mode.

In addition to an antenna for calls, an antenna (not shown) for receiving broadcast signals may be disposed on one side of the second rear case 100B-2. The antenna may be installed so as to be pulled out of the second body 100B. A portion of the slide module 100C, which slidably couples the first body 100A and the second body 100B, is disposed on the first rear case 100A-2 side of the first body 100A. The other portion of the slide module 100C may be disposed on the second front case 100B-1 side of the second body 100B and, as shown in FIG. 3, may not be exposed to the outside.

Although the second camera 121b and the like have been described above as being disposed on the second body 100B, the present invention is not limited thereto. For example, at least one of the components described as being disposed in the second rear case 100B-2, such as the second camera 121b, may instead be mounted on the first body 100A, mainly on the first rear case 100A-2. In such a configuration, the components disposed in the first rear case 100A-2 are advantageously protected by the second body 100B in the closed configuration. Furthermore, even if the second camera 121b is not separately provided, the first camera 121a may be formed rotatably so as to cover the photographing direction of the second camera 121b.

The first rear case 100A-2 is provided with a power supply unit 190 for supplying power to the portable terminal. The power supply unit 190 is, for example, a rechargeable battery, and may be detachably coupled to the first rear case 100A-2 for charging.

FIG. 4 is a diagram referred to for describing a proximity touch input. As shown in FIG. 4, when a pointer such as a user's finger or a pen approaches the display unit 151, a proximity sensor 141 installed inside or adjacent to the display unit 151 detects the approach and outputs a proximity signal.

The proximity sensor 141 may be configured to output different proximity signals according to the distance between the proximity-touching pointer and the display unit 151 (hereinafter referred to as "proximity depth"). The distance at which an approaching object causes a proximity sensor to output a proximity signal is called the detection distance Sn. By comparing the proximity signals output from a plurality of proximity sensors having different detection distances, how close the pointer is can be determined.

FIG. 4 illustrates a cross section in which proximity sensors capable of detecting three proximity depths are disposed, but proximity sensors may be arranged to detect fewer than three, or four or more, proximity depths. Specifically, when the pointer completely contacts the display unit 151 (D0), the input is recognized as a contact touch. When the pointer is positioned within the distance D1 from the display unit 151, it is recognized as a proximity touch of the first proximity depth. When the pointer is positioned between the distances D1 and D2 from the display unit 151, it is recognized as a proximity touch of the second proximity depth. When the pointer is positioned between the distances D2 and D3 from the display unit 151, it is recognized as a proximity touch of the third proximity depth. When the pointer is positioned at or beyond the distance D3 from the display unit 151, the proximity touch is recognized as released.
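The depth classification described above can be sketched as follows. This is a minimal illustration, not part of the patent: the concrete threshold values and the function name are assumptions, since the patent defines only the ordering D0 < D1 < D2 < D3.

```python
# Illustrative sketch of the proximity-depth classification of FIG. 4.
# The numeric thresholds below are hypothetical; the patent specifies
# only that D1 < D2 < D3 and that D0 means physical contact.
D1, D2, D3 = 10.0, 20.0, 30.0  # assumed distances in millimetres

def classify_proximity(distance_mm: float) -> str:
    """Map a pointer-to-screen distance to the recognized touch state."""
    if distance_mm <= 0.0:      # D0: pointer contacts the display unit
        return "contact touch"
    elif distance_mm < D1:      # within D1
        return "proximity touch, depth 1"
    elif distance_mm < D2:      # between D1 and D2
        return "proximity touch, depth 2"
    elif distance_mm < D3:      # between D2 and D3
        return "proximity touch, depth 3"
    else:                       # at or beyond D3
        return "proximity touch released"
```

A controller could call such a function on every proximity signal and treat each returned depth as a distinct input signal.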

In addition, a plurality of proximity sensors having different detection areas may be arranged, and by determining which of the proximity sensors outputs a signal, it is possible to identify where on the display unit 151 the pointer approaches and whether the pointer is moving closer to the display unit 151.

Accordingly, the controller 180 may recognize the proximity touch as various input signals according to the proximity depth and the proximity position of the pointer, and perform various operation control according to the various input signals.

FIG. 5 is a flowchart provided to explain a method of operating a mobile terminal according to an embodiment of the present invention. Referring to FIG. 5, the controller 180 displays an operation screen corresponding to a selected menu or operation on the display unit 151 configured as a touch screen (S200). Examples of the displayed operation screen include a standby screen, a message reception screen, a main menu screen, an image or video viewer screen, a broadcast screen, a map screen, and a web page screen.

When the user's finger approaches the display unit 151 while the operation screen is displayed and a proximity touch is detected (S205), the controller 180 displays any clickable object located at the proximity-touched position on the operation screen so as to be distinguishable from other objects (S210).

Here, a clickable object is a text, an icon, or an image linked to other data or to a function; when clicked, it jumps to the linked data or executes the linked function. Examples of clickable objects include hyperlink objects and soft keys. When the proximity touch signal is input, the clickable object may be displayed in a color or shape different from other objects, or with different shading. A haptic effect corresponding to the distinguishably displayed clickable object may also be generated.

When any one of the distinguishably displayed clickable objects is selected by a touch input, the controller 180 controls an operation corresponding to the selected object to be performed (S215).

This process is repeated until another operation mode is selected (S225). Through this process, the clickable objects on the screen are displayed so as to be distinguishable from other objects, so that the user can predict which operations are possible and operate the terminal intuitively.
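Steps S205 through S220 can be sketched as the following minimal model. All class and function names here are invented for illustration; the patent describes the behavior only at the flowchart level.

```python
# Hypothetical sketch of FIG. 5 (S205-S220): on a proximity touch, mark the
# clickable objects near the touched position so the UI can render them
# distinctly; on a contact touch, run the selected object's linked action.

class ScreenObject:
    def __init__(self, name, clickable, action=None):
        self.name = name
        self.clickable = clickable    # linked to data or a function?
        self.action = action          # callable executed on click
        self.highlighted = False      # rendered distinguishably?

def on_proximity_touch(objects_at_position):
    """S205/S210: highlight only the clickable objects at the proximity position."""
    for obj in objects_at_position:
        obj.highlighted = obj.clickable
    return [o for o in objects_at_position if o.highlighted]

def on_touch(obj):
    """S215/S220: perform the operation linked to the touched object."""
    if obj.clickable and obj.action is not None:
        return obj.action()
    return None
```

In a real terminal the highlighting would change color, shape, or shading (or trigger a haptic effect) rather than set a boolean, but the control flow matches the flowchart.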

FIG. 6 is a flowchart provided to explain a method of operating a mobile terminal according to another embodiment of the present invention. Referring to FIG. 6, the controller 180 displays an operation screen corresponding to a selected menu or operation on the display unit 151 configured as a touch screen (S300).

While the operation screen is displayed, when there is an input in which a specific object displayed on the operation screen is touched and then dragged (S305), the controller 180 displays the functions associated with the dragged object (S310). Here, an associated function is a function that performs a specific operation or moves to a related menu. The associated functions may be displayed in the form of a pop-up window or a speech balloon. In addition, a haptic effect corresponding to the dragged object may be generated.

If one of the associated functions is selected (S315), the controller 180 controls the selected function to be performed (S320).

This process is repeated until another operation mode is selected (S325). Through this process, the functions associated with a specific object can be displayed efficiently even on a small screen.
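The drag-and-select flow of FIG. 6 can be sketched as follows. The object-to-function mapping is an invented example (mirroring the date and URL objects of FIGS. 8A to 8C); the names are not from the patent.

```python
# Hypothetical sketch of FIG. 6 (S305-S320): after an object is touched and
# dragged, look up its associated functions (to be shown in a pop-up window
# or speech balloon) and perform whichever one the user selects.

# Invented example mapping, loosely following FIGS. 8A-8C.
ASSOCIATED_FUNCTIONS = {
    "date text": ["move to schedule menu"],
    "url text": ["open web page", "copy link"],
}

def on_drag(object_name):
    """S305/S310: return the list of functions to display for the dragged object."""
    return ASSOCIATED_FUNCTIONS.get(object_name, [])

def on_select(functions, index):
    """S315/S320: 'perform' the chosen function (returned here for brevity)."""
    return functions[index]
```

If several functions are associated with the object, each would be shown in its own speech bubble or pop-up entry, as the description of FIG. 8C notes.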

FIGS. 7A to 7C are views referred to for describing a process of displaying a clickable object so as to be distinguishable from other objects in a method of operating a mobile terminal according to the present invention.

FIG. 7A illustrates an example of a received message viewing screen. While the received message viewing screen 400 is displayed on the display unit 151, if there is a proximity touch input 410 caused by the user's finger approaching the screen, the clickable objects 425 and 430 among the objects displayed at the proximity position are displayed distinguishably from other objects, as shown in FIG. 7B.

In this case, the clickable objects 425 and 430 may be displayed in a color or shape, or with shading, different from other objects. When one of the distinguishably displayed objects 425 and 430 is touched, the corresponding operation is performed. In the case of FIG. 7B, touching one of the distinguishably displayed objects 425 and 430 displays the web page of the linked website.

FIG. 7C illustrates an example of a web page screen. When there is a proximity touch input 465 on the web page screen 460, the clickable object 470 displayed at the proximity position may be enlarged so as to be distinguishable from other objects. When the enlarged clickable object 470 is touched, the corresponding operation is performed. Depending on the user's settings, information related to the distinguishably displayed object may also be displayed.

FIGS. 8A to 8C are views for explaining a process of displaying an associated function according to a drag input in a method of operating a mobile terminal according to the present invention.

FIG. 8A illustrates an example of a received message viewing screen. When there is an input 510 that touches and drags an object on the received message viewing screen 500, the functions associated with the dragged object are displayed, as illustrated in FIG. 8B. In the case of FIG. 8B, the hyperlink objects 525 and 530 associated with the dragged object are displayed.

In the case of FIG. 8C, a function of moving to the schedule menu, as the function associated with the drag input 550, is indicated in the form of a speech bubble 550. If several functions are associated with the selected object, several speech bubbles may be displayed. Besides the speech bubble, the associated functions may be displayed in various identifiable forms, such as a pop-up window or other objects.

In this way, by displaying a clickable object or an associated function so as to be distinguishable from other objects, the possibility of operating a specific object can be determined in advance, so that the user can operate the terminal intuitively and confusion can be prevented. The portable terminal and the operation method thereof according to the present invention are not limited to the configurations and methods of the embodiments described above; all or some of the embodiments may be selectively combined so that various modifications can be made.

Meanwhile, the present invention can be embodied as processor-readable code on a processor-readable recording medium provided in a mobile terminal such as a mobile station modem (MSM). The processor-readable recording medium includes all kinds of recording devices that store data readable by a processor. Examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, as well as carrier waves such as transmission over the Internet. The processor-readable recording medium can also be distributed over network-coupled computer systems so that the processor-readable code is stored and executed in a distributed fashion.

Although preferred embodiments of the present invention have been shown and described above, the present invention is not limited to the specific embodiments described. Various modifications can of course be made by those skilled in the art to which the invention belongs without departing from the spirit of the invention claimed in the claims, and such modifications should not be understood separately from the technical spirit or prospect of the present invention.

FIG. 1 is a block diagram of a portable terminal according to an embodiment of the present invention;

FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention;

FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention;

FIG. 4 is a view referred to for explaining a proximity touch input;

FIG. 5 is a flowchart provided to explain a method of operating a mobile terminal according to an embodiment of the present invention;

FIG. 6 is a flowchart provided to explain a method of operating a mobile terminal according to another embodiment of the present invention;

FIGS. 7A to 7C are views referred to for describing a process of distinguishably displaying a clickable object according to a proximity touch in a method of operating a mobile terminal according to the present invention; and

FIGS. 8A to 8C are views for explaining a process of displaying an associated function according to a drag input in a method of operating a mobile terminal according to the present invention.

Explanation of symbols on the main parts of the drawings

110: wireless communication unit 120: A / V input unit

130: user input unit 140: sensing unit

150: output unit 160: memory

170: interface unit 180: control unit

Claims (20)

1. A method comprising: displaying an operation screen on a display unit; and, when a proximity touch is input on the operation screen, displaying a clickable object displayed at the proximity touch input position so as to be distinguishable from other objects.
2. The method of claim 1, further comprising: if any one of the distinguishably displayed objects is selected, performing an operation corresponding to the selected object.
3. The method of claim 1, further comprising displaying information related to the distinguishably displayed object.
4. The method of claim 1, further comprising generating a haptic effect corresponding to the distinguishably displayed object.
5. The method of claim 1, wherein the clickable object is displayed with at least one of a color, a shape, and a shading different from the other objects.
6. The method of claim 1, wherein the clickable object is enlarged and displayed.
7. The method of claim 1, wherein the clickable object includes a hyperlink object and a soft key.
8. A mobile terminal comprising: a display unit displaying an operation screen; a proximity sensor outputting a signal corresponding to a proximity touch input; and a controller displaying a clickable object displayed at the proximity touch input position so as to be distinguishable from other objects when a proximity touch input on the operation screen is detected according to the signal output from the proximity sensor.
9. The mobile terminal of claim 8, wherein the controller controls an operation corresponding to the selected object to be performed when any one of the distinguishably displayed objects is selected.
10. The mobile terminal of claim 8, wherein the controller further displays information related to the distinguishably displayed object.
11. The mobile terminal of claim 8, further comprising a haptic module for generating a haptic effect, wherein the controller controls the haptic module to generate a haptic effect corresponding to the distinguishably displayed object.
12. A method comprising: displaying an operation screen on a touch screen; and, when there is an input of touching and dragging any one of the objects displayed on the operation screen, displaying a function associated with the dragged object on the touch screen.
13. The method of claim 12, wherein the associated function is displayed in any one of a pop-up window form and a speech balloon form.
14. The method of claim 12, further comprising: if any one of the associated functions is selected, performing the selected function.
15. The method of claim 12, further comprising generating a haptic effect corresponding to the dragged object.
16. The method of claim 12, wherein the dragged object is distinguishably displayed.
17. A mobile terminal comprising: a touch screen displaying an operation screen; and a controller configured to display a function associated with the dragged object on the touch screen when an input of touching and dragging any one of the objects displayed on the operation screen is detected.
18. The mobile terminal of claim 17, wherein the controller displays the associated function in any one of a pop-up window form and a speech balloon form.
19. The mobile terminal of claim 17, wherein the controller controls the selected function to be performed if any one of the associated functions is selected.
20. The mobile terminal of claim 17, further comprising a haptic module for generating a haptic effect, wherein the controller controls the haptic module to generate a haptic effect corresponding to the dragged object.
KR1020080123452A 2008-12-05 2008-12-05 Mobile terminal and operation method thereof KR20100064840A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020080123452A KR20100064840A (en) 2008-12-05 2008-12-05 Mobile terminal and operation method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020080123452A KR20100064840A (en) 2008-12-05 2008-12-05 Mobile terminal and operation method thereof

Publications (1)

Publication Number Publication Date
KR20100064840A true KR20100064840A (en) 2010-06-15

Family

ID=42364389

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020080123452A KR20100064840A (en) 2008-12-05 2008-12-05 Mobile terminal and operation method thereof

Country Status (1)

Country Link
KR (1) KR20100064840A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8913026B2 (en) 2012-03-06 2014-12-16 Industry-University Cooperation Foundation Hanyang University System for linking and controlling terminals and user terminal used in the same
US10656895B2 (en) 2012-03-06 2020-05-19 Industry—University Cooperation Foundation Hanyang University System for linking and controlling terminals and user terminal used in the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E601 Decision to refuse application