EP2680099A2 - Mobile terminal and control method thereof - Google Patents
- Publication number
- EP2680099A2 (Application EP13000951.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- pressure
- controller
- mobile terminal
- display
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1656—Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
Definitions
- the present disclosure may relate to a mobile terminal having a squeeze sensor.
- Terminals may be classified as mobile (or portable) terminals and stationary terminals based on their mobility.
- the mobile terminals may be further classified as handheld terminals and vehicle mount terminals based on whether (or not) the terminal can be directly carried by a user.
- a terminal may be allowed to capture still images or moving images, play music or video files, play games, receive broadcasts and the like, so as to be implemented as an integrated multimedia player. Improvement of structural or software elements of the terminal may be taken into consideration to support and enhance functions of the terminal.
- a user may generate an input signal using a touch sensor provided on a display unit of the terminal.
- the touch sensor may be activated only when the display unit is activated, and thus the touch sensor has a limit in generating various input signals.
- FIG. 1 is a block diagram illustrating a mobile terminal
- FIGs. 2A and 2B are perspective views illustrating an external appearance of the mobile terminal
- FIGs. 3A and 3B are perspective views illustrating an external appearance of the mobile terminal
- FIGs. 4 and 5 are flow charts for explaining a method of a mobile terminal according to an example embodiment of the present disclosure.
- FIGs. 6 through 16 are views illustrating an operation example of the mobile terminal.
- FIG. 1 is a block diagram illustrating a mobile terminal according to an example arrangement. Other arrangements may also be provided.
- FIG. 1 shows that a mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
- the constituent elements shown in FIG. 1 are not necessarily required, and the mobile terminal 100 may be implemented with a greater or smaller number of elements than those illustrated.
- the wireless communication unit 110 may include one or more elements allowing radio communication between the mobile terminal 100 and a wireless communication system, or allowing wireless (or radio) communication between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
- the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, a location information module 115, and/or the like.
- the broadcast receiving module 111 may receive broadcast signals and/or broadcast associated information from an external broadcast management server through a broadcast channel.
- the broadcast associated information may be information regarding a broadcast channel, a broadcast program, a broadcast service provider, and/or the like.
- the broadcast associated information may also be provided through a mobile communication network.
- the broadcast associated information may be received by the mobile communication module 112.
- the broadcast signal and broadcast-associated information received through the broadcast receiving module 111 may be stored in the memory 160.
- the mobile communication module 112 may transmit and/or receive a radio signal to and/or from at least one of a base station, an external terminal and/or a server over a mobile communication network.
- the radio signal may include a voice call signal, a video call signal or various types of data according to text and/or multimedia message transmission and reception.
- the wireless Internet module 113 may be a module for supporting wireless Internet access and may be built-in or externally installed to the mobile terminal 100.
- a variety of wireless Internet access techniques may be used, such as WLAN (Wireless LAN), Wi-Fi, Wibro (Wireless Broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and/or the like.
- the short-range communication module 114 may be a module for supporting a short-range communication.
- a variety of short-range communication technologies may be used, such as Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBee, and/or the like.
- the location information module 115 may be a module for acquiring a location (or position) of the mobile terminal 100.
- a GPS module may be an example of the location information module 115.
- the A/V (audio/video) input unit 120 may receive an audio or video signal, and the A/V (audio/video) input unit 120 may include a camera 121, a microphone 122, and the like.
- the camera 121 may process an image frame, such as still or moving images, obtained by an image sensor in a video phone call or an image capturing mode.
- the processed image frame may be displayed on a display unit 151 (or display).
- the image frames processed by the camera 121 may be stored in the memory 160 or may be transmitted to an external device through the wireless communication unit 110. Two or more cameras 121 may be provided according to a use environment of the mobile terminal 100.
- the microphone 122 may receive an external audio signal through a microphone in a phone call mode, a recording mode, a voice recognition mode, and the like, and the microphone 122 may process the audio signal into electrical voice data.
- the processed voice data may be converted and outputted in a format that is transmittable to a mobile communication base station through the mobile communication module 112 in the phone call mode.
- the microphone 122 may implement various types of noise canceling algorithms to cancel noise generated during the process of receiving the external audio signal.
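The passage does not name a particular algorithm; as one hedged illustration only, a simple noise gate is among the simplest noise-suppression schemes a microphone path might apply. The threshold value and function name below are assumptions, not taken from the patent.

```python
# Illustrative noise gate (not the patent's algorithm): suppress samples
# whose amplitude stays below an assumed background-noise floor.
NOISE_FLOOR = 0.05  # illustrative amplitude threshold

def noise_gate(samples):
    """Zero out low-amplitude samples assumed to be background noise."""
    return [s if abs(s) >= NOISE_FLOOR else 0.0 for s in samples]
```

Real implementations typically use adaptive or spectral methods; the gate above only conveys the idea of discarding signal content attributed to noise.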
- the user input unit 130 may generate input data to control an operation of the mobile terminal 100.
- the user input unit 130 may be configured with a keypad, a dome switch, a touch pad (pressure/capacitance), a jog wheel, a jog switch, and/or the like.
- the sensing unit 140 may detect presence or absence of the user's contact, and a current status of the mobile terminal 100 such as an opened or closed configuration, a location of the mobile terminal 100, an orientation of the mobile terminal 100, an acceleration or deceleration of the mobile terminal 100, and the like.
- the sensing unit 140 may generate a sensing signal for controlling operation of the mobile terminal 100.
- when the mobile terminal 100 is a slide phone type, the sensing unit 140 may sense an opened or closed configuration of the slide phone.
- the sensing unit 140 may sense whether (or not) power is supplied from the power supply unit 190, or whether (or not) an external device is coupled to the interface unit 170.
- the sensing unit 140 may include a proximity sensor 141.
- the sensing unit 140 may include a touch sensor for sensing a touch operation with respect to the display unit 151. Other sensors may also be used.
- the touch sensor may be implemented as a touch film, a touch sheet, a touch pad, and/or the like.
- the touch sensor may be configured to convert changes of a pressure applied to a specific part of the display unit 151, or a capacitance generated from a specific part of the display unit 151, into electric input signals.
- the touch sensor may be configured to sense not only a touched position and a touched area, but also a touch pressure.
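The conversion described above can be sketched as follows. This is an illustrative model only: the grid representation, threshold value, and function name are assumptions, not the patent's implementation. It shows how per-cell capacitance (or pressure) deltas could be reduced to a single input signal carrying position, area, and pressure.

```python
# Hypothetical sketch: convert raw per-cell sensor deltas from a touch grid
# into one touch event with position, area, and pressure, as the passage
# describes. Threshold and names are illustrative assumptions.
THRESHOLD = 10  # minimum delta treated as a touch

def read_touch_event(grid):
    """grid: 2D list of sensor deltas; returns (x, y, area, pressure) or None."""
    touched = [(x, y, v)
               for y, row in enumerate(grid)
               for x, v in enumerate(row)
               if v >= THRESHOLD]
    if not touched:
        return None
    area = len(touched)                      # number of activated cells
    total = sum(v for _, _, v in touched)    # overall signal strength
    # signal-weighted centroid approximates the touched position
    cx = sum(x * v for x, _, v in touched) / total
    cy = sum(y * v for _, y, v in touched) / total
    pressure = total / area                  # mean delta as a pressure proxy
    return (cx, cy, area, pressure)
```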
- the display unit 151 may be used as an input device (rather than an output device).
- the display unit 151 may be referred to as a touch screen.
- the corresponding signals may be transmitted to a touch controller.
- the touch controller may process signals transferred from the touch sensor, and then transmit data corresponding to the processed signals to the controller 180.
- the controller 180 may sense which region of the display unit 151 has been touched.
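The region decision made by the controller 180 can be illustrated as a lookup from processed coordinates into named display regions. The region names and bounds below are invented for illustration; the patent does not specify them.

```python
# Hedged sketch: after the touch controller forwards processed coordinates,
# the main controller decides which named region of the display was touched.
# Region names and rectangles are illustrative assumptions.
REGIONS = {
    "icon_region":   (0, 0, 480, 400),    # (x0, y0, x1, y1)
    "keypad_region": (0, 400, 480, 800),
}

def region_of(x, y):
    """Return the name of the display region containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```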
- proximity of a sensing object may be detected by changes of an electromagnetic field as the sensing object approaches.
- the touch screen may be categorized as a proximity sensor 141.
- the proximity sensor 141 may be a sensor for detecting presence or absence of a sensing object using an electromagnetic field or infrared rays without a mechanical contact.
- the proximity sensor 141 may have a longer lifespan and greater utility than a contact sensor.
- the proximity sensor 141 may include an optical transmission type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and/or the like.
- a behavior of closely approaching the touch screen without contact may be referred to as a proximity touch
- a behavior of substantially coming in contact with the touch screen may be referred to as a contact touch
- the proximity sensor 141 may sense proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output on the touch screen.
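The pattern attributes listed above (distance, direction, speed, time) can be derived from timestamped position samples reported by the sensor. The sketch below is an assumed illustration; the sample format and function name are not from the patent.

```python
# Illustrative sketch: derive a proximity-touch pattern (distance moved,
# direction, speed, duration) from timestamped (t, x, y) samples.
import math

def proximity_pattern(samples):
    """samples: list of (t, x, y); returns a dict of pattern attributes."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = math.hypot(dx, dy)
    duration = t1 - t0
    return {
        "distance": distance,
        "direction": math.degrees(math.atan2(dy, dx)),  # 0 deg = rightward
        "speed": distance / duration if duration else 0.0,
        "time": duration,
    }
```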
- the output unit 150 may generate an output related to visual, auditory, or tactile senses.
- the output unit 150 may include the display unit 151 (or display), an audio output module 152, an alarm unit 153 (or alarm), a haptic module 154, and/or the like.
- the display unit 151 may display (output) information processed in the mobile terminal 100. For example, when the mobile terminal 100 is operated in a phone call mode, the display unit 151 may display a user interface (UI) or graphic user interface (GUI) related to a phone call. When the mobile terminal 100 is operated in a video call mode or an image capturing mode, the display unit 151 may display a captured image, a received image, UI, GUI, and/or the like.
- the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a 3-dimensional (3D) display, and/or an e-ink display.
- At least one of those displays (or display devices) included in the display unit 151 may be configured as a transparent or optically transparent type to allow the user to view the outside therethrough. This may be referred to as a transparent display.
- An example of the transparent display may be a transparent OLED (TOLED), and/or the like. Under this configuration, the user may view an object positioned at a rear side of the mobile device body through a region occupied by the display unit 151 of the mobile device body.
- Two or more display units 151 may be provided according to implementation of the mobile terminal 100.
- a plurality of the display units 151 may be provided on one surface in a separate or integrated manner, or may be provided on different surfaces, respectively.
- the audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160, in a call-receiving mode, a call-placing mode, a recording mode, a voice recognition mode, a broadcast reception mode, and/or the like.
- the audio output module 152 may output an audio signal related to a function carried out in the mobile terminal 100 (e.g. sound alarming a call received or a message received, and the like).
- the audio output module 152 may include a receiver, a speaker, a buzzer, and/or the like.
- the alarm unit 153 may output signals notifying occurrence of an event from the mobile terminal 100.
- the examples of an event occurring from the mobile terminal 100 may include a call received, a message received, a key signal input, a touch input, and/or the like.
- the alarm unit 153 may output not only video or audio signals, but also other types of signals such as signals for notifying the occurrence of an event in a vibration manner. Since the video or audio signals may also be output through the display unit 151 or the audio output unit 152, the display unit 151 and the audio output module 152 may be categorized as part of the alarm unit 153.
- the haptic module 154 may generate various tactile effects that can be felt by the user.
- a representative example of the tactile effects generated by the haptic module 154 may include vibration.
- Vibration generated by the haptic module 154 may have a controllable intensity, a controllable pattern, and/or the like. For example, different vibrations may be output in a synthesized manner or in a sequential manner.
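The two output modes mentioned above (sequential vs. synthesized) can be modeled as operations on per-tick intensity lists. This is a hedged illustration; the representation and function names are assumptions, not the patent's.

```python
# Illustrative sketch: vibration patterns as lists of per-tick intensities
# (0-100). Patterns may be played one after another or overlaid.
def sequential(*patterns):
    """Play patterns one after another."""
    out = []
    for p in patterns:
        out.extend(p)
    return out

def synthesized(*patterns):
    """Overlay patterns tick by tick, clamping intensity to 100."""
    length = max(len(p) for p in patterns)
    return [min(100, sum(p[i] if i < len(p) else 0 for p in patterns))
            for i in range(length)]
```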
- the haptic module 154 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moved with respect to a skin surface being touched, air injection force or air suction force through an injection port or suction port, touch by a skin surface, contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or heat emitting device, and/or the like.
- the haptic module 154 may transmit tactile effects through the user's direct contact, or the user's muscular sense using a finger or a hand. Two or more haptic modules 154 may be provided according to configuration of the mobile terminal 100.
- the memory 160 may store a program for operating the controller 180, or may temporarily store input/output data (e.g. phonebooks, messages, still images, moving images, and the like).
- the memory 160 may store data related to various patterns of vibrations and sounds outputted when performing a touch input on the touch screen.
- the memory 160 may be implemented using any type of suitable storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-only Memory (EEPROM), Programmable Read-only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like.
- the mobile terminal 100 may operate a web storage that performs the storage function of the memory 160 on the Internet.
- the interface unit 170 may interface the mobile terminal 100 with external devices.
- the interface unit 170 may allow a data reception from an external device, a power delivery to each component in the mobile terminal 100, or a data transmission from the mobile terminal 100 to an external device.
- the interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for coupling devices having an identification module, audio Input/Output (I/O) ports, video I/O ports, earphone ports, and/or the like, for example.
- the identification module may be a chip for storing various information required to authenticate an authority to use the mobile terminal 100, which may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and the like.
- the device having the identification module (hereafter also referred to as an identification device) may be implemented in a type of smart card.
- the identification device may be coupled to the mobile terminal 100 via a port.
- the interface unit 170 may serve as a path for power to be supplied from an external cradle to the mobile terminal 100 when the mobile terminal 100 is connected to the external cradle or as a path for transferring various command signals inputted from the cradle by a user to the mobile terminal 100.
- Such various command signals or power inputted from the cradle may operate as signals for recognizing that the mobile terminal 100 has accurately been mounted to the cradle.
- the controller 180 may control overall operations of the mobile terminal 100. For example, the controller 180 may perform the control and processing related to telephony calls, data communications, video calls, and/or the like.
- the controller 180 may include a multimedia module 181 that provides multimedia playback.
- the multimedia module 181 may be configured as part of the controller 180 or as a separate component.
- the controller 180 can perform a pattern recognition processing so as to recognize a handwriting or drawing input on the touch screen as text or image.
- the power supply unit 190 may receive external or internal power to provide power required by various components under control of the controller 180.
- embodiments may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units designed to perform the functions described herein.
- embodiments such as procedures or functions may be implemented together with separate software modules that allow performing of at least one function or operation.
- Software codes may be implemented by a software application written in any suitable programming language. The software codes may be stored in the memory 160 and executed by the controller 180.
- the user input unit 130 may be manipulated to receive a command for controlling operation of the mobile terminal 100, and may include a plurality of manipulation units.
- the manipulation units may be commonly designated as a manipulating portion, and any method may be employed as long as it allows the user to perform manipulation with a tactile feeling.
- visual information may be displayed on the display unit 151 in the form of a character, a numeral, a symbol, a graphic, an icon, and/or the like.
- a character, a numeral, a symbol, a graphic, and an icon may be displayed with a predetermined arrangement so as to be implemented in the form of a keypad.
- a keypad may be referred to as a soft key.
- the display unit 151 may operate over its entire region, or may operate by being divided into a plurality of regions. In the latter case, the plurality of regions may be configured to operate in an associative way. For example, an output window and an input window may be displayed on the upper and lower portions of the display unit 151, respectively. The output window and the input window may be regions allocated to output and input information, respectively. A soft key on which numerals for inputting a phone number or the like are displayed may be output on the input window. When the soft key is touched, a numeral corresponding to the touched soft key is displayed on the output window. When the first manipulation unit is manipulated, a phone call connection for the phone number displayed on the output window may be attempted, or a text displayed on the output window may be entered into an application.
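The soft-key behavior described above can be sketched as a small state model: touching a numeral soft key echoes the numeral on the output window, and the first manipulation unit then attempts a call for the displayed number. Class and method names below are hypothetical illustrations, not the patent's.

```python
# Illustrative model of the output-window / input-window soft keypad.
class SoftKeypad:
    def __init__(self):
        self.output_window = ""   # digits entered so far
        self.dialed = None

    def touch_soft_key(self, numeral):
        # a touched soft key echoes its numeral on the output window
        self.output_window += numeral

    def press_first_manipulation_unit(self):
        # attempts a call connection for the number on the output window
        self.dialed = self.output_window
        return self.dialed
```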
- the display unit 151 or touch pad may be configured to sense a touch scroll.
- the user may move an object displayed on the display unit 151, for example, a cursor or pointer provided on an icon or the like, by scrolling the display unit 151 or touch pad.
- when a finger is moved on the display unit 151 or the touch pad, the path traced by the finger may be visually displayed on the display unit 151. This may be useful for editing an image displayed on the display unit 151.
- when the display unit 151 and the touch pad are touched together within a predetermined period of time, one function of the terminal 100 may be executed. An example of being touched together is when the user clamps a body of the mobile terminal 100 using his or her thumb and forefinger.
- the function executed in the mobile terminal 100 may be, for example, activation or de-activation of the display unit 151 or the touch pad.
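One way to picture this gesture is as a detector that toggles display activation when both surfaces are touched within a short interval. The window length, names, and toggle behavior below are assumptions for illustration only.

```python
# Hedged sketch: toggle display activation when the display and the touch
# pad are touched within a short interval of each other (a "clamp").
CLAMP_WINDOW = 0.3  # seconds; illustrative "predetermined period"

class ClampDetector:
    def __init__(self):
        self.display_active = True
        self.last = {}  # surface name -> last touch timestamp

    def touch(self, surface, t):
        self.last[surface] = t
        other = "touch_pad" if surface == "display" else "display"
        if other in self.last and abs(t - self.last[other]) <= CLAMP_WINDOW:
            self.display_active = not self.display_active  # toggle on clamp
        return self.display_active
```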
- FIGs. 2A and 2B are perspective views illustrating an external appearance of a mobile terminal 100.
- FIG. 2A is a front and a side view illustrating the mobile terminal 100.
- FIG. 2B is a rear and other side view illustrating the mobile terminal 100. Other arrangements and views may also be provided.
- the mobile terminal 100 may be provided with a bar-type terminal body.
- embodiments are not only limited to this type of terminal, but are also applicable to various structures of terminals such as a slide type, a folder type, a swivel type, a swing type, and the like, in which two and more bodies are combined with each other in a relatively movable manner.
- the terminal body may include a case (casing, housing, cover, etc.) forming an appearance of the terminal.
- the case or housing may be separated into a front case 101 and a rear case 102.
- Various electronic components may be integrated in a space formed between the front case 101 and the rear case 102.
- At least one middle case may be additionally disposed between the front case 101 and the rear case 102.
- the cases may be formed by injection-molding a synthetic resin or may be also formed of a metal material such as stainless steel (STS), titanium (Ti), or the like.
- STS stainless steel
- Ti titanium
- the display unit 151, the audio output module 152, the camera 121, the user input unit 130, the microphone 122, the interface unit 170, and the like may be arranged on (or at) the terminal body, mainly at or on the front case 101.
- the display unit 151 may occupy most of the front case 101.
- the audio output unit 152 and the camera 121 may be disposed at a region adjacent to one of the two ends of the display unit 151, and the user input unit 131 and the microphone 122 may be disposed at a region adjacent to the other end thereof.
- the user input unit 132, the interface unit 170, and/or the like, may be disposed on a lateral surface of the front case 101 and the rear case 102.
- the user input unit 130 may be manipulated to receive a command for controlling the operation of the mobile terminal 100.
- the user input unit 130 may include a plurality of manipulation units 131, 132.
- the manipulation units 131, 132 may receive various commands.
- the first manipulation unit 131 may be used to receive a command, such as start, end, scroll, and/or the like.
- the second manipulation unit 132 may be used to receive a command, such as controlling a volume level being outputted from the audio output unit 152, or switching it into a touch recognition mode of the display unit 151.
- a camera 121' may be additionally mounted at or on a rear surface of the terminal body, namely the rear case 102.
- the rear camera 121' may have an image capturing direction, which is substantially opposite to the direction of the front camera 121, and may have a different number of pixels from those of the front camera 121.
- the front camera 121 may be configured to have a relatively small number of pixels
- the rear camera 121' may be configured to have a relatively large number of pixels. Accordingly, in an example where the front camera 121 is used for video communication, it may be possible to reduce a size of transmission data when the user captures his or her own face and sends it to the other party in real time.
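The transmission-size saving claimed above follows from simple arithmetic on uncompressed frame sizes. The resolutions and bit depth below are illustrative assumptions, not values from the patent.

```python
# Rough arithmetic: a lower-pixel front camera shrinks the per-frame
# payload sent during video communication. Values are assumptions.
def frame_bytes(width, height, bits_per_pixel=24):
    """Uncompressed size of one frame in bytes."""
    return width * height * bits_per_pixel // 8

front = frame_bytes(320, 240)    # low-pixel front camera
rear = frame_bytes(2560, 1920)   # high-pixel rear camera
ratio = rear // front            # how much larger each rear-camera frame is
```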
- the rear camera 121' may be used for the purpose of storing high quality images.
- the cameras 121, 121' may be provided in the terminal body in a rotatable and pop-upable manner.
- a flash 123 and a mirror 124 may be additionally disposed adjacent to the rear camera 121'.
- the flash 123 may illuminate light toward an object when capturing the object with the camera 121'.
- the mirror 124 may allow the user to look at his or her own face, or the like, in a reflected way when capturing himself or herself (in a self-portrait mode) by using the rear camera 121'.
- a rear audio output unit 152' may be additionally disposed at or on a rear surface of the terminal body.
- the rear audio output unit 152' together with the front audio output unit 152 may implement a stereo function, and it may be also used to implement a speaker phone mode during a phone call.
- An antenna 116 for receiving broadcast signals may be additionally disposed at or on a lateral surface of the terminal body.
- the antenna 116 constituting part of a broadcast receiving module 111 may be provided so as to be pulled out from the terminal body.
- a power supply unit 190 (or power supply) for supplying power to the mobile terminal 100 may be mounted at or on the terminal body.
- the power supply unit 190 may be configured so as to be incorporated in the terminal body, or may be directly detachable from the outside of the terminal body.
- a touch pad 135 for detecting a touch may be additionally mounted at or on the rear case 102.
- the touch pad 135 may be also configured with an optical transmission type, similar to the display unit 151.
- a rear display unit for displaying visual information may be additionally mounted at or on the touch pad 135. Information displayed on both the front display unit 151 and the rear display unit may be controlled by the touch pad 135.
- the touch pad 135 may operate in conjunction with the display unit 151 of the front case 101.
- the touch pad 135 may be disposed in parallel at a rear side of the display unit 151.
- the touch pad 135 may have a same size as or a smaller size than the display unit 151.
- the user may generate an input signal using a touch sensor provided at the display unit 151 of the mobile terminal 100.
- the mobile terminal 100 may activate the touch sensor only when the display unit 151 is activated, and there is a limit in generating various input signals.
- a mobile terminal capable of generating an input signal using various devices and a control method thereof may be described with reference to the accompanying drawings.
- FIGs. 3A and 3B are perspective views illustrating an external appearance of the mobile terminal 100 according to the present disclosure. A front surface and a lateral surface of the mobile terminal 100 are shown in FIGs. 3A and 3B . Other embodiments and views may also be provided.
- the mobile terminal 100 may include the display unit 151 and the user interface 130.
- the user interface 130 may include a squeeze sensor 231 (or sensor) and a touch sensor 232.
- the display unit 151 may occupy most of the front case 101.
- the audio output unit 152 and the camera 121 may be located in a region adjacent to one end portion of the display unit 151.
- the user interface 130 and the microphone 122 may be located in a region adjacent to the other end portion of the display unit 151.
- the touch sensor 232 may be disposed at a front surface of the display unit 151, and may be formed to sense a touch input.
- the squeeze sensor 231 may be disposed at a lateral surface of the body or housing, and may be formed to sense a pressure greater than a predetermined value being applied thereto. The pressure greater than the predetermined value may be referred to as an activation pressure.
- the squeeze sensor 231 may be located at a lateral surface of the front case 101 and the rear case 102 (or the lateral side of the housing).
- the squeeze sensor 231 may be located at the other lateral surface of the front case 101 and the rear case 102.
- the squeeze sensor 231 may be located at two lateral surfaces of the front case 101 and the rear case 102, or the squeeze sensor 231 may be located at all four lateral surfaces thereof.
- the squeeze sensor 231 may include a plurality of squeeze sensors.
- the plurality of squeeze sensors may be located at a lateral surface of the front case 101 and the rear case 102 (i.e., a lateral side of the housing).
- the controller 180 may detect (or determine) a pressure generated according to a location at which each finger is placed.
- the squeeze sensor 231 may detect (or determine) a squeeze state that is generated by fingers applying pressures above a predetermined value (i.e., above the activation pressure).
- the mobile terminal 100 may distinguish a grip state and a squeeze state based on a size (or amount) of a pressure applied to the squeeze sensor 231. For example, when a pressure that is less than a predetermined value is applied to the squeeze sensor 231, it may be referred to as a grip state. When a pressure greater than a predetermined value is applied to the squeeze sensor 231, it may be referred to as a squeeze state.
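The grip/squeeze distinction described above reduces to a threshold comparison. The sketch below illustrates it; the threshold value and function name are assumptions for illustration only, not values from the disclosure.

```python
# Illustrative sketch: a pressure at or below the predetermined value is a
# grip, a pressure above it is a squeeze. Units and threshold are arbitrary.

ACTIVATION_PRESSURE = 5.0  # predetermined value (illustrative units)

def classify_hold(pressure: float) -> str:
    """Return 'squeeze' when the pressure exceeds the activation value,
    otherwise 'grip'."""
    if pressure > ACTIVATION_PRESSURE:
        return "squeeze"
    return "grip"
```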
- the controller 180 may perform a control operation according to an input applied to the squeeze sensor 231 in a squeeze state.
- FIGs. 4 and 5 are flow charts of a method of a mobile terminal 100 according to an example embodiment. Other embodiments and operations may also be provided. Embodiments may be described with respect to a mobile terminal 200, which includes components of the mobile terminal 100.
- the mobile terminal 200 may include a display unit 251 (corresponding to the display unit 151), the squeeze sensor 231, the touch sensor 232, and the controller 180.
- a pressure greater than a predetermined value may be applied to the squeeze sensor 231 (S110).
- the main body or housing may include a front surface, a rear surface and a lateral surface.
- the display unit 251 may be disposed at the front surface of the body to display an object.
- the squeeze sensor 231 may include at least one squeeze sensor.
- the squeeze sensor 231 may be disposed at the lateral surface of the body (or housing).
- the squeeze sensor 231 may be disposed only at the lateral surface of the body (or housing), and/or may be disposed at all lateral surfaces of the body (or housing).
- the squeeze sensor 231 may be disposed at a first lateral surface and a second lateral surface that cross each other, respectively, among the lateral surfaces of the body, and/or the squeeze sensor 231 may be disposed at a first and a third lateral surface that do not cross each other, respectively, among the lateral surfaces of the body.
- the squeeze sensor 231 may convert a pressure applied to a specific portion into an electrical input signal.
- the squeeze sensor 231 may detect (or determine) a size (or amount) of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area.
- the display unit 251 may display an indicator that indicates at least one of a size (or amount) of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area.
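The pressure attributes listed above can be modeled as a small record that an indicator could render; the field names below are assumptions for illustration, not identifiers from the disclosure.

```python
# Hypothetical record of one pressure event, covering the attributes the
# squeeze sensor is described as detecting: size, frequency, time, position,
# and area of the applied pressure.
from dataclasses import dataclass

@dataclass
class PressureEvent:
    size: float       # magnitude of the applied pressure
    frequency: int    # how many times pressure was applied
    duration: float   # pressure applied time, in seconds
    position: int     # index of the squeeze sensor that was pressed
    area: float       # contact area of the applied pressure
```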
- a location of an object displayed on the display unit 251 may change based on a pressure applied position (S120).
- the controller 180 may detect (or determine) one of the squeeze sensors to which the pressure is applied. The controller 180 may recognize a pressure applied position based on the detected result.
- the controller 180 may then change the location of an object displayed on the display unit 251 based on the pressure applied position.
- the controller 180 may determine a priority of each of the plurality of objects, and may change the location of the plurality of objects based on the priority of each of the objects and the pressure applied position. For example, based on a use frequency of an object, the object with a high use frequency may be disposed adjacent to the pressure applied position, and the object with a low use frequency may be disposed far away from the pressure applied position.
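The priority-based rearrangement can be sketched as matching the most frequently used objects to the display slots nearest the pressed position. The data shapes and function name below are assumptions for illustration.

```python
# Illustrative sketch: rank objects by use frequency, rank display slots by
# distance to the pressure applied position, and pair them up so high-use
# objects land nearest the user's finger.

def arrange_by_priority(objects, use_counts, slot_positions, press_pos):
    """objects: ids; use_counts: id -> use frequency; slot_positions: slot
    coordinates; press_pos: coordinate of the applied pressure.
    Returns a mapping of object id -> assigned slot coordinate."""
    ranked = sorted(objects, key=lambda o: use_counts[o], reverse=True)
    slots = sorted(slot_positions, key=lambda s: abs(s - press_pos))
    return dict(zip(ranked, slots))
```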
- the controller 180 may determine whether the user's hand (holding the body) is the user's left hand, the user's right hand, or both hands based on the pressure applied position. The controller 180 may change the location of an object displayed on the display unit 251 based on the determined result.
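One plausible heuristic for the hand determination: a one-handed grip typically places the thumb on one lateral surface and several fingers on the opposite one. The rule and names below are assumptions for illustration only; the disclosure does not specify the algorithm.

```python
# Hypothetical heuristic: one contact on one side plus several on the other
# suggests a single-hand grip (the single contact being the thumb); similar
# counts on both sides suggest both hands.

def detect_hand(left_contacts: int, right_contacts: int) -> str:
    if left_contacts >= 1 and right_contacts >= 1:
        if left_contacts == 1 and right_contacts > 1:
            return "left"   # thumb on the left lateral surface
        if right_contacts == 1 and left_contacts > 1:
            return "right"  # thumb on the right lateral surface
        return "both"
    return "unknown"
```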
- the squeeze sensor 231 may sense a pressure greater than a predetermined value (S210).
- when a pressure greater than a predetermined value is applied to the squeeze sensor 231, sensitivity of the touch sensor 232 may be adjusted to sense a touch input applied to a lateral surface of the body (S220).
- the controller 180 may generate a control command for adjusting sensitivity of the touch sensor 232. Accordingly, the controller 180 may increase sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to the lateral surface of the body in addition to the front surface of the body.
- the display unit 251 may display an indicator indicating a sensitivity of a touch sensor.
- the touch sensor 232 may sense a stylus pen even if the display unit 251 is separated from the stylus pen by a predetermined distance.
- the controller 180 may restore the sensitivity of the touch sensor 232 when a predetermined time has passed after a pressure greater than a predetermined value is applied to the squeeze sensor 231.
- the controller 180 may restore the sensitivity of the touch sensor 232 when the pressure greater than a predetermined value is applied again.
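The boost-and-restore behavior of the touch sensitivity (raised on a squeeze, restored after a predetermined time or on a repeated squeeze) can be sketched as a small state machine. The class name, levels, and timeout are assumptions for illustration.

```python
# Illustrative sketch: squeezing boosts touch sensitivity so lateral touches
# register; sensitivity reverts after a timeout or when squeezed again.

class TouchSensitivity:
    NORMAL, BOOSTED = 1.0, 2.0
    TIMEOUT = 5.0  # predetermined time in seconds (illustrative)

    def __init__(self):
        self.level = self.NORMAL
        self.boost_time = None

    def on_squeeze(self, now: float):
        if self.level == self.BOOSTED:
            self.level = self.NORMAL      # squeezed again: restore
            self.boost_time = None
        else:
            self.level = self.BOOSTED     # boost to sense lateral touches
            self.boost_time = now

    def tick(self, now: float):
        if self.level == self.BOOSTED and now - self.boost_time >= self.TIMEOUT:
            self.level = self.NORMAL      # predetermined time passed: restore
            self.boost_time = None
```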
- a location of an object displayed on the display unit 251 may be changed based on a position at which the touch input is sensed (S230).
- the controller 180 may determine a priority of each of the plurality of objects, and the controller 180 may change the location of the plurality of objects based on the priority of each of the objects and the position at which the touch input is sensed. For example, based on a use frequency of an object, the object with a high use frequency may be disposed adjacent to the position at which the touch input is sensed, and the object with a low use frequency may be disposed far away from the position at which the touch input is sensed.
- the controller 180 may determine whether the user's hand (holding the body) is the user's left hand, the user's right hand, or both hands based on the position at which the touch input is sensed. The controller 180 may change the location of an object displayed on the display unit 251 based on the determined result.
- the location of the user's finger may be sensed by the squeeze sensor 231 and the location of objects displayed on the display unit 251 may change according to the location of the finger, thereby allowing the user to conveniently touch objects. Accordingly, the user may control the mobile terminal 100 even with only one hand.
- a pressure applied to the squeeze sensor 231 and a pressure applied to the touch sensor 232 may be distinguished from each other, thereby reducing the user's input error. As a result, a user's convenience may be enhanced.
- FIGs. 6 and 7 are views illustrating an operation example of the mobile terminal 200.
- FIG. 6 shows that the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the display unit 251 may display a dock screen 252, and the dock screen 252 may include a plurality of objects.
- the controller 180 may recognize (or determine) a pressure applied position by detecting one of the plurality of squeeze sensors to which the pressure is applied. The controller 180 may then change the location of a plurality of objects contained in the dock screen 252. Accordingly, the plurality of objects contained in the dock screen 252 may be displayed in a region of the display unit 251 that is adjacent to the pressure applied position.
- the controller 180 may increase a sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to a front surface of the body (or housing).
- the controller 180 may then change a location of the plurality of objects contained in the dock screen 252 based on a position at which a touch input is sensed. Accordingly, the plurality of objects contained in the dock screen 252 may be displayed in a region of the display unit 251 that is adjacent to the position at which a touch input is sensed.
- the display unit 251 may display a plurality of objects.
- the object may be one of an icon, a widget, a thumbnail image, or an application execution menu.
- the controller 180 may recognize (or determine) a pressure applied position by detecting one of the plurality of squeeze sensors to which a pressure is applied. The controller 180 may then change the location of a plurality of objects based on the pressure applied position. The controller 180 may determine a priority of each of the plurality of objects, and change the location of a plurality of objects based on the priority of each of the objects.
- the controller 180 may increase a sensitivity of the touch sensor 232.
- the controller 180 may then change the location of a plurality of objects based on a position at which a touch input is sensed.
- the controller 180 may determine priority of each of the plurality of objects, and change the location of a plurality of objects based on the priority of each of the objects.
- FIGs. 8 and 9 are views illustrating an operation of the mobile terminal.
- FIG. 8 shows that the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the display unit 251 may display a plurality of objects.
- the controller 180 may recognize that the user's hand (holding the body) is the user's left hand based on a pressure applied position, as shown in FIG. 8 .
- the controller 180 may then change the location of a plurality of objects based on the pressure applied position. Accordingly, the user may easily control objects displayed on the display unit 251 with the user's left hand holding the body.
- the controller 180 may recognize (or determine) that the user's hand (holding the body) is the user's right hand, and the controller 180 may change a location of a plurality of objects based on a pressure applied position, as shown in FIG. 9 . Accordingly, the user may easily control objects displayed on the display unit 251 with the user's right hand holding the body.
- the controller 180 may increase a sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to a front surface of the body.
- the controller 180 may then recognize (or determine) that the user's hand (holding the body) is the user's left hand based on a position at which a touch input is sensed.
- the controller 180 may then change the location of a plurality of objects based on the position at which a touch input is sensed. Accordingly, the user may easily control objects displayed on the display unit 251 with the user's left hand holding the body.
- the controller 180 may recognize (or determine) that the user's hand (holding the body) is the user's right hand, and the controller 180 may change the location of a plurality of objects based on a position at which a touch input is sensed. Accordingly, the user may easily control objects displayed on the display unit 251 with the user's right hand holding the body.
- FIG. 10 is a view illustrating an operation of the mobile terminal 200.
- the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the display unit 251 may display a dock screen 252, and the dock screen 252 may include a plurality of objects.
- the controller 180 may generate different control commands based on a pattern of the applied pressure.
- the pattern of the applied pressure may include at least one of a size of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area.
- the controller 180 may increase a size of objects displayed on the display unit 251. As shown in FIG. 10 , the controller 180 may display, on the display unit 251, a progress bar 254 indicating a size of the applied pressure, or a pressure applied frequency.
- the controller 180 may reduce the size of objects displayed on the display unit 251 based on a pattern of the applied pressure.
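The enlarge/reduce behavior driven by the pressure pattern can be sketched as a small dispatch on one pattern attribute, here the pressure applied frequency. The mapping below is an assumption for illustration; the disclosure does not fix which pattern triggers which command.

```python
# Hypothetical mapping from pressure applied frequency to a zoom command:
# one press enlarges the displayed objects, two presses reduce them.

def zoom_command(press_count: int) -> str:
    if press_count == 1:
        return "enlarge"
    if press_count == 2:
        return "reduce"
    return "none"
```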
- FIG. 11 is a view illustrating an operation of the mobile terminal 200.
- the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the display unit 251 may display a lock screen in a lock state that restricts an input of a control command to an application.
- the controller 180 may recognize (or determine) a pattern of the applied pressure based on a pressure applied position. The controller 180 may then release the lock state when the pattern of the applied pressure corresponds to a predetermined pattern.
- the display unit 251 may display a home screen while releasing the lock state.
- the controller 180 may increase a sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to a front surface of the body. The controller 180 may then release the lock state when the pattern of the applied pressure corresponds to a predetermined pattern based on a position at which a touch input is sensed.
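The lock-release check above amounts to comparing the sequence of applied pressures against a predetermined pattern. The position labels and stored pattern below are assumptions for illustration.

```python
# Illustrative sketch: release the lock state only when the sequence of
# pressure applied positions matches the predetermined pattern exactly.

PREDETERMINED_PATTERN = ("top-left", "bottom-left", "top-right")

def try_unlock(applied_positions) -> bool:
    """Return True (lock state released) only on an exact pattern match."""
    return tuple(applied_positions) == PREDETERMINED_PATTERN
```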
- FIGs. 12 and 13 are views illustrating an operation of the mobile terminal.
- the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the controller 180 may generate a control command associated with the event based on a pattern of the pressure applied to the squeeze sensor 231.
- a popup window 255 containing the received text message may be displayed on the display unit 251.
- the controller 180 may generate a control command corresponding to the first pattern.
- the controller 180 may then execute an application associated with the event. Accordingly, the controller 180 may execute a text message application and display an execution screen on the display unit 251.
- the controller 180 may generate a control command corresponding to the second pattern. The controller 180 may then terminate the event. Accordingly, the controller 180 may terminate the text message application, and allow the popup window 255 that has been displayed on the display unit 251 to disappear.
- the controller 180 may increase a sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to a front surface of the body. The controller 180 may then execute an application associated with the event or terminate the event based on a pattern of the touch input.
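The event-handling branch above (one pattern opens the associated application, another terminates the event) can be sketched as a two-way dispatch. The pattern labels and return strings are assumptions for illustration.

```python
# Hypothetical dispatch: a "first" pressure pattern executes the application
# associated with the event; a "second" pattern terminates the event and
# dismisses its popup window.

def handle_event(pattern: str, event: str) -> str:
    if pattern == "first":
        return f"open:{event}"     # execute the associated application
    if pattern == "second":
        return f"dismiss:{event}"  # terminate the event, hide the popup
    return "ignore"
```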
- FIGs. 14 and 15 are views illustrating an operation of the mobile terminal.
- the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the display unit 251 may display first screen information.
- the controller 180 may change the first screen information to second screen information based on a pattern of the applied pressure.
- the controller 180 may generate a control command corresponding to the first pattern.
- the controller 180 may perform auto scrolling such that the first screen information and the second screen information contained in the same page are displayed on the display unit 251. Accordingly, the display unit 251 may display the second screen information by auto scrolling.
- the controller 180 may generate a control command corresponding to the second pattern.
- the controller 180 may perform an operation of turning over a page to display, on the display unit 251, the second screen information contained in a page different from that of the first screen information.
- the controller 180 may increase a sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to a front surface of the body. The controller 180 may then perform auto scrolling or display a next page on the display unit 251 based on a pattern of the touch input.
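The choice between auto scrolling and turning over a page depends on whether the second screen information lies in the same page as the first. A minimal sketch, with the page-index representation as an assumption:

```python
# Illustrative sketch: same page -> auto-scroll to the second screen
# information; different page -> turn the page instead.

def navigate(first_page: int, second_page: int) -> str:
    return "auto-scroll" if first_page == second_page else "turn-page"
```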
- FIG. 16 is a view illustrating an operation of the mobile terminal 200.
- the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- the content output unit may output a content.
- the controller 180 may change the output content to another content or adjust a volume level of the output audio based on the pattern of a pressure applied to the squeeze sensor 231.
- the controller 180 may generate a control command based on a pattern of the applied pressure. In other words, the controller 180 may change the output content to another content.
- the controller 180 may adjust a volume level of the output audio or change a channel of the output image based on a pattern of the applied pressure.
- the controller 180 may further transmit a call signal to a preset counterpart based on a pattern of the applied pressure. Accordingly, the user may transmit a call signal to a frequently contacted counterpart in a more convenient manner, and may also transmit a call signal at a higher speed in an emergency state.
- the controller 180 may increase a sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to a front surface of the body. The controller 180 may then change the output content to another content or adjust the volume level of the output content based on a pattern of the touch input.
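The content-control behaviors above (skip to another content, adjust volume, call a preset counterpart) can be sketched as one dispatch on the pressure pattern. The pattern names and command strings are assumptions for illustration; the disclosure does not fix the pattern-to-command assignment.

```python
# Hypothetical pattern-to-command table for content control: distinct
# pressure patterns change the content, adjust the volume, or place a call
# to a preset counterpart (e.g. in an emergency).

def content_command(pattern: str) -> str:
    commands = {
        "single": "volume-up",
        "double": "next-content",
        "long": "call-preset-counterpart",
    }
    return commands.get(pattern, "none")
```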
- An objective may be to provide a mobile terminal and control method thereof capable of generating an input signal using various devices.
- a mobile terminal may include a main body or housing (having a front surface, a rear surface and a lateral surface); a display unit disposed at a front surface of the body, and formed to display an object; and a squeeze sensor (unit) disposed at a lateral surface of the body to sense a pressure above a predetermined value being applied thereto.
- a controller may be configured to change a location of an object displayed on the display unit based on a pressure applied position when the pressure above a predetermined value (or activation pressure) is applied to the squeeze sensor (unit).
- the squeeze sensor (unit) may include at least one squeeze sensor disposed at a lateral surface of the body.
- the controller may detect one of the at least one squeeze sensor to which a pressure is applied, and the controller may recognize the pressure applied position according to the detected result when the pressure above a predetermined value is applied to the squeeze sensor (unit).
- the controller may determine whether the user's hand (holding the body) is his or her left hand, right hand, or both hands based on the pressure applied position.
- the controller may change the location of an object displayed on the display unit based on the determined result.
- the display unit may be formed to display a plurality of objects.
- the controller may determine a priority of each of the plurality of objects.
- the controller may change the location of the plurality of objects based on the priority of each of the objects and the pressure applied position.
- the controller may generate different control commands based on a pattern of the applied pressure when the pressure greater than a predetermined value is applied to the squeeze sensor unit.
- the pattern of the applied pressure may include at least one of a size of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area.
- the display unit may be formed to display a lock screen in a lock state that restricts an input of a control command to an application.
- the controller may release the lock state when the pattern of the applied pressure corresponds to a predetermined pattern in a state that the lock screen is displayed.
- the controller may execute an application associated with the event or terminate the event based on the pattern of the applied pressure.
- the display unit may be formed to display first screen information.
- the controller may change the first screen information to second screen information based on the pattern of the applied pressure in a state that the first screen information is displayed.
- the first and the second screen information may be contained in one page or contained in different first and second pages, respectively.
- the controller may change the first screen information to the second screen information by auto scrolling within the page when the first and the second screen information are contained in one page.
- the mobile terminal may further include a content output unit configured to output a content.
- the controller may change the output content to another content or adjust a volume level of the output audio based on the pattern of the applied pressure.
- the controller may display an indicator indicating at least one of a size of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area of the pressure being applied to the squeeze sensor (unit).
- a mobile terminal may include a main body (or housing) (having a front surface, a rear surface and a lateral surface); a display unit disposed at a front surface of the body, and formed to display an object; and a touch sensor (unit) disposed at a front surface of the display unit to sense a touch input.
- a squeeze sensor (unit) may be disposed at a lateral surface of the body to sense a pressure above a predetermined value (or activation pressure) being applied thereto.
- a controller may be configured to adjust a sensitivity of the touch sensor (unit) such that the touch sensor (unit) can sense a touch input applied to a lateral surface of the body when the pressure above a predetermined value is applied to the squeeze sensor (unit), and the controller may change the location of an object displayed on the display unit based on a position at which the touch input is sensed when the touch input is sensed at a lateral surface of the body.
- the controller may restore a sensitivity of the touch sensor unit when a predetermined time has passed after the pressure above a predetermined value is applied to the squeeze sensor (unit) or the pressure above a predetermined value is applied again.
- the controller may determine whether the user's hand (holding the body) is his or her left hand, right hand, or both hands based on a position at which the touch input is sensed at a lateral surface of the body, and the controller may change the location of an object displayed on the display unit based on the determined result.
- the display unit may be formed to display a plurality of objects.
- the controller may determine the priority of each of the plurality of objects, and may change the location of the plurality of objects based on the priority of each of the objects and the position at which the touch input is sensed at a lateral surface of the body.
- the display unit may be formed to display a lock screen in a lock state that restricts an input of a control command to an application.
- the controller may release the lock state when the pattern of the touch input sensed at a lateral surface of the body corresponds to a predetermined pattern in a state that the lock screen is displayed.
- the controller may execute an application associated with the event or terminate the event based on the pattern of the touch input sensed at a lateral surface of the body.
- the display unit may be formed to display first screen information.
- the controller may change the first screen information to second screen information based on the pattern of the touch input sensed at a lateral surface of the body in a state that the first screen information is displayed.
- the first and the second screen information may be contained in one page or contained in different first and second pages, respectively.
- the controller may change the first screen information to the second screen information by auto scrolling within the page when the first and the second screen information are contained in one page.
- the mobile terminal may further include a content output unit configured to output a content.
- the controller may change the output content to another content or adjust a volume level of the output audio based on the pattern of the touch input sensed at a lateral surface of the body.
- the controller may display an indicator indicating sensitivity of the touch sensor (unit) on the display unit.
- a control method of a mobile terminal may include allowing a squeeze sensor (unit) disposed at a lateral surface of the body to sense a pressure above a predetermined value being applied thereto; and changing the location of an object displayed on the display unit based on a pressure applied position when the pressure above a predetermined value is applied to the squeeze sensor (unit).
- a control method of a mobile terminal may also include allowing a squeeze sensor (unit) disposed at a lateral surface of the body to sense a pressure above a predetermined value being applied thereto; adjusting a sensitivity of the touch sensor (unit) such that the touch sensor (unit) can sense a touch input applied to a lateral surface of the body when the pressure above a predetermined value is applied to the squeeze sensor (unit); and changing a location of an object displayed on the display unit based on a position at which the touch input is sensed when the touch input is sensed at a lateral surface of the body.
- Embodiments of the foregoing method may be implemented as codes readable by a processor on a medium written by a program.
- Examples of the processor-readable media may include ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage device, and the like, and may also include a device implemented in a form of a carrier wave (for example, transmission via the Internet).
- any reference in this specification to "one embodiment," "an embodiment," "example embodiment," etc. means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention.
- the appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment.
Description
- The present disclosure may relate to a mobile terminal having a squeeze sensor.
- Terminals may be classified as mobile (or portable) terminals and stationary terminals based on their mobility. The mobile terminals may be further classified as handheld terminals and vehicle mount terminals based on whether (or not) the terminal can be directly carried by a user.
- A terminal may be allowed to capture still images or moving images, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated multimedia player. Improvement of structural or software elements of the terminal may be taken into consideration to support and enhance functions of the terminal.
- A user may generate an input signal using a touch sensor provided on a display unit of the terminal. However, the touch sensor may be activated only when the display unit is activated, and thus has a limit in generating various input signals.
- Arrangements and embodiments may be described in detail with reference to the following drawings in which like reference numerals refer to like elements and wherein:
-
FIG. 1 is a block diagram illustrating a mobile terminal; -
FIGs. 2A and 2B are perspective views illustrating an external appearance of the mobile terminal; -
FIGs. 3A and 3B are perspective views illustrating an external appearance of the mobile terminal; -
FIGs. 4 and 5 are flow charts for explaining a method of a mobile terminal according to an example embodiment of the present disclosure; and -
FIGs. 6 through 16 are views illustrating an operation example of the mobile terminal. -
FIG. 1 is a block diagram illustrating a mobile terminal according to an example arrangement. Other arrangements may also be provided. -
FIG. 1 shows that amobile terminal 100 may include awireless communication unit 110, an audio/video (A/V)input unit 120, auser input unit 130, asensing unit 140, anoutput unit 150, amemory 160, aninterface unit 170, acontroller 180, apower supply unit 190, and the like. However, constituent elements as shown inFIG. 1 are not necessarily required, and themobile terminal 100 may be implemented with greater or less number of elements than those illustrated elements. - The
wireless communication unit 110 may include one or more elements allowing radio communication between themobile terminal 100 and a wireless communication system, or allowing wireless (or radio) communication between themobile terminal 100 and a network in which themobile terminal 100 is located. For example, thewireless communication unit 110 may include abroadcast receiving module 111, amobile communication module 112, a wireless Internet module 113, a short-range communication module 114, alocation information module 115, and/or the like. - The
broadcast receiving module 111 may receive broadcast signals and/or broadcast associated information from an external broadcast management server through a broadcast channel. The broadcast associated information may be information regarding a broadcast channel, a broadcast program, a broadcast service provider, and/or the like. The broadcast associated information may also be provided through a mobile communication network. The broadcast associated information may be received by themobile communication module 112. The broadcast signal and broadcast-associated information received through thebroadcast receiving module 111 may be stored in thememory 160. - The
mobile communication module 112 may transmit and/or receive a radio signal to and/or from at least one of a base station, an external terminal and/or a server over a mobile communication network. The radio signal may include a voice call signal, a video call signal or various types of data according to text and/or multimedia message transmission and reception. - The wireless Internet module 113 may be a module for supporting wireless Internet access and may be built-in or externally installed to the
mobile terminal 100. A variety of wireless Internet access techniques may be used, such as WLAN (Wireless LAN), Wi-Fi, Wibro (Wireless Broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and/or the like. - The short-
range communication module 114 may be a module for supporting a short-range communication. A variety of short-range communication technologies may be used, such as Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra WideBand (UWB), ZigBee, and/or the like. - The
location information module 115 may be a module for acquiring a location (or position) of the mobile terminal 100. A GPS module may be an example of the location information module 115. - The A/V (audio/video)
input unit 120 may receive an audio or video signal, and the A/V (audio/video) input unit 120 may include a camera 121, a microphone 122, and the like. The camera 121 may process an image frame, such as still or moving images, obtained by an image sensor in a video phone call or an image capturing mode. The processed image frame may be displayed on a display unit 151 (or display). The image frames processed by the camera 121 may be stored in the memory 160 or may be transmitted to an external device through the wireless communication unit 110. Two or more cameras 121 may be provided according to a use environment of the mobile terminal 100. - The
microphone 122 may receive an external audio signal in a phone call mode, a recording mode, a voice recognition mode, and the like, and the microphone 122 may process the audio signal into electrical voice data. The processed voice data may be converted and outputted in a format that is transmittable to a mobile communication base station through the mobile communication module 112 in the phone call mode. The microphone 122 may implement various types of noise canceling algorithms to cancel noise generated during the process of receiving the external audio signal. - The
user input unit 130 may generate input data to control an operation of the mobile terminal 100. The user input unit 130 may be configured with a keypad, a dome switch, a touch pad (pressure/capacitance), a jog wheel, a jog switch, and/or the like. - The
sensing unit 140 may detect presence or absence of the user's contact, and a current status of the mobile terminal 100 such as an opened or closed configuration, a location of the mobile terminal 100, an orientation of the mobile terminal 100, an acceleration or deceleration of the mobile terminal 100, and the like. The sensing unit 140 may generate a sensing signal for controlling operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone type, the sensing unit 140 may sense an opened or closed configuration of the slide phone. The sensing unit 140 may sense whether (or not) power is supplied from the power supply unit 190, or whether (or not) an external device is coupled to the interface unit 170. - The
sensing unit 140 may include a proximity sensor 141. The sensing unit 140 may include a touch sensor for sensing a touch operation with respect to the display unit 151. Other sensors may also be used. - The touch sensor may be implemented as a touch film, a touch sheet, a touch pad, and/or the like. The touch sensor may be configured to convert changes of a pressure applied to a specific part of the
display unit 151, or a capacitance generated from a specific part of the display unit 151, into electric input signals. The touch sensor may be configured to sense not only a touched position and a touched area, but also a touch pressure. - When the touch sensor and the
display unit 151 form an interlayer structure, the display unit 151 may be used as an input device (rather than an output device). The display unit 151 may be referred to as a touch screen. - When there is a touch input through the touch screen, the corresponding signals may be transmitted to a touch controller. The touch controller may process signals transferred from the touch sensor, and then transmit data corresponding to the processed signals to the
controller 180. The controller 180 may sense which region of the display unit 151 has been touched. - When the touch screen is a capacitance type, the proximity of a sensing object may be detected by changes of an electromagnetic field. The touch screen may be categorized as a
proximity sensor 141. - The
proximity sensor 141 may be a sensor for detecting presence or absence of a sensing object using an electromagnetic field or infrared rays without a mechanical contact. The proximity sensor 141 may have a longer lifespan and more enhanced utility than a contact sensor. The proximity sensor 141 may include an optical transmission type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and/or the like. - Hereinafter, for ease of explanation, a behavior of closely approaching the touch screen without contact may be referred to as a proximity touch, whereas a behavior of substantially coming in contact with the touch screen may be referred to as a contact touch.
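The proximity-touch/contact-touch distinction just described can be illustrated with a short sketch. Python is used purely for illustration here; the function name, thresholds, and units are hypothetical assumptions, not part of the disclosure:

```python
# Hypothetical sketch: classify a sensed event as a proximity touch
# (object hovering near the touch screen) or a contact touch.
CONTACT_DISTANCE_MM = 0.0    # assumed distance at which contact occurs
PROXIMITY_RANGE_MM = 30.0    # assumed maximum hover-detection range

def classify_touch(distance_mm):
    """Return 'contact', 'proximity', or None for an object at the
    given distance from the touch screen."""
    if distance_mm <= CONTACT_DISTANCE_MM:
        return "contact"
    if distance_mm <= PROXIMITY_RANGE_MM:
        return "proximity"
    return None  # out of sensing range
```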
- The
proximity sensor 141 may sense proximity touch, and proximity touch patterns (e.g., distance, direction, speed, time, position, moving status, etc.). Information relating to the sensed proximity touch and the sensed proximity touch patterns may be output on the touch screen. - The
output unit 150 may generate an output related to visual, auditory, and tactile senses. The output unit 150 may include the display unit 151 (or display), an audio output module 152, an alarm unit 153 (or alarm), a haptic module 154, and/or the like. - The
display unit 151 may display (output) information processed in the mobile terminal 100. For example, when the mobile terminal 100 is operated in a phone call mode, the display unit 151 may display a user interface (UI) or graphic user interface (GUI) related to a phone call. When the mobile terminal 100 is operated in a video call mode or an image capturing mode, the display unit 151 may display a captured image, a received image, UI, GUI, and/or the like. - The
display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a 3-dimensional (3D) display, and/or an e-ink display. - At least one of those displays (or display devices) included in the
display unit 151 may be configured with a transparent or optically transparent type to allow the user to view the outside therethrough. It may be referred to as a transparent display. An example of the transparent display may be a transparent OLED (TOLED), and/or the like. Under this configuration, the user may view an object positioned at a rear side of the mobile device body through a region occupied by the display unit 151 of the mobile device body. - Two or
more display units 151 may be provided according to implementation of the mobile terminal 100. For example, a plurality of the display units 151 may be provided on one surface in a separate or integrated manner, or may be provided on different surfaces, respectively. - The
audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 160, in a call-receiving mode, a call-placing mode, a recording mode, a voice selection mode, a broadcast reception mode, and/or the like. The audio output module 152 may output an audio signal related to a function carried out in the mobile terminal 100 (e.g., a sound alarming that a call or a message has been received). The audio output module 152 may include a receiver, a speaker, a buzzer, and/or the like. - The
alarm unit 153 may output signals notifying occurrence of an event from the mobile terminal 100. Examples of an event occurring from the mobile terminal 100 may include a call received, a message received, a key signal input, a touch input, and/or the like. The alarm unit 153 may output not only video or audio signals, but also other types of signals such as signals for notifying the occurrence of an event in a vibration manner. Since the video or audio signals may also be output through the display unit 151 or the audio output module 152, the display unit 151 and the audio output module 152 may be categorized as part of the alarm unit 153. - The
haptic module 154 may generate various tactile effects that can be felt by the user. A representative example of the tactile effects generated by the haptic module 154 may include vibration. Vibration generated by the haptic module 154 may have a controllable intensity, a controllable pattern, and/or the like. For example, different vibrations may be output in a synthesized manner or in a sequential manner. - The
haptic module 154 may generate various tactile effects, including not only vibration, but also arrangement of pins vertically moved with respect to a skin surface being touched, air injection force or air suction force through an injection port or suction port, touch by a skin surface, contact with an electrode, effects by stimulus such as an electrostatic force, reproduction of cold or hot feeling using a heat absorbing device or heat emitting device, and/or the like. - The
haptic module 154 may transmit tactile effects through the user's direct contact, or the user's muscular sense using a finger or a hand. Two or more haptic modules 154 may be provided according to configuration of the mobile terminal 100. - The
memory 160 may store a program for operating the controller 180, or may temporarily store input/output data (e.g., phonebooks, messages, still images, moving images, and the like). The memory 160 may store data related to various patterns of vibrations and sounds outputted when performing a touch input on the touch screen. - The
memory 160 may be implemented using any type of suitable storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or XD memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like. The mobile terminal 100 may operate a web storage that performs the storage function of the memory 160 on the Internet. - The interface unit 170 (or interface) may interface the
mobile terminal 100 with external devices. The interface unit 170 may allow a data reception from an external device, a power delivery to each component in the mobile terminal 100, or a data transmission from the mobile terminal 100 to an external device. The interface unit 170 may include wired/wireless headset ports, external charger ports, wired/wireless data ports, memory card ports, ports for coupling devices having an identification module, audio Input/Output (I/O) ports, video I/O ports, earphone ports, and/or the like, for example. - The identification module may be a chip for storing various information required to authenticate an authority to use the
mobile terminal 100, which may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), and the like. The device having the identification module (hereafter also referred to as an identification device) may be implemented in a type of smart card. The identification device may be coupled to the mobile terminal 100 via a port. - The
interface unit 170 may serve as a path for power to be supplied from an external cradle to the mobile terminal 100 when the mobile terminal 100 is connected to the external cradle, or as a path for transferring various command signals inputted from the cradle by a user to the mobile terminal 100. Such various command signals or power inputted from the cradle may operate as signals for recognizing that the mobile terminal 100 has accurately been mounted to the cradle. - The
controller 180 may control overall operations of the mobile terminal 100. For example, the controller 180 may perform the control and processing related to telephony calls, data communications, video calls, and/or the like. The controller 180 may include a multimedia module 181 that provides multimedia playback. The multimedia module 181 may be configured as part of the controller 180 or as a separate component. The controller 180 can perform a pattern recognition processing so as to recognize a handwriting or drawing input on the touch screen as text or image. - The power supply unit 190 (or power supply) may receive external or internal power to provide power required by various components under control of the
controller 180. - Various embodiments described herein may be implemented in a computer or similar device readable medium using software, hardware, or any combination thereof.
- For hardware implementation, embodiments may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units designed to perform the functions described herein. Such embodiments may be implemented in the
controller 180 itself. - For software implementation, embodiments such as procedures or functions may be implemented together with separate software modules that allow performing of at least one function or operation. Software codes may be implemented by a software application written in any suitable programming language. The software codes may be stored in the
memory 160 and executed by the controller 180. - The method of processing a user input to the
mobile terminal 100 may now be described. - The
user input unit 130 may be manipulated to receive a command for controlling operation of the mobile terminal 100, and may include a plurality of manipulation units. The manipulation units may be commonly designated as a manipulating portion, and any method may be employed so long as it allows the user to perform manipulation with a tactile feeling. - Various kinds of visual information may be displayed on the
display unit 151. The visual information may be displayed in the form of a character, a numeral, a symbol, a graphic, an icon, and/or the like. For an input of the visual information, at least one of a character, a numeral, a symbol, a graphic, and an icon may be displayed with a predetermined arrangement so as to be implemented in the form of a keypad. Such a keypad may be referred to as a soft key. - The
display unit 151 may operate on an entire region or may operate by being separated into a plurality of regions. In the latter case, the plurality of regions may be configured to operate in an associative way. For example, an output window and an input window may be displayed on the upper and lower portions of the display unit 151, respectively. The output window and the input window may be regions allocated to output or input information, respectively. A soft key on which numerals for inputting a phone number or the like are displayed may be outputted on the input window. When the soft key is touched, a numeral corresponding to the touched soft key may be displayed on the output window. When the first manipulating unit is manipulated, a phone call connection for the phone number displayed on the output window may be attempted, or a text displayed on the output window may be entered into the application. - The
display unit 151 or touch pad may be configured to sense a touch scroll. The user may move an object displayed on the display unit 151, for example, a cursor or pointer provided on an icon or the like, by scrolling the display unit 151 or touch pad. When a finger is moved on the display unit 151 or a touch pad, a path being moved by the finger may be visually displayed on the display unit 151. This may be useful for editing an image displayed on the display unit 151. - In order to cope with an example where the
display unit 151 and the touch pad are touched together within a predetermined period of time, one function of the terminal 100 may be implemented. An example of being touched together is when the user clamps a body of the mobile terminal 100 using his or her thumb and forefinger. One of the functions implemented in the mobile terminal 100 may be, for example, an activation or de-activation of the display unit 151 or the touch pad. -
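The behavior described above (the display unit 151 and the touch pad being touched together within a predetermined period of time, triggering a function such as activating or de-activating the display) can be sketched as follows. The class name, the time window, and the toggled state are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch: if the front display and the rear touch pad are
# touched within a short window of each other, toggle one function of
# the terminal (here, activating or deactivating the display).
CLAMP_WINDOW_S = 0.3  # assumed "predetermined period of time", in seconds

class ClampDetector:
    def __init__(self):
        self.display_active = True
        self._last = {"display": None, "touchpad": None}

    def on_touch(self, surface, timestamp):
        """Record a touch on 'display' or 'touchpad'; toggle the display
        state when both surfaces are touched within the window."""
        self._last[surface] = timestamp
        a, b = self._last["display"], self._last["touchpad"]
        if a is not None and b is not None and abs(a - b) <= CLAMP_WINDOW_S:
            self.display_active = not self.display_active
            # Consume the pair so a single clamp toggles only once.
            self._last = {"display": None, "touchpad": None}
        return self.display_active
```

A touch on only one surface leaves the state unchanged; only a near-simultaneous pair toggles it.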
FIGs. 2A and 2B are perspective views illustrating an external appearance of a mobile terminal 100. FIG. 2A is a front and a side view illustrating the mobile terminal 100. FIG. 2B is a rear and other side view illustrating the mobile terminal 100. Other arrangements and views may also be provided. - Referring to
FIG. 2A, the mobile terminal 100 may be provided with a bar-type terminal body. However, embodiments are not only limited to this type of terminal, but are also applicable to various structures of terminals such as a slide type, a folder type, a swivel type, a swing type, and the like, in which two or more bodies are combined with each other in a relatively movable manner. - The terminal body may include a case (casing, housing, cover, etc.) forming an appearance of the terminal. The case or housing may be separated into a
front case 101 and a rear case 102. Various electronic components may be integrated in a space formed between the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102. - The cases may be formed by injection-molding a synthetic resin or may be also formed of a metal material such as stainless steel (STS), titanium (Ti), or the like.
- The
display unit 151, the audio output module 152, the camera 121, the user input unit 130, the microphone 122, the interface unit 170, and the like may be arranged on (or at) the terminal body, mainly at or on the front case 101. - The
display unit 151 may occupy most of the front case 101. The audio output unit 152 and the camera 121 may be disposed on a region adjacent to one of both ends of the display unit 151, and the user input unit 131 and the microphone 122 are disposed at or on a region adjacent to the other end thereof. The user input unit 132 and the interface 170, and/or the like, may be disposed on a lateral surface of the front case 101 and the rear case 102. - The
user input unit 130 is manipulated to receive a command for controlling the operation of the mobile terminal 100. The user input unit 130 may include a plurality of manipulation units 131, 132. - The
manipulation units 131, 132 may receive various commands. For example, the first manipulation unit 131 may be used to receive a command, such as start, end, scroll, and/or the like. The second manipulation unit 132 may be used to receive a command, such as controlling a volume level being outputted from the audio output unit 152, or switching it into a touch recognition mode of the display unit 151. - Referring to
FIG. 2B, a camera 121' may be additionally mounted at or on a rear surface of the terminal body, namely the rear case 102. The rear camera 121' may have an image capturing direction, which is substantially opposite to the direction of the front camera 121, and may have a different number of pixels from those of the front camera 121. - For example, the
front camera 121 may be configured to have a relatively small number of pixels, and the rear camera 121' may be configured to have a relatively large number of pixels. Accordingly, in an example where the front camera 121 is used for video communication, it may be possible to reduce a size of transmission data when the user captures his or her own face and sends it to the other party in real time. On the other hand, the rear camera 121' may be used for the purpose of storing high quality images. - The
cameras 121, 121' may be provided in the terminal body in a rotatable and pop-upable manner. - A
flash 123 and a mirror 124 may be additionally disposed adjacent to the rear camera 121'. The flash 123 may illuminate light toward an object when capturing the object with the camera 121'. The mirror 124 may allow the user to look at his or her own face, or the like, in a reflected way when capturing himself or herself (in a self-portrait mode) by using the rear camera 121'. - A rear audio output unit 152' may be additionally disposed at or on a rear surface of the terminal body. The rear audio output unit 152' together with the front
audio output unit 152 may implement a stereo function, and it may also be used to implement a speaker phone mode during a phone call. - An
antenna 116 for receiving broadcast signals may be additionally disposed at or on a lateral surface of the terminal body. The antenna 116 constituting part of a broadcast receiving module 111 may be provided so as to be pulled out from the terminal body. - A power supply unit 190 (or power supply) for supplying power to the
mobile terminal 100 may be mounted at or on the terminal body. The power supply unit 190 may be configured so as to be incorporated in the terminal body, or may be directly detachable from the outside of the terminal body. - A
touch pad 135 for detecting a touch may be additionally mounted at or on the rear case 102. The touch pad 135 may be also configured with an optical transmission type, similar to the display unit 151. Alternatively, a rear display unit for displaying visual information may be additionally mounted at or on the touch pad 135. Information displayed on both surfaces of the front display unit 151 and the rear display unit may be controlled by the touch pad 135. - The
touch pad 135 may operate in conjunction with the display unit 151 of the front case 101. The touch pad 135 may be disposed in parallel at a rear side of the display unit 151. The touch pad 135 may have the same size as or a smaller size than the display unit 151. - The user may generate an input signal using a touch sensor provided at the
display unit 151 of the mobile terminal 100. However, the mobile terminal 100 may activate the touch sensor only when the display unit 151 is activated, which limits the variety of input signals that can be generated. - A mobile terminal capable of generating an input signal using various devices, and a control method thereof, may be described with reference to the accompanying drawings.
-
FIGs. 3A and 3B are perspective views illustrating an external appearance of the mobile terminal 100 according to the present disclosure. A front surface and a lateral surface of the mobile terminal 100 are shown in FIGs. 3A and 3B. Other embodiments and views may also be provided. - The
mobile terminal 100 may include the display unit 151 and the user input unit 130. The user input unit 130 may include a squeeze sensor 231 (or sensor) and a touch sensor 232. - The
display unit 151 may occupy most of the front case 101. The audio output unit 152 and the camera 121 may be located in a region adjacent to one end portion of the display unit 151. The user input unit 130 and the microphone 122 may be located in a region adjacent to the other end portion of the display unit 151. - Referring to
FIGs. 3A and 3B, the touch sensor 232 may be disposed at a front surface of the display unit 151, and may be formed to sense a touch input. The squeeze sensor 231 may be disposed at a lateral surface of the body or housing, and may be formed to sense a pressure greater than a predetermined value being applied thereto. The pressure greater than the predetermined value may be an activation pressure. - More specifically, referring to
FIG. 3A, the squeeze sensor 231 may be located at a lateral surface of the front case 101 and the rear case 102 (or the lateral side of the housing). The squeeze sensor 231 may be located at the other lateral surface of the front case 101 and the rear case 102. In other words, the squeeze sensor 231 may be located at two lateral surfaces of the front case 101 and the rear case 102, or the squeeze sensor 231 may be located at all four lateral surfaces thereof. - Referring to
FIG. 3B, the squeeze sensor 231 may include a plurality of squeeze sensors. The plurality of squeeze sensors may be located at a lateral surface of the front case 101 and the rear case 102 (i.e., a lateral side of the housing). - Accordingly, when a pressure is applied to the
squeeze sensor 231 by a user's left hand or a user's right hand, the controller 180 may detect (or determine) a pressure generated according to a location at which each finger is placed. The squeeze sensor 231 may detect (or determine) a squeeze state that is generated by fingers applying pressures above a predetermined value (i.e., above the activation pressure). - The
mobile terminal 100 may distinguish a grip state and a squeeze state based on a size (or amount) of a pressure applied to the squeeze sensor 231. For example, when a pressure less than a predetermined value is applied to the squeeze sensor 231, it may be referred to as a grip state. When a pressure greater than the predetermined value is applied to the squeeze sensor 231, it may be referred to as a squeeze state. The controller 180 may perform a control operation according to an input applied to the squeeze sensor 231 in a squeeze state. -
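The grip/squeeze distinction above can be expressed as a simple threshold test. The sketch below is illustrative only; the threshold value, units, and function name are assumptions rather than values from the disclosure:

```python
# Hypothetical sketch: distinguish a grip state from a squeeze state by
# comparing the pressure applied to the squeeze sensor 231 against a
# predetermined value (the "activation pressure").
ACTIVATION_PRESSURE = 5.0  # assumed predetermined value, arbitrary units

def classify_pressure(pressure):
    """Return 'squeeze' when the pressure is at or above the activation
    pressure, 'grip' for a lighter hold, and 'none' for no contact."""
    if pressure <= 0:
        return "none"
    if pressure < ACTIVATION_PRESSURE:
        return "grip"
    return "squeeze"
```

Only an input classified as "squeeze" would trigger the control operations described in the following paragraphs.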
FIGs. 4 and 5 are flow charts of a method of a mobile terminal 100 according to an example embodiment. Other embodiments and operations may also be provided. Embodiments may be described with respect to a mobile terminal 200, which includes components of the mobile terminal 100. - The
mobile terminal 200 may include a display unit 251 (corresponding to the display unit 151), the squeeze sensor 231, the touch sensor 232, and the controller 180. - Referring to
FIG. 4, a pressure greater than a predetermined value may be applied to the squeeze sensor 231 (S110). - The main body or housing may include a front surface, a rear surface, and a lateral surface. The
display unit 251 may be disposed at the front surface of the body to display an object. - The
squeeze sensor 231 may include at least one squeeze sensor. The squeeze sensor 231 may be disposed at the lateral surface of the body (or housing). The squeeze sensor 231 may be disposed only at the lateral surface of the body (or housing), and/or may be disposed at all lateral surfaces of the body (or housing). The squeeze sensor 231 may be disposed at a first lateral surface and a second lateral surface that cross each other, respectively, among the lateral surfaces of the body, and/or the squeeze sensor 231 may be disposed at a first and a third lateral surface that do not cross each other, respectively, among the lateral surfaces of the body. - The
squeeze sensor 231 may convert a pressure applied to a specific portion into an electrical input signal. The squeeze sensor 231 may detect (or determine) a size (or amount) of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area. The display unit 251 may display an indicator that indicates at least one of a size (or amount) of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area. - When a pressure greater than a predetermined value is applied to the
squeeze sensor 231, a location of an object displayed on the display unit 251 may change based on a pressure applied position (S120). - More specifically, when a pressure greater than a predetermined value is applied to the squeeze sensor 231 (i.e., in case of a squeeze state), the
controller 180 may detect (or determine) one of the squeeze sensors to which the pressure is applied. The controller 180 may recognize a pressure applied position based on the detected result. - The
controller 180 may then change the location of an object displayed on the display unit 251 based on the pressure applied position. - More specifically, when the
display unit 251 includes a plurality of objects, the controller 180 may determine a priority of each of the plurality of objects, and may change the location of the plurality of objects based on the priority of each of the objects and the pressure applied position. For example, based on a use frequency of an object, the object with a high use frequency may be disposed adjacent to the pressure applied position, and the object with a low use frequency may be disposed far away from the pressure applied position. - On the other hand, the
controller 180 may determine whether the user's hand (holding the body) is the user's left hand, the user's right hand, or both hands based on the pressure applied position. The controller 180 may change the location of an object displayed on the display unit 251 based on the determined result. - Referring to
FIG. 5, the squeeze sensor 231 may sense a pressure greater than a predetermined value (S210). When a pressure greater than a predetermined value is applied to the squeeze sensor 231, the touch sensor 232 may adjust its sensitivity to sense a touch input applied to a lateral surface of the body (S220). - More specifically, when a pressure greater than a predetermined value is applied to the squeeze sensor 231 (i.e., in case of a squeeze state), the
controller 180 may generate a control command for adjusting sensitivity of the touch sensor 232. Accordingly, the controller 180 may increase sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to the lateral surface of the body in addition to the front surface of the body. The display unit 251 may display an indicator indicating a sensitivity of a touch sensor. On the other hand, the touch sensor 232 may sense a stylus pen even if the display unit 251 is separated from the stylus pen by a predetermined distance. - The
controller 180 may restore the sensitivity of the touch sensor 232 when a predetermined time has passed after a pressure greater than a predetermined value is applied to the squeeze sensor 231. The controller 180 may restore the sensitivity of the touch sensor 232 when the pressure greater than a predetermined value is applied again. - When a touch input is sensed at a lateral surface of the body, a location of an object displayed on the
display unit 251 may be changed based on a position at which the touch input is sensed (S230). - More specifically, when the
display unit 251 includes a plurality of objects, the controller 180 may determine a priority of each of the plurality of objects, and the controller 180 may change the location of the plurality of objects based on the priority of each of the objects and the position at which the touch input is sensed. For example, based on a use frequency of an object, the object with a high use frequency may be disposed adjacent to the position at which the touch input is sensed, and the object with a low use frequency may be disposed far away from the position at which the touch input is sensed. - On the other hand, the
controller 180 may determine whether the user's hand (holding the body) is the user's left hand, the user's right hand, or both hands based on the position at which the touch input is sensed. The controller 180 may change the location of an object displayed on the display unit 251 based on the determined result. - As described above, the location of the user's finger may be sensed by the
squeeze sensor 231 and the location of objects displayed on thedisplay unit 251 may change according to the location of the finger, thereby allowing the user to conveniently touch objects. Accordingly, the user may control themobile terminal 100 even with only one hand. - Further, a pressure applied to the
squeeze sensor 231 and a pressure applied to thetouch sensor 232 may be distinguished from each other, thereby reducing the user's input error. As a result, a user's convenience may be enhanced. -
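The sensitivity adjustment and restoration described above can be sketched as follows. This is a minimal Python sketch, not the disclosed implementation; the class names, threshold value, and timeout are all hypothetical illustrations of "a pressure above a predetermined value" and "a predetermined time".

```python
PRESSURE_THRESHOLD = 5.0   # hypothetical activation pressure
RESTORE_TIMEOUT = 10.0     # hypothetical "predetermined time" in seconds

class TouchSensor:
    def __init__(self):
        # When True, lateral-surface touches are sensed in addition
        # to front-surface touches.
        self.high_sensitivity = False

class Controller:
    def __init__(self, touch_sensor):
        self.touch = touch_sensor
        self.raised_at = None  # time at which sensitivity was raised

    def on_squeeze(self, pressure, now):
        """Handle a pressure reading from the squeeze sensor."""
        if pressure <= PRESSURE_THRESHOLD:
            return
        if self.touch.high_sensitivity:
            # A second above-threshold squeeze restores the sensitivity.
            self.touch.high_sensitivity = False
            self.raised_at = None
        else:
            self.touch.high_sensitivity = True
            self.raised_at = now

    def tick(self, now):
        """Restore the sensitivity once the predetermined time has passed."""
        if self.touch.high_sensitivity and now - self.raised_at > RESTORE_TIMEOUT:
            self.touch.high_sensitivity = False
            self.raised_at = None
```

Either restoration path (timeout via `tick`, or a repeated squeeze via `on_squeeze`) returns the touch sensor to its normal, front-surface-only sensitivity.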
FIGs. 6 and 7 are views illustrating an operation example of the mobile terminal 200. FIG. 6 shows that the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- The display unit 251 may display a dock screen 252, and the dock screen 252 may include a plurality of objects.
- When a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may recognize (or determine) the pressure applied position by detecting the squeeze sensor, among the plurality of squeeze sensors, to which the pressure is applied. The controller 180 may then change the location of the plurality of objects contained in the dock screen 252. Accordingly, the plurality of objects contained in the dock screen 252 may be displayed in a region of the display unit 251 that is adjacent to the pressure applied position.
- As another example, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may increase the sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to the front surface of the body (or housing). The controller 180 may then change the location of the plurality of objects contained in the dock screen 252 based on the position at which a touch input is sensed. Accordingly, the plurality of objects contained in the dock screen 252 may be displayed in a region of the display unit 251 that is adjacent to the position at which the touch input is sensed.
- Referring to FIG. 7, the display unit 251 may display a plurality of objects. Each object may be one of an icon, a widget, a thumbnail image, and/or an application execution menu.
- As one example, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may recognize (or determine) the pressure applied position by detecting the squeeze sensor, among the plurality of squeeze sensors, to which the pressure is applied. The controller 180 may then change the location of the plurality of objects based on the pressure applied position. The controller 180 may determine a priority of each of the plurality of objects, and change the location of the plurality of objects based on the priority of each of the objects.
- As another example, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may increase the sensitivity of the touch sensor 232. The controller 180 may then change the location of the plurality of objects based on the position at which a touch input is sensed. The controller 180 may determine a priority of each of the plurality of objects, and change the location of the plurality of objects based on the priority of each of the objects.
-
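The priority-based relocation described above (high-use-frequency objects placed nearest the sensed position) can be sketched as follows. The function and variable names are hypothetical; the disclosure only specifies the ordering principle, not a concrete data structure.

```python
def arrange_by_frequency(objects, touch_y, slots):
    """Place the most frequently used objects in the display slots
    nearest the sensed touch (or pressure) position.

    objects -- mapping of object name to use frequency (the priority)
    touch_y -- coordinate at which the touch/pressure was sensed
    slots   -- available y-coordinates on the display for the objects
    """
    # Slots ordered by distance from the sensed position (nearest first);
    # objects ordered by descending use frequency (highest priority first).
    ordered_slots = sorted(slots, key=lambda y: abs(y - touch_y))
    ordered_objects = sorted(objects, key=objects.get, reverse=True)
    return dict(zip(ordered_objects, ordered_slots))
```

With this ordering, the highest-frequency object always lands in the slot adjacent to the sensed position, and the lowest-frequency object in the farthest slot.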
FIGs. 8 and 9 are views illustrating an operation of the mobile terminal. FIG. 8 shows that the mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- The display unit 251 may display a plurality of objects.
- According to an example embodiment, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may recognize that the user's hand (holding the body) is the user's left hand based on the pressure applied position, as shown in FIG. 8. The controller 180 may then change the location of the plurality of objects based on the pressure applied position. Accordingly, the user may easily control objects displayed on the display unit 251 with the left hand holding the body.
- The controller 180 may recognize (or determine) that the user's hand (holding the body) is the user's right hand, and the controller 180 may change the location of the plurality of objects based on the pressure applied position, as shown in FIG. 9. Accordingly, the user may easily control objects displayed on the display unit 251 with the right hand holding the body.
- As another example, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, as shown in FIG. 8, the controller 180 may increase the sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to the front surface of the body. The controller 180 may then recognize (or determine) that the user's hand (holding the body) is the user's left hand based on the position at which the touch input is sensed. The controller 180 may then change the location of the plurality of objects based on that position. Accordingly, the user may easily control objects displayed on the display unit 251 with the left hand holding the body.
- As shown in FIG. 9, the controller 180 may recognize (or determine) that the user's hand (holding the body) is the user's right hand, and the controller 180 may change the location of the plurality of objects based on the position at which the touch input is sensed. Accordingly, the user may easily control objects displayed on the display unit 251 with the right hand holding the body.
-
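One plausible way to determine the holding hand from lateral-surface contacts, as described for FIGs. 8 and 9, is a contact-count heuristic: a left hand wraps its fingers around the right edge (several contacts) with the thumb on the left edge (one contact), and vice versa. This heuristic and its names are hypothetical; the disclosure states only that the determination is based on the sensed positions.

```python
def detect_holding_hand(left_contacts, right_contacts):
    """Infer which hand holds the body from the number of contact
    points sensed on each lateral surface (heuristic, hypothetical)."""
    if right_contacts > left_contacts:
        return "left"    # fingers on the right edge -> left-hand grip
    if left_contacts > right_contacts:
        return "right"   # fingers on the left edge -> right-hand grip
    return "both"        # similar counts on both edges -> two-hand grip
```

The controller could then shift the displayed objects toward the thumb side returned by this function.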
FIG. 10 is a view illustrating an operation of the mobile terminal 200. The mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- As shown in FIG. 10, the display unit 251 may display a dock screen 252, and the dock screen 252 may include a plurality of objects. When a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may generate different control commands based on a pattern of the applied pressure. The pattern of the applied pressure may include at least one of an amount of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area.
- For example, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may increase the size of objects displayed on the display unit 251. As shown in FIG. 10, the controller 180 may display, on the display unit 251, a progress bar 254 indicating the amount of the applied pressure or the pressure applied frequency.
- On the other hand, the controller 180 may reduce the size of objects displayed on the display unit 251 based on a pattern of the applied pressure.
-
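The pattern-to-command mapping and the progress bar 254 can be sketched as follows. The thresholds and the particular patterns (a long single squeeze enlarges, a double squeeze reduces) are hypothetical choices, since the disclosure leaves the concrete mapping open.

```python
def command_for_pattern(amount, frequency, duration):
    """Map a pressure pattern (amount, applied frequency, applied time)
    to a control command. Thresholds are hypothetical illustrations."""
    if frequency >= 2:
        return "reduce_objects"   # e.g. a double squeeze shrinks objects
    if duration >= 1.0:
        return "enlarge_objects"  # a single long squeeze enlarges them
    return "none"

def progress_fraction(amount, max_amount=10.0):
    """Fill level of a progress bar (like 254) for the applied pressure."""
    return min(amount / max_amount, 1.0)
```

A controller loop would call `command_for_pattern` once a squeeze ends, while updating `progress_fraction` continuously as the pressure is applied.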
FIG. 11 is a view illustrating an operation of the mobile terminal 200. The mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- The display unit 251 may display a lock screen in a lock state that restricts an input of a control command to an application.
- According to an example embodiment, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may recognize (or determine) a pattern of the applied pressure based on the pressure applied position. The controller 180 may then release the lock state when the pattern of the applied pressure corresponds to a predetermined pattern. The display unit 251 may display a home screen upon releasing the lock state.
- As another example, when a pressure greater than a predetermined value is applied to the squeeze sensor 231, the controller 180 may increase the sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to the front surface of the body. The controller 180 may then release the lock state when the pattern of the touch input, recognized based on the position at which the touch input is sensed, corresponds to a predetermined pattern.
-
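The lock release described for FIG. 11 reduces to comparing a sequence of sensed positions against a stored pattern. A minimal sketch, with a hypothetical stored pattern of left/right edge squeezes:

```python
PREDETERMINED_PATTERN = ("left", "right", "left")  # hypothetical stored pattern

def release_lock(pressure_positions, locked=True):
    """Return the new lock state: the lock is released (False) only when
    the sequence of pressure applied positions matches the stored pattern."""
    if locked and tuple(pressure_positions) == PREDETERMINED_PATTERN:
        return False  # unlocked; the home screen would then be displayed
    return locked
```

The same comparison applies to the touch-input variant: only the source of the position sequence (squeeze sensor vs. lateral touch sensor) changes.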
FIGs. 12 and 13 are views illustrating an operation of the mobile terminal. The mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- Referring to FIGs. 12 and 13, when an event occurs, the controller 180 may generate a control command associated with the event based on a pattern of the pressure applied to the squeeze sensor 231.
- For example, as shown in FIG. 12, when an event associated with receiving a text message occurs, a popup window 255 containing the received text message may be displayed on the display unit 251. When a pressure is applied to the squeeze sensor 231 in a first pattern, the controller 180 may generate a control command corresponding to the first pattern. The controller 180 may then execute an application associated with the event. Accordingly, the controller 180 may execute a text message application and display an execution screen on the display unit 251.
- Further, as shown in FIG. 13, when a pressure is applied to the squeeze sensor 231 in a second pattern, the controller 180 may generate a control command corresponding to the second pattern. The controller 180 may then terminate the event. Accordingly, the controller 180 may terminate the text message application and allow the popup window 255 that has been displayed on the display unit 251 to disappear.
- As another example, when a pressure is applied to the squeeze sensor 231 in a state in which the popup window 255 containing the received text message is displayed on the display unit 251, the controller 180 may increase the sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to the front surface of the body. The controller 180 may then execute an application associated with the event or terminate the event based on a pattern of the touch input.
-
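The event handling of FIGs. 12 and 13 can be sketched as a small dispatcher. Classifying the first pattern as one squeeze and the second as two is a hypothetical choice; the disclosure only distinguishes a first and a second pattern.

```python
def classify_pattern(squeeze_count):
    """Hypothetical classification: one squeeze is the first pattern,
    two squeezes the second."""
    return {1: "first", 2: "second"}.get(squeeze_count, "unknown")

def handle_event(squeeze_count, event="text_message"):
    """First pattern executes the application associated with the event;
    second pattern terminates the event (dismissing its popup window)."""
    pattern = classify_pattern(squeeze_count)
    if pattern == "first":
        return "execute_app:" + event   # open the associated application
    if pattern == "second":
        return "terminate:" + event     # dismiss the popup window
    return "no_op"
```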
FIGs. 14 and 15 are views illustrating an operation of the mobile terminal. The mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- Referring to FIGs. 14 and 15, the display unit 251 may display first screen information. The controller 180 may change the first screen information to second screen information based on a pattern of the applied pressure.
- For example, as shown in FIG. 14, when a pressure is applied to the squeeze sensor 231 in a first pattern while the first screen information is displayed on the display unit 251, the controller 180 may generate a control command corresponding to the first pattern. In other words, the controller 180 may perform auto scrolling such that the first screen information and the second screen information contained in the same page are displayed on the display unit 251. Accordingly, the display unit 251 can display the second screen information by auto scrolling.
- As shown in FIG. 15, when a pressure is applied to the squeeze sensor 231 in a second pattern, the controller 180 may generate a control command corresponding to the second pattern. The controller 180 may perform an operation of turning over a page to display, on the display unit 251, the second screen information contained in another page.
- As another example, when a pressure is applied to the squeeze sensor 231 while the first screen information is displayed on the display unit 251, the controller 180 may increase the sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to the front surface of the body. The controller 180 may then perform auto scrolling or display the next page on the display unit 251 based on a pattern of the touch input.
-
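The two navigation behaviors of FIGs. 14 and 15 (auto scrolling within a page vs. turning over a page) can be sketched as one function on a scroll offset and a page index. The step size and page height are hypothetical values.

```python
def navigate(pattern, offset, page, step=200, page_height=1000):
    """First pattern auto-scrolls within the current page; second
    pattern turns over to the next page. All values hypothetical."""
    if pattern == "first":
        return min(offset + step, page_height), page  # auto scrolling
    if pattern == "second":
        return 0, page + 1                            # turn over a page
    return offset, page
```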
FIG. 16 is a view illustrating an operation of the mobile terminal 200. The mobile terminal 200 may include the display unit 251, the squeeze sensor 231, the touch sensor 232, and the controller 180. Other embodiments and views may also be provided.
- Referring to FIG. 16, the content output unit may output content. The controller 180 may change the output content to another content, or adjust the volume level of the output audio, based on the pattern of a pressure applied to the squeeze sensor 231.
- For example, as shown in FIG. 16, when a pressure greater than a predetermined value is applied to the squeeze sensor 231 while the content output unit outputs content, the controller 180 may generate a control command based on a pattern of the applied pressure. In other words, the controller 180 may change the output content to another content. The controller 180 may also adjust the volume level of the output audio or change the channel of the output image based on a pattern of the applied pressure.
- Further, the controller 180 may transmit a call signal to a preset counterpart based on a pattern of the applied pressure. Accordingly, the user may transmit a call signal to a frequently contacted counterpart in a more convenient manner, and may also transmit a call signal more quickly in an emergency.
- As another example, when a pressure is applied to the squeeze sensor 231 while the content output unit outputs content, the controller 180 may increase the sensitivity of the touch sensor 232 to allow the touch sensor 232 to sense even a touch input applied to a lateral surface of the body in addition to the front surface of the body. The controller 180 may then change the output content to another content or adjust the volume level of the output content based on a pattern of the touch input.
- An objective may be to provide a mobile terminal and a control method thereof capable of generating an input signal using various devices.
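The content-output control of FIG. 16, including the emergency call shortcut, amounts to another pattern-to-command table. The pattern names and the particular assignments below are hypothetical; the disclosure leaves the mapping open.

```python
def media_command(pattern):
    """Hypothetical mapping from pressure patterns to content output
    control commands, including the preset-counterpart call shortcut."""
    commands = {
        "short": "next_content",   # change the output content
        "double": "volume_up",     # adjust the output audio volume
        "long": "call_preset",     # transmit a call signal to a preset counterpart
    }
    return commands.get(pattern, "none")
```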
- A mobile terminal may include a main body or housing (having a front surface, a rear surface and a lateral surface); a display unit disposed at a front surface of the body, and formed to display an object; and a squeeze sensor (unit) disposed at a lateral surface of the body to sense a pressure above a predetermined value being applied thereto. A controller may be configured to change a location of an object displayed on the display unit based on a pressure applied position when the pressure above a predetermined value (or activation pressure) is applied to the squeeze sensor (unit).
- The squeeze sensor (unit) may include at least one squeeze sensor disposed at a lateral surface of the body. The controller may detect one of the at least one squeeze sensor to which a pressure is applied, and the controller may recognize the pressure applied position according to the detected result when the pressure above a predetermined value is applied to the squeeze sensor (unit).
- The controller may determine whether the user's hand (holding the body) is his or her left hand, right hand, or both hands based on the pressure applied position. The controller may change the location of an object displayed on the display unit based on the determined result.
- The display unit may be formed to display a plurality of objects. The controller may determine a priority of each of the plurality of objects. The controller may change the location of the plurality of objects based on the priority of each of the objects and the pressure applied position.
- The controller may generate different control commands based on a pattern of the applied pressure when the pressure greater than a predetermined value is applied to the squeeze sensor (unit). The pattern of the applied pressure may include at least one of an amount of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area.
- The display unit may be formed to display a lock screen in a lock state that restricts an input of a control command to an application. The controller may release the lock state when the pattern of the applied pressure corresponds to a predetermined pattern in a state that the lock screen is displayed.
- When an event occurs, the controller may execute an application associated with the event or terminate the event based on the pattern of the applied pressure.
- The display unit may be formed to display first screen information. The controller may change the first screen information to second screen information based on the pattern of the applied pressure in a state that the first screen information is displayed.
- The first and the second screen information may be contained in one page, or contained in different first and second pages, respectively. The controller may change the first screen information to the second screen information by auto scrolling the page when the first and the second screen information are contained in one page.
- The mobile terminal may further include a content output unit configured to output a content. The controller may change the output content to another content or adjust a volume level of the output audio based on the pattern of the applied pressure.
- The controller may display an indicator indicating at least one of an amount of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, and/or a pressure applied area applied to the squeeze sensor (unit).
- A mobile terminal may include a main body or housing (having a front surface, a rear surface and a lateral surface); a display unit disposed at the front surface of the body, and formed to display an object; and a touch sensor (unit) disposed at a front surface of the display unit to sense a touch input. A squeeze sensor (unit) may be disposed at a lateral surface of the body to sense a pressure above a predetermined value (or activation pressure) being applied thereto. A controller may be configured to adjust a sensitivity of the touch sensor (unit) such that the touch sensor (unit) can sense a touch input applied to a lateral surface of the body when the pressure above the predetermined value is applied to the squeeze sensor (unit), and the controller may change the location of an object displayed on the display unit based on a position at which the touch input is sensed when the touch input is sensed at a lateral surface of the body.
- The controller may restore a sensitivity of the touch sensor unit when a predetermined time has passed after the pressure above a predetermined value is applied to the squeeze sensor (unit) or the pressure above a predetermined value is applied again.
- The controller may determine whether the user's hand (holding the body) is his or her left hand, right hand, or both hands based on a position at which the touch input is sensed at a lateral surface of the body, and the controller may change the location of an object displayed on the display unit based on the determined result.
- The display unit may be formed to display a plurality of objects. The controller may determine the priority of each of the plurality of objects, and may change the location of the plurality of objects based on the priority of each of the objects and the position at which the touch input is sensed at a lateral surface of the body.
- The display unit may be formed to display a lock screen in a lock state that restricts an input of a control command to an application. The controller may release the lock state when the pattern of the touch input sensed at a lateral surface of the body corresponds to a predetermined pattern in a state that the lock screen is displayed.
- When an event occurs, the controller may execute an application associated with the event or terminate the event based on the pattern of the touch input sensed at a lateral surface of the body.
- The display unit may be formed to display first screen information. The controller may change the first screen information to second screen information based on the pattern of the touch input sensed at a lateral surface of the body in a state that the first screen information is displayed.
- The first and the second screen information may be contained in one page, or contained in different first and second pages, respectively. The controller may change the first screen information to the second screen information by auto scrolling the page when the first and the second screen information are contained in one page.
- The mobile terminal may further include a content output unit configured to output a content. The controller may change the output content to another content or adjust a volume level of the output audio based on the pattern of the touch input sensed at a lateral surface of the body.
- The controller may display an indicator indicating sensitivity of the touch sensor (unit) on the display unit.
- A control method of a mobile terminal may include allowing a squeeze sensor (unit) disposed at a lateral surface of the body to sense a pressure above a predetermined value being applied thereto; and changing the location of an object displayed on the display unit based on a pressure applied position when the pressure above a predetermined value is applied to the squeeze sensor (unit).
- A control method of a mobile terminal may also include allowing a squeeze sensor (unit) disposed at a lateral surface of the body to sense a pressure above a predetermined value being applied thereto; adjusting a sensitivity of the touch sensor (unit) such that the touch sensor (unit) can sense a touch input applied to a lateral surface of the body when the pressure above a predetermined value is applied to the squeeze sensor (unit); and changing a location of an object displayed on the display unit based on a position at which the touch input is sensed when the touch input is sensed at a lateral surface of the body.
- Embodiments of the foregoing method may be implemented as processor-readable codes on a program-recorded medium. Examples of the processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and may also include implementation in the form of a carrier wave (for example, transmission via the Internet).
- Any reference in this specification to "one embodiment," "an embodiment," "example embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with any embodiment, it is submitted that it is within the purview of one skilled in the art to affect such feature, structure, or characteristic in connection with other ones of the embodiments.
- Although embodiments have been described with reference to a number of illustrative embodiments thereof, it should be understood that numerous other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, various variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.
Claims (15)
- A mobile terminal(100), comprising:
a housing having a front surface, a rear surface and a lateral surface;
a display(151) at the front surface of the housing to display an object at a first location of a display area;
a sensor(231) at the lateral surface of the housing to sense a pressure being applied thereto; and
a controller(180) to change a location of the object from the first location to a second location of the display area when the sensed pressure is greater than a predetermined value.
- The mobile terminal(100) of claim 1, wherein the second location of the display area corresponds to a position or a location of the sensor(231).
- The mobile terminal(100) of any one of claims 1 and 2, wherein the sensor(231) includes a plurality of squeeze sensors at the lateral surface of the housing, and
the controller(180) determines one of the plurality of squeeze sensors to which the pressure is applied, and the controller(180) determines a pressure applied position based at least in part on the determined result when the sensed pressure greater than the predetermined value is applied to the sensor(231). - The mobile terminal(100) of any one of claims 1, 2 and 3, wherein the controller(180) determines whether a user's hand is the user's left hand, the user's right hand, or both hands based on the pressure applied position, and the controller(180) changes the location of the object displayed on the display area based at least in part on the determined result.
- The mobile terminal(100) of any one of claims 3 and 4, wherein the display(151) to display a plurality of objects, and
the controller(180) determines a priority of each of the plurality of objects, and the controller(180) changes the location of the plurality of objects based at least in part on the priority of each of the plurality of objects and the pressure applied position. - The mobile terminal(100) of claim 1, wherein the controller(180) generates different control commands based on a pattern of the pressure applied when the sensed pressure greater than the predetermined value is applied to the sensor(231), and
the pattern of the applied pressure includes one of an amount of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, or a pressure applied area. - The mobile terminal(100) of claim 6, wherein the display(151) to display a lock screen in a lock state that restricts an input of a control command to an application, and
the controller(180) releases the lock state when the pattern of the pressure applied corresponds to a predetermined pattern while the lock screen is displayed on the display(151). - The mobile terminal(100) of claim 6, wherein when an event occurs, the controller(180) executes an application associated with the event or terminates the event based at least in part on the pattern of the applied pressure.
- The mobile terminal(100) of claim 6, wherein the display(151) to display first screen information, and
the controller(180) changes the first screen information to second screen information based at least in part on the pattern of the pressure applied while the first screen information is displayed on the display(151). - The mobile terminal(100) of claim 9, wherein the first screen information and the second screen information are provided in one page or are contained in different first and second pages, respectively, and
the controller(180) changes the first screen information to the second screen information by an auto scrolling to the page when the first screen information and the second screen information are provided in one page. - The mobile terminal(1 00) of claim 6, further comprising:a content output unit(152) to output a content,wherein the controller(180) changes the output content to another content or adjusts a volume level of output audio based on the pattern of the applied pressure.
- The mobile terminal(100) of any one of claims 1 and 5, wherein the controller(180) displays, on the display(151), an indicator that indicates one of an amount of the applied pressure, a pressure applied frequency, a pressure applied time, a pressure applied position, or a pressure applied area.
- A display method of a mobile terminal(100) that includes a housing, and a display(151) disposed at a front surface of the housing, the method comprising:
sensing a pressure on a sensor(231) at a lateral surface of the housing(S110); and
changing a location of an object displayed at a first location of the display(151) to a second location of the display(151) when the sensed pressure is determined to be greater than a predetermined value, wherein the second location is in a vicinity of a position where the pressure at the lateral surface of the housing was sensed to be greater than the predetermined value(S120).
- The method of claim 13, wherein the second location corresponds to a position or a location of the sensor(231).
- The method of any one of claims 13 and 14, wherein the sensor(231) includes a plurality of squeeze sensors at the lateral surface of the housing, and the method further comprises:
determining one of the plurality of squeeze sensors to which the pressure is applied, and
determining a pressure applied position based at least in part on the determined result when the sensed pressure greater than the predetermined value is applied to the sensor(231).
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120068744A KR101995486B1 (en) | 2012-06-26 | 2012-06-26 | Mobile terminal and control method thereof |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2680099A2 true EP2680099A2 (en) | 2014-01-01 |
EP2680099A3 EP2680099A3 (en) | 2016-07-20 |
EP2680099B1 EP2680099B1 (en) | 2019-08-21 |
Family
ID=47789948
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13000951.7A Active EP2680099B1 (en) | 2012-06-26 | 2013-02-25 | Mobile terminal and control method thereof |
Country Status (4)
Country | Link |
---|---|
US (1) | US9167059B2 (en) |
EP (1) | EP2680099B1 (en) |
KR (1) | KR101995486B1 (en) |
CN (1) | CN103513763B (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2522755A (en) * | 2013-12-05 | 2015-08-05 | Lenovo Singapore Pte Ltd | Contact signature control of device |
US9489502B2 (en) | 2014-02-04 | 2016-11-08 | Lenovo (Singapore) Pte. Ltd. | Biometric authentication display |
US9697342B2 (en) | 2014-02-04 | 2017-07-04 | Lenovo (Singapore) Pte. Ltd. | Biometric authentication stripe |
CN107153504A (en) * | 2017-04-24 | 2017-09-12 | 珠海市魅族科技有限公司 | A kind of control processing method and mobile terminal |
CN108304124A (en) * | 2017-01-11 | 2018-07-20 | 北京小米移动软件有限公司 | Pressure touch inductive terminations, pressure touch inducing method and device, electronic equipment |
CN108744494A (en) * | 2018-05-17 | 2018-11-06 | Oppo广东移动通信有限公司 | game application control method, device, storage medium and electronic equipment |
US10162954B2 (en) | 2014-02-04 | 2018-12-25 | Lenovo (Singapore) Pte. Ltd. | Biometric account card |
Families Citing this family (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11064910B2 (en) | 2010-12-08 | 2021-07-20 | Activbody, Inc. | Physical activity monitoring system |
US9230064B2 (en) | 2012-06-19 | 2016-01-05 | EZ as a Drink Productions, Inc. | Personal wellness device |
US10102345B2 (en) | 2012-06-19 | 2018-10-16 | Activbody, Inc. | Personal wellness management platform |
US10133849B2 (en) | 2012-06-19 | 2018-11-20 | Activbody, Inc. | Merchandizing, socializing, and/or gaming via a personal wellness device and/or a personal wellness platform |
US9086796B2 (en) * | 2013-01-04 | 2015-07-21 | Apple Inc. | Fine-tuning an operation based on tapping |
US9354786B2 (en) | 2013-01-04 | 2016-05-31 | Apple Inc. | Moving a virtual object based on tapping |
US9229476B2 (en) | 2013-05-08 | 2016-01-05 | EZ as a Drink Productions, Inc. | Personal handheld electronic device with a touchscreen on a peripheral surface |
US9262064B2 (en) * | 2013-07-09 | 2016-02-16 | EZ as a Drink Productions, Inc. | Handheld computing platform with integrated pressure sensor and associated methods of use |
CN104915073B (en) * | 2014-03-14 | 2018-06-01 | 敦泰科技有限公司 | Handheld touch device |
WO2015156217A1 (en) * | 2014-04-11 | 2015-10-15 | シャープ株式会社 | Mobile terminal device |
KR102245363B1 (en) | 2014-04-21 | 2021-04-28 | 엘지전자 주식회사 | Display apparatus and controlling method thereof |
US10124246B2 (en) | 2014-04-21 | 2018-11-13 | Activbody, Inc. | Pressure sensitive peripheral devices, and associated methods of use |
KR20160013760A (en) * | 2014-07-28 | 2016-02-05 | 삼성전자주식회사 | Method and device for measuring pressure based on touch input |
US20160132285A1 (en) * | 2014-11-12 | 2016-05-12 | Blackberry Limited | Portable electronic device including touch-sensitive display and method of controlling audio output |
CN104598067B (en) * | 2014-12-24 | 2017-12-29 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN104717358B (en) * | 2015-01-30 | 2018-01-19 | 努比亚技术有限公司 | Mobile terminal control method and device |
KR102304461B1 (en) * | 2015-02-24 | 2021-09-24 | 삼성디스플레이 주식회사 | Foldable display apparatus |
CN104902309A (en) * | 2015-05-26 | 2015-09-09 | 努比亚技术有限公司 | Multimedia file sharing method and device for mobile terminal |
US10372212B2 (en) * | 2015-05-29 | 2019-08-06 | Google Llc | Techniques for simulated physical interaction between users via their mobile computing devices |
CN104978028B (en) * | 2015-06-23 | 2018-09-04 | 广东欧珀移动通信有限公司 | Mobile terminal control method and mobile terminal |
KR101719999B1 (en) | 2015-07-10 | 2017-03-27 | 엘지전자 주식회사 | Mobile terminal |
CN105278752B (en) * | 2015-10-30 | 2018-08-14 | 努比亚技术有限公司 | Touch operation device and method |
CN105824405A (en) * | 2015-11-26 | 2016-08-03 | 维沃移动通信有限公司 | Mobile terminal information display method and device |
JP6470674B2 (en) * | 2015-12-22 | 2019-02-13 | ミネベアミツミ株式会社 | Portable device |
KR101853961B1 (en) * | 2016-04-19 | 2018-06-20 | (주)휴맥스 | Apparatus and method of providing media service |
JP6098986B1 (en) * | 2016-05-12 | 2017-03-22 | 株式会社コンフォートビジョン研究所 | Mobile terminal device |
KR20170129372A (en) * | 2016-05-17 | 2017-11-27 | 삼성전자주식회사 | Electronic device comprising display |
KR102561736B1 (en) * | 2016-06-01 | 2023-08-02 | 삼성전자주식회사 | Method for activiating a function using a fingerprint and electronic device including a touch display supporting the same |
KR102521032B1 (en) * | 2016-06-17 | 2023-04-13 | 삼성전자주식회사 | User input processing method and electronic device performing thereof |
US10067668B2 (en) * | 2016-09-09 | 2018-09-04 | Htc Corporation | Portable electronic device, operating method for the same, and non-transitory computer readable recording medium |
JP6950700B2 (en) * | 2016-09-28 | 2021-10-13 | ソニーグループ株式会社 | Sensing devices and electronic devices |
CN106412301A (en) * | 2016-09-29 | 2017-02-15 | 努比亚技术有限公司 | Screen control system based on proximity sensor and mobile terminal |
CN106354268A (en) * | 2016-09-29 | 2017-01-25 | 努比亚技术有限公司 | Screen control system based on proximity sensor and mobile terminal |
CN106254596A (en) * | 2016-09-29 | 2016-12-21 | 努比亚技术有限公司 | Pinch recognition system based on a proximity sensor, and mobile terminal |
CN106453704B (en) * | 2016-09-29 | 2020-09-01 | 努比亚技术有限公司 | Pinch recognition system based on a proximity sensor, and mobile terminal |
CN106648052A (en) * | 2016-09-29 | 2017-05-10 | 努比亚技术有限公司 | Application triggering system based on proximity detector and mobile terminal |
CN106254597A (en) * | 2016-09-29 | 2016-12-21 | 努比亚技术有限公司 | Grip recognition system based on a proximity sensor |
CN106341502A (en) * | 2016-09-29 | 2017-01-18 | 努比亚技术有限公司 | Proximity sensor-based application triggering system and mobile terminal |
CN106878556A (en) * | 2017-02-15 | 2017-06-20 | 京东方科技集团股份有限公司 | Mobile terminal |
CN107203309A (en) * | 2017-05-16 | 2017-09-26 | 珠海市魅族科技有限公司 | View switching method and device, computer device, and computer-readable storage medium |
KR102363707B1 (en) | 2017-08-03 | 2022-02-17 | 삼성전자주식회사 | An electronic apparatus comprising a force sensor and a method for controlling electronic apparatus thereof |
CN107754308A (en) * | 2017-09-28 | 2018-03-06 | 网易(杭州)网络有限公司 | Information processing method, device, electronic equipment and storage medium |
KR102585873B1 (en) * | 2017-09-29 | 2023-10-11 | 삼성전자주식회사 | Method and apparatus for executing application using a barometer |
KR20190054397A (en) * | 2017-11-13 | 2019-05-22 | 삼성전자주식회사 | Display apparatus and the control method thereof |
US11089446B2 (en) * | 2018-01-11 | 2021-08-10 | Htc Corporation | Portable electronic device, operating method for the same, and non-transitory computer readable recording medium |
KR102519800B1 (en) | 2018-07-17 | 2023-04-10 | 삼성디스플레이 주식회사 | Electronic device |
CN109525688A (en) * | 2018-11-13 | 2019-03-26 | 汪赛 | Mobile phone that can be used either way up, without distinguishing orientation |
US20200220927A1 (en) * | 2019-01-09 | 2020-07-09 | Dahyu Patel | Activity synchronization |
KR102393641B1 (en) * | 2020-06-15 | 2022-05-02 | 엘지전자 주식회사 | Home appliance and control method thereof |
KR20220064162A (en) * | 2020-11-11 | 2022-05-18 | 삼성전자주식회사 | An electronic device including a stretchable display |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7800592B2 (en) * | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
US6625283B1 (en) * | 1999-05-19 | 2003-09-23 | Hisashi Sato | Single hand keypad system |
US20040046739A1 (en) * | 2002-09-11 | 2004-03-11 | Palm, Inc. | Pliable device navigation method and apparatus |
KR100608576B1 (en) * | 2004-11-19 | 2006-08-03 | 삼성전자주식회사 | Apparatus and method for controlling a portable electronic device |
KR101345755B1 (en) * | 2007-09-11 | 2013-12-27 | 삼성전자주식회사 | Apparatus and method for controlling operation in a mobile terminal |
US8502800B1 (en) * | 2007-11-30 | 2013-08-06 | Motion Computing, Inc. | Method for improving sensitivity of capacitive touch sensors in an electronic device |
EP3654141A1 (en) * | 2008-10-06 | 2020-05-20 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying graphical user interface depending on a user's contact pattern |
KR20100039194A (en) * | 2008-10-06 | 2010-04-15 | 삼성전자주식회사 | Method for displaying graphic user interface according to user's touch pattern and apparatus having the same |
KR101564222B1 (en) * | 2009-05-26 | 2015-11-06 | 삼성전자주식회사 | Apparatus and method for unlocking a locking mode of portable terminal |
CN101699369A (en) * | 2009-11-18 | 2010-04-28 | 英华达(南昌)科技有限公司 | Pressure triggering device used for electronic equipment and electronic device comprising same |
KR101624528B1 (en) * | 2009-12-02 | 2016-06-07 | 엘지전자 주식회사 | Mobile terminal and operation control method thereof |
CN101751179B (en) * | 2009-12-16 | 2012-05-23 | 深圳市汇顶科技有限公司 | Method and system for automatically calibrating sensitivity of touch detection, and touch control terminal |
US8347238B2 (en) * | 2009-12-16 | 2013-01-01 | Apple Inc. | Device, method, and graphical user interface for managing user interface content and user interface elements by dynamic snapping of user interface elements to alignment guides |
KR101688134B1 (en) * | 2009-12-16 | 2016-12-20 | 엘지전자 주식회사 | Mobile terminal having a side touch input device and method for executing functions thereof |
CN101799737B (en) * | 2010-01-06 | 2012-04-25 | 华为终端有限公司 | Method and terminal for displaying picture/interface |
US20110210942A1 (en) * | 2010-02-26 | 2011-09-01 | Sanyo Electric Co., Ltd. | Display apparatus and vending machine |
US8452260B2 (en) * | 2010-03-25 | 2013-05-28 | Hewlett-Packard Development Company, L.P. | Methods and apparatus for unlocking an electronic device |
EP2618626B1 (en) * | 2010-09-13 | 2018-05-23 | Taiwan Semiconductor Manufacturing Company, Ltd. | Mobile terminal and method for controlling operation thereof |
- 2012
  - 2012-06-26 KR KR1020120068744A patent/KR101995486B1/en active IP Right Grant
- 2013
  - 2013-01-07 US US13/735,768 patent/US9167059B2/en not_active Expired - Fee Related
  - 2013-02-25 EP EP13000951.7A patent/EP2680099B1/en active Active
  - 2013-03-15 CN CN201310084320.5A patent/CN103513763B/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
None |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2522755A (en) * | 2013-12-05 | 2015-08-05 | Lenovo Singapore Pte Ltd | Contact signature control of device |
GB2522755B (en) * | 2013-12-05 | 2018-04-18 | Lenovo Singapore Pte Ltd | Contact signature control of device |
US9489502B2 (en) | 2014-02-04 | 2016-11-08 | Lenovo (Singapore) Pte. Ltd. | Biometric authentication display |
US9697342B2 (en) | 2014-02-04 | 2017-07-04 | Lenovo (Singapore) Pte. Ltd. | Biometric authentication stripe |
US10162954B2 (en) | 2014-02-04 | 2018-12-25 | Lenovo (Singapore) Pte. Ltd. | Biometric account card |
CN108304124A (en) * | 2017-01-11 | 2018-07-20 | 北京小米移动软件有限公司 | Pressure touch sensing terminal, pressure touch sensing method and device, and electronic equipment |
CN108304124B (en) * | 2017-01-11 | 2020-09-18 | 北京小米移动软件有限公司 | Pressure touch sensing terminal, pressure touch sensing method and device and electronic equipment |
CN107153504A (en) * | 2017-04-24 | 2017-09-12 | 珠海市魅族科技有限公司 | Control processing method and mobile terminal |
CN108744494A (en) * | 2018-05-17 | 2018-11-06 | Oppo广东移动通信有限公司 | Game application control method and device, storage medium, and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
KR20140000932A (en) | 2014-01-06 |
EP2680099B1 (en) | 2019-08-21 |
EP2680099A3 (en) | 2016-07-20 |
KR101995486B1 (en) | 2019-07-02 |
US20130344919A1 (en) | 2013-12-26 |
US9167059B2 (en) | 2015-10-20 |
CN103513763A (en) | 2014-01-15 |
CN103513763B (en) | 2017-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9167059B2 (en) | Mobile terminal and control method thereof | |
US9147395B2 (en) | Mobile terminal and method for recognizing voice thereof | |
EP2665243B1 (en) | Mobile terminal and control method thereof | |
US9116613B2 (en) | Mobile terminal for supporting various input modes and control method thereof | |
US9001151B2 (en) | Mobile terminal for displaying a plurality of images during a video call and control method thereof | |
KR101886753B1 (en) | Mobile terminal and control method thereof | |
EP2731028A2 (en) | Mobile terminal and control method thereof | |
EP2693318B1 (en) | Mobile terminal and control method thereof | |
KR20140051719A (en) | Mobile terminal and control method thereof | |
US20140007013A1 (en) | Mobile terminal and control method thereof | |
EP2709344A2 (en) | Mobile terminal and control method thereof | |
KR101925327B1 (en) | Mobile terminal and control method thereof | |
KR101300260B1 (en) | Mobile terminal and control method thereof | |
KR101917692B1 (en) | Mobile terminal | |
KR20130028573A (en) | Mobile terminal and control method thereof | |
KR101978958B1 (en) | Mobile terminal and control method thereof | |
KR101853857B1 (en) | Mobile terminal and control method thereof | |
KR101260771B1 (en) | Mobile terminal and control method thereof | |
KR20140072752A (en) | Mobile terminal and control method thereof | |
KR20130030690A (en) | Mobile terminal and control method thereof | |
KR20130064420A (en) | Mobile terminal and control method thereof | |
KR20130012848A (en) | Mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 1/16 20060101AFI20160616BHEP |
|
17P | Request for examination filed |
Effective date: 20160902 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20170828 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20190313 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: LG ELECTRONICS INC. |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAJ | Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR1 |
|
GRAL | Information related to payment of fee for publishing/printing deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR3 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
GRAR | Information related to intention to grant a patent recorded |
Free format text: ORIGINAL CODE: EPIDOSNIGR71 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
INTC | Intention to grant announced (deleted) |
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
INTG | Intention to grant announced |
Effective date: 20190712 |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013059355 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1170478 Country of ref document: AT Kind code of ref document: T Effective date: 20190915 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20190821 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT |
Ref country code: PT Effective date: 20191223 |
Ref country code: NO Effective date: 20191121 |
Ref country code: FI Effective date: 20190821 |
Ref country code: HR Effective date: 20190821 |
Ref country code: SE Effective date: 20190821 |
Ref country code: BG Effective date: 20191121 |
Ref country code: NL Effective date: 20190821 |
Ref country code: LT Effective date: 20190821 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT |
Ref country code: AL Effective date: 20190821 |
Ref country code: LV Effective date: 20190821 |
Ref country code: RS Effective date: 20190821 |
Ref country code: IS Effective date: 20191221 |
Ref country code: ES Effective date: 20190821 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1170478 Country of ref document: AT Kind code of ref document: T Effective date: 20190821 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT |
Ref country code: IT Effective date: 20190821 |
Ref country code: RO Effective date: 20190821 |
Ref country code: PL Effective date: 20190821 |
Ref country code: DK Effective date: 20190821 |
Ref country code: AT Effective date: 20190821 |
Ref country code: EE Effective date: 20190821 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20200106 Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT |
Ref country code: CZ Effective date: 20190821 |
Ref country code: SM Effective date: 20190821 |
Ref country code: IS Effective date: 20200224 |
Ref country code: SK Effective date: 20190821 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602013059355 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG2D | Information on lapse in contracting state deleted |
Ref country code: IS |
|
26N | No opposition filed |
Effective date: 20200603 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20200225 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20200229 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200225 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200229 |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200229 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200229 |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200225 |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200225 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20200229 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602013059355 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210901 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190821 |