EP2711825A2 - System for providing a user interface for use by portable and other devices
- Publication number: EP2711825A2 (application EP13185053.9A)
- Authority: European Patent Office (EP)
- Prior art keywords: input, portable terminal, touch screen, finger, controller
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
- G06F3/0443—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a single layer of sensing electrodes
- G06F3/0446—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present disclosure relates to a user interface of a portable or other processing device such as a phone, notebook or computer, for processing commands input by finger, stylus and other devices via a touch screen.
- a portable device typically includes a touch screen for receiving user input.
- a touch screen is used to distinguishably detect a finger input or the like and a stylus input.
- the touch screen typically prioritizes a stylus input so that when there is a stylus input, the touch screen ignores a finger input to prevent malfunction associated with an inadvertent palm touch.
- when a user performs a finger touch while grasping a stylus, the finger touch is sometimes ignored because the recognition distance of the stylus is relatively large; this may be perceived by the user as a malfunction of the touch screen and cause problems.
- a system according to invention principles addresses this deficiency and related problems.
- a user interface system supports processing concurrent finger and stylus input commands in accordance with the intention of the user.
- the system detects a hover input command, detects a finger input command concurrently with the hover input command, calculates a distance between positions of the hover input command and the finger input command, compares the calculated distance with a predetermined threshold and, in response to a result of the comparison, ignores or processes the finger input command.
- a portable terminal includes a machine-readable storage medium including a program executable by a processor for processing a touch input command.
- the portable terminal comprises a touch screen that displays input data and detects a hover input command and a finger input command.
- a controller calculates a distance between positions of the hover input command and the finger input command when the finger input is detected concurrently with the hover input command, compares the calculated distance with a predetermined threshold and ignores or processes the finger input command in response to a result of the comparison.
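The arbitration the controller performs can be sketched as follows. This is a hypothetical illustration of the described behavior (the function and parameter names are assumptions, not taken from the patent): a finger touch that lands close to the hovering stylus is treated as inadvertent (e.g. a palm contact) and ignored, while a distant touch is processed as an intentional command.

```python
# Hypothetical sketch of the claimed input arbitration: when a finger
# touch arrives while a stylus hover is active, ignore the finger input
# only if it lands near the hover position (a likely inadvertent touch).
import math

def arbitrate_finger_input(hover_pos, finger_pos, threshold):
    """Return True if the finger input should be processed, False if ignored.

    hover_pos, finger_pos: (x, y) coordinates on the touch screen.
    threshold: predetermined distance in the same units (e.g. pixels).
    """
    dx = finger_pos[0] - hover_pos[0]
    dy = finger_pos[1] - hover_pos[1]
    distance = math.hypot(dx, dy)
    # Close to the stylus hover -> treat as unintended and ignore;
    # far from the hover -> treat as an intentional concurrent command.
    return distance > threshold
```

With a threshold of, say, 50 pixels, a touch 5 pixels from the hover point would be ignored as a likely palm contact, while a touch on the far side of the screen would be processed.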
- first and second may be used to describe various components
- such components are not limited by the above terms.
- the above terms are used only to distinguish one component from another.
- a first component may be referred to as a second component without departing from the scope of the present invention, and likewise a second component may be referred to as a first component.
- the term "and/or" encompasses a combination of plural items or any one of the plural items.
- a stylus as used herein comprises a pointed instrument used as an input device on a touch screen or pressure-sensitive screen and may comprise a pen, writing instrument, or other hand held pointing instrument.
- FIG. 1 is a block diagram schematically illustrating a portable terminal according to an embodiment of the present invention.
- FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention.
- FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention.
- a portable terminal 100 may be connected with an external device (not shown) by using an external device connector such as a sub communication module 130, a connector 165, and an earphone connecting jack 167.
- the external device includes various devices attached to or detached from the portable terminal 100 through a cable, such as an earphone, an external speaker, a Universal Serial Bus (USB) memory, a charger, a cradle/dock, a DMB (Digital Multimedia Broadcasting) antenna, a mobile payment related device, a health management device (blood sugar tester, for example), a game machine, a car navigation device, for example.
- the external device includes a Bluetooth communication device, a short distance communication device such as a Near Field Communication (NFC) device, a WiFi Direct communication device, and a wireless Access Point (AP) which may be wirelessly connected.
- the external device may include another device, a mobile phone, a smart phone, a tablet PC, a desktop PC, and a server.
- the portable terminal 100 may comprise a smart phone, a mobile phone, a game machine, a TV, a display device, a head unit for a vehicle, a notebook, a laptop, a tablet PC, a Personal Media Player (PMP), a Personal Digital Assistant (PDA) or a watch, for example.
- the portable terminal 100 may be implemented as a pocket size portable mobile terminal having a wireless communication function.
- the portable terminal 100 includes a touch screen 190 and a touch screen controller 195. Further, the portable terminal 100 includes a controller 110, a mobile communication module 120, a sub communication module 130, a multimedia module 140, a camera module 150, a GPS module 155, an input/output module 160, a sensor module 170, a storage unit 175, and a power supplier 180.
- the sub communication module 130 includes at least one of a wireless LAN module 131 and a short distance communication module 132
- the multimedia module 140 includes at least one of a broadcasting communication module 141, an audio reproduction module 142, and a video reproduction module 143.
- the camera module 150 includes at least one of a first camera 151 and a second camera 152.
- the input/output module 160 includes at least one of a button 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, a keypad 166, and an earphone connecting jack 167.
- the controller 110 includes a CPU 111, a ROM 112 storing a control program for controlling the portable terminal 100, and a RAM 113 used as a storage area for storing a signal or data input from the outside of the apparatus 100 or for work performed in the portable terminal 100.
- the CPU 111 may include a single core, a dual core, a triple core, or a quad core, or may comprise another architecture.
- the CPU 111, the ROM 112, and the RAM 113 may be mutually connected to each other through an internal bus.
- the controller 110 controls the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supplier 180, the touch screen 190, and the touch screen controller 195.
- the mobile communication module 120, the sub communication module 130, and the broadcasting communication module 141 of the multimedia module 140 may be collectively called a communication unit, and the communication unit is provided for a direct connection with an external device or a connection through a network and may be a wired or wireless communication unit.
- the communication unit can transmit data to the controller 110, the storage unit 175, and the camera module 150 in a wired manner or wirelessly, or receive data from an external communication line or the air and transmit the data to the controller 110 or store the data in the storage unit 175.
- the mobile communication module 120 enables the portable terminal 100 to be connected with the external device through mobile communication by using one antenna or a plurality of antennas according to a control of the controller 110.
- the mobile communication module 120 transmits or receives a wireless signal for exchanging, unidirectionally transmitting, or receiving data of voice phone communication, video phone communication, a Short Message Service (SMS), or a Multimedia Message Service (MMS) to/from a mobile phone (not shown), a smart phone (not shown), a tablet PC, or another device (not shown) having a phone number input into the apparatus 100.
- the sub communication module 130 may include at least one of the wireless LAN module 131 and the short distance communication module 132.
- the sub communication module 130 may include just the wireless LAN module 131, just the short distance communication module 132, or both the wireless LAN module 131 and the short distance communication module 132.
- the wireless LAN module 131 may be Internet-connected according to a control of the controller 110 in a place where a wireless Access Point (AP) (not shown) is installed.
- the wireless LAN module 131 supports a wireless LAN standard (IEEE802.11x) of the Institute of Electrical and Electronics Engineers.
- the short distance communication module 132 wirelessly performs near field communication between the portable terminal 100 and an image forming apparatus (not shown) in response to control of the controller 110.
- a short distance communication method includes Bluetooth, Infrared Data Association (IrDA) communication, WiFi-Direct communication and Near Field Communication (NFC), for example.
- the portable terminal 100 may include at least one of the mobile communication module 120, the wireless LAN module 131, and the short distance communication module 132.
- the portable terminal 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the short distance communication module 132.
- the multimedia module 140 includes the broadcasting communication module 141, the audio reproduction module 142, or the video reproduction module 143.
- the broadcasting communication module 141 receives a broadcasting signal (for example, a TV broadcasting signal, a radio broadcasting signal, or a data broadcasting signal) and broadcasting supplemental information (for example, Electric Program Guide: EPG or Electric Service Guide: ESG) output from a broadcasting station through a broadcasting communication antenna (not shown) in response to control of the controller 110.
- the audio reproduction module 142 reproduces a digital audio file (for example, a file having a file extension of mp3, wma, ogg, or wav) stored or received in response to control of the controller 110.
- the video reproduction module 143 reproduces a digital video file (for example, a file having a file extension of mpeg, mpg, mp4, avi, mov, or mkv) stored or received in response to a control of the controller 110.
- the video reproduction module 143 may also reproduce a digital audio file.
- the multimedia module 140 may include the audio reproduction module 142 or the video reproduction module 143 without the broadcasting communication module 141. Further, the audio reproduction module 142 or the video reproduction module 143 of the multimedia module 140 may be included in the controller 110.
- the camera module 150 includes at least one of the first camera 151 and the second camera 152 for photographing a still image or a video according to a control of the controller 110. Further, the first camera 151 or the second camera 152 includes an auxiliary light source (for example, a flash (not shown)) providing light required for the photographing.
- the first camera 151 may be disposed on a front surface of the apparatus 100, and the second camera 152 may be disposed on a back surface of the apparatus 100. Alternatively, the first camera 151 and the second camera 152 may be closely located to each other (for example, an interval between the first camera 151 and the second camera 152 is larger than 1 cm and smaller than 8 cm) and acquire a three dimensional still image or a three dimensional video.
- the cameras 151 and 152 include a lens system, an image sensor and a flash source, for example.
- the cameras 151 and 152 convert an optical signal input (or photographed) through the lens system to an image signal and output the converted image signal to the controller 110.
- the user acquires a video or a still image through the cameras 151 and 152.
- the lens system forms an image of a subject by converging a light incident from the outside.
- the lens system includes at least one lens and each lens may be a convex lens and an aspheric lens, for example.
- the lens system has symmetry with respect to an optical axis passing through the center thereof, and the optical axis is defined as a center axis.
- the image sensor detects the optical image formed by the external light incident through the lens system as an electrical image signal.
- the image sensor has a plurality of pixel units placed in an M ⁇ N matrix structure and includes a photodiode and a plurality of transistors.
- the pixel unit accumulates charges generated by the incident light, and a voltage derived from accumulated charges indicates luminance of incident light.
- the image signal output from the image sensor consists of a set of voltages (that is, pixel values) output from the pixel units and the image signal indicates one frame (that is, a still image). Further, a frame comprises M ⁇ N pixels.
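The pixel-matrix description above can be illustrated with a minimal sketch (an assumption-level model for illustration, not the patent's sensor readout path): a frame is simply the M × N arrangement of per-pixel values, where each value is the voltage derived from the charge accumulated at that pixel and indicates its luminance.

```python
# Simplified model (illustration only): an image frame is an M x N grid
# of pixel values; each value is the voltage derived from the charge
# accumulated by that pixel and indicates the luminance of incident light.

def build_frame(pixel_voltages, m, n):
    """Arrange a flat sequence of M*N pixel voltages into an M x N frame."""
    if len(pixel_voltages) != m * n:
        raise ValueError("expected M*N pixel values")
    return [pixel_voltages[row * n:(row + 1) * n] for row in range(m)]
```

For example, six pixel voltages and (M, N) = (2, 3) yield a two-row frame of three pixel values each, i.e. one still image in this toy model.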
- the image sensor includes a Charge-Coupled Device (CCD) image sensor or a Complementary Metal-Oxide Semiconductor (CMOS) image sensor, for example.
- a driver drives the image sensor according to a control of the controller 110.
- the driver drives entire pixels of the image sensor or pixels in an area of interest comprising a subset of the entire pixels in response to a control signal received from the controller 110 and image data output from the pixels is output to the controller 110.
- the controller 110 processes the image input from the cameras 151 and 152 or the image stored in the storage unit 175 as frames and outputs an image frame converted to be suitable for screen characteristics (size, picture quality, resolution, for example) of the touch screen 190.
- the GPS module 155 receives radio waves from a plurality of GPS satellites (not shown) in Earth's orbit and calculates a position of the portable terminal 100 by using the Time of Arrival of the radio waves from the GPS satellites to the portable terminal 100.
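The Time of Arrival principle can be sketched as follows (a simplified, hypothetical illustration: a real GPS fix works in three dimensions and also solves for the receiver's clock bias using four or more satellites). The travel time of each satellite's signal, multiplied by the speed of light, gives a range to that satellite, and the terminal's position is the point consistent with those ranges.

```python
# Illustrative Time-of-Arrival ranging (simplified: a real GPS receiver
# also estimates its own clock bias and solves in three dimensions).

C = 299_792_458.0  # speed of light in m/s

def toa_range(transmit_time_s, arrival_time_s):
    """Range to one satellite: signal travel time times the speed of light."""
    return (arrival_time_s - transmit_time_s) * C

# A signal travelling for roughly 67 ms corresponds to about 20,000 km,
# the approximate altitude of a GPS satellite orbit.
```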
- the input/output module 160 includes at least one of the button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166.
- the input/output module 160 except for the connector 165 is used for receiving a user input or informing the user of information.
- the input/output module 160 is not limited thereto; a mouse, a trackball, a joystick, or a cursor control such as cursor direction keys may be provided for information communication with the controller 110 and for controlling motion of the cursor on the touch screen 190.
- the button 161 may be formed on a front surface 100a, a side surface 100b, or a back surface 100c (FIG. 3) of the portable terminal 100, and may include at least one of a power button 161d, volume buttons 161e having a volume increase button 161f and a volume decrease button 161g, a menu button 161b, a home button 161a, a back button 161c, and a search button.
- the microphone 162 receives a voice or a sound to generate an electrical signal in response to a control of the controller 110.
- the speaker 163 outputs sounds corresponding to various signals (for example, a wireless signal, a broadcasting signal, a digital audio file, a digital video file, taking a picture, for example) of the mobile communication module 120, the sub communication module 130, the multimedia module 140, or the camera module 150 to the outside of the portable terminal 100, in response to a control of the controller 110.
- the speaker 163 outputs a sound (for example, button tone corresponding to phone communication, ringing tone, and a voice of another user) corresponding to a function performed by the portable terminal 100.
- One speaker 163 or a plurality of speakers 163 may be formed on a suitable position or positions of the housing of the portable terminal 100.
- the vibration motor 164 converts an electrical signal to a mechanical vibration in response to control of the controller 110. For example, when the portable terminal 100 in a vibration mode receives voice or video phone communication from another device (not shown), the vibration motor is operated.
- One vibration motor 164 or a plurality of vibration motors 164 may be formed within the housing of the portable terminal 100.
- the vibration motor operates in accordance with a touch action of the user on the touch screen or successive touch motions or a gesture on the touch screen 190.
- the connector 165 may be used as an interface for connecting the apparatus with an external device (not shown) or a power source (not shown).
- the portable terminal 100 transmits or receives data stored in the storage unit 175 of the apparatus 100 to or from an external device (not shown) through a wired cable connected to the connector 165 in response to control of the controller 110.
- the external device may be a docking station, and the data may be an input signal transmitted from an external input device, for example, a mouse or a keyboard.
- the portable terminal 100 receives power from the power source through the wired cable connected to the connector 165 and charges a battery (not shown) using the power source.
- the keypad 166 receives a key input from the user for the control of the portable terminal 100.
- the keypad 166 includes a physical keypad (not shown) formed in the portable terminal 100 or a virtual keypad (not shown) displayed on the display unit 190.
- the physical keypad (not shown) formed in the portable terminal 100 may be excluded depending on a capability or structure of the portable terminal 100.
- An earphone (not shown) is inserted into the earphone connecting jack 167 to be connected with portable terminal 100.
- the sensor module 170 includes at least one sensor for detecting a state (position, direction and motion, for example) of the portable terminal 100.
- the sensor module 170 includes at least one of a proximity sensor for detecting whether a user approaches the portable terminal 100, an illumination sensor (not shown) for detecting an amount of ambient light of the portable terminal 100, a motion/direction sensor for detecting motions of the portable terminal 100 (for example, rotation, acceleration, deceleration, and vibration of the portable terminal 100), and an altimeter for measuring an atmospheric pressure to detect an altitude.
- the motion/direction sensor may include an acceleration sensor, a geo-magnetic sensor (not shown) for detecting a point of the compass by using the Earth's magnetic field, a gravity sensor for detecting a gravity action direction, a gyro sensor, an impact sensor, a GPS and a compass sensor, for example. At least one sensor detects a state, generates a signal corresponding to the detection, and transmits the signal to the controller 110.
- the sensors of the sensor module 170 may be present or omitted from portable terminal 100.
- the storage unit 175 stores a signal or data input/output in response to the operation of the communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, or the touch screen 190.
- the storage unit 175 stores a control program and applications for controlling the portable terminal 100 or the controller 110.
- the term "storage unit" is used to refer to any data storage device such as the storage unit 175, the ROM 112 and the RAM 113 within the controller 110, or a memory card (for example, an SD card or a memory stick) installed in the portable terminal 100.
- the storage unit 175 stores images for providing applications having various functions such as navigation, a video phone call, a game for example and Graphical User Interfaces (GUIs) related to the applications, databases related to a method of providing user information, a document, and the user interface, data, background images (menu screen, standby screen for example) required for driving the portable terminal 100, operating programs, or images acquired by the camera.
- the storage unit 175 is a machine-readable storage medium (readable by a computer, for example), and a machine-readable medium is defined herein as a medium for providing data to a machine to perform a specific function.
- the storage unit 175 includes a non-volatile medium and a volatile medium. Such media are of a type enabling commands transmitted from, or stored by, the media to be detected by a physical device in a machine that reads the commands.
- the machine-readable medium includes at least one of a floppy disk, a flexible disk, a hard disk, a magnetic tape, a Compact Disk Read-Only Memory (CD-ROM), an optical disk, a punch card, a paper tape, a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), and a flash-EPROM, for example.
- the power supply 180 supplies power to a battery or a plurality of batteries (not shown) in the portable terminal 100 under the control of the controller 110.
- the battery or the plurality of batteries (not shown) supply power to the portable terminal 100.
- the power supply 180 provides power input from an external power source (not shown) to the portable terminal 100 through a wired cable connected to the connector 165.
- the power supply 180 supplies power wirelessly input from an external power source to the portable terminal 100 through a wireless charging unit.
- the touch screen 190 provides user interface display images corresponding to various services (for example, phone communication, data transmission, broadcasting, and photography) to the user.
- the touch screen 190 transmits an analog signal, corresponding to at least one touch input to the user interface, to the touch screen controller 195.
- the touch screen 190 receives at least one touch through a touch input means (for example, a finger or a stylus). Further, the touch screen 190 can receive successive touch motions or a gesture as input commands.
- the touch screen 190 transmits an analog (or digital) signal corresponding to the successive motions of the input touch to the touch screen controller 195.
- a stylus 168 may be formed in a lower side surface of the portable terminal 100.
- the stylus 168 may be stored while being inserted into the portable terminal and may be withdrawn and removed from the portable terminal 100 when being used.
- a stylus attachment/detachment switch (not shown), operating in accordance with attachment and detachment of the stylus 168, is located in one area within the portable terminal 100 into which the stylus 168 is inserted and provides a signal corresponding to the attachment and detachment of the stylus 168 to the controller 110.
- a touch is not limited to contact between the touch screen 190 and a touch element (a finger or a stylus) and may include a non-contact input (for example, a case where the physical distance between the touch screen 190 and the touch element is 1 cm or shorter).
- a detection threshold interval of the touch screen 190 may be changed in response to configuration information or structure of the portable terminal 100.
- the touch screen 190 changes an output value in response to an interval between the touch screen 190 and the touch element such that a touch event between the touch screen 190 and the touch element and an input (for example, hovering) event in a non-contact state are distinguishably detected. That is, the touch screen 190 is implemented to process a value (for example, a current value, a voltage value, a capacitance value) detected by the touch event in a different manner than a value detected by the hovering event.
- the touch screen controller 195 converts an analog signal received from the touch screen 190 to a digital signal (for example, (X,Y) coordinates and a detection value) and transmits the converted digital signal to the controller 110.
- the controller 110 controls the touch screen 190 using the digital signal received from the touch screen controller 195.
- the controller 110 allows a short-cut icon displayed on the touch screen 190 to be executed in response to a touch event or a hovering event.
- the touch screen controller 195 is included in the controller 110 or the touch screen 190.
- the touch screen controller 195 calculates a distance between the touch element and the touch screen 190 based on a value output from the touch screen 190, and converts the calculated distance value to a digital signal (for example, a Z coordinate) and provides the converted digital signal to the controller 110.
- the touch screen controller 195 determines whether the user input element (e.g., a stylus) and the touch screen 190 contact each other based on the value output from the touch screen 190, converts the value indicating whether the user input element and the touch screen 190 contact each other to a digital signal, and provides the digital signal to the controller 110.
- the touch screen 190 includes at least two touch screen panels which detect the input by the finger and the input by the stylus, respectively.
- the at least two touch screen panels provide different output values to the touch screen controller 195, and the touch screen controller 195 recognizes and distinguishes the values input from the at least two touch screen panels to determine whether the input from the touch screen 190 is the input by the finger or the stylus.
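The two-panel discrimination described above can be sketched in simplified form. The following Python function is a hypothetical illustration (the names, thresholds and value scales are assumptions, not part of the disclosed embodiment): an EMR panel response identifies a stylus, a capacitive panel response identifies a finger, and the stylus takes priority.

```python
def classify_input(capacitive_value, emr_value,
                   finger_threshold=0.5, stylus_threshold=0.5):
    """Classify a raw reading pair from the two stacked touch screen panels.

    Returns "stylus" when the EMR panel responds (only the resonant stylus
    excites it), "finger" when only the capacitive panel responds, and
    None when neither value reaches its threshold.  Thresholds and value
    scales are hypothetical.
    """
    if emr_value >= stylus_threshold:
        return "stylus"  # stylus input is prioritized over a finger input
    if capacitive_value >= finger_threshold:
        return "finger"  # capacitive response without an EMR response
    return None
```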
- the touch screen 190 in an embodiment has a structure in which a capacitive type touch screen panel and an Electromagnetic Resonance (EMR) type touch screen panel are used in combination.
- the touch screen may include touch keys such as the menu button 161b and the back button 161c and, accordingly, a finger input includes a touch input on a touch key as well as a finger input on the touch screen 190.
- the touch screen 190 is disposed on a center of the front surface 100a of the portable terminal 100.
- the touch screen 190 has a large size to occupy most of the front surface 100a of the portable terminal 100.
- FIG. 2 shows an example where a main home screen is displayed on the touch screen 190 and is a first screen displayed on the touch screen 190 when power of the portable terminal 100 is turned on. Further, when the portable terminal 100 has different home screens of several pages, the main home screen may be a first home screen of the home screens of several pages.
- Short-cut icons 191-1, 191-2, and 191-3 for executing frequently used applications, a main menu icon 191-4, the time, and the weather, for example, may be displayed on the home screen.
- a status bar 192 displays the status of the portable terminal 100 such as a battery charging status, a received signal intensity, and a current time.
- the touch keys such as the home button 161a, the menu button 161b and the back button 161c, for example, may alternatively be implemented as mechanical keys, or as a combination of touch keys and mechanical keys, formed below the touch screen 190. Further, the touch keys may be a part of the touch screen 190.
- the touch screen 190 displays a main home screen. For example, when the home button 161a is pressed in a state where a menu screen or an application screen is displayed on the touch screen 190, the main home screen is displayed on the touch screen 190. That is, when the home button 161a is touched while applications are executed on the touch screen 190, the main home screen shown in FIG. 2 may be displayed on the touch screen 190.
- the home button 161a may be used to display recently used applications or a task manager on the touch screen 190.
- the menu button 161b provides a connection menu which can be used on the touch screen 190.
- the connection menu includes a widget addition menu, a background changing menu, a search menu, an editing menu or an environment setup menu for example.
- the back button 161c can be used for displaying the screen which was executed just before the currently executed screen or for terminating the most recently used application.
- the first camera 151, the illumination sensor 170a, and the proximity sensor 170b may be disposed on edges of the front surface 100a of the portable terminal 100.
- the second camera 152, the flash 153, and the speaker 163 may be disposed on a rear surface 100c of the portable terminal 100.
- the power button 161d and the volume buttons 161e may be disposed on left and right side surfaces of the portable terminal 100, and a terrestrial DMB antenna 141a for broadcasting reception and the earphone connecting jack 167 may be disposed on an upper side surface.
- one or a plurality of microphones 162 may be disposed on upper and lower side surfaces 100b of the portable terminal 100.
- the DMB antenna 141a may be fixed to the portable terminal 100 or may be formed to be detachable from the portable terminal 100.
- An earphone may be inserted into the earphone connecting jack 167.
- the connector 165 is formed in a lower side surface of the portable terminal 100.
- a plurality of electrodes are formed in the connector 165 and may be connected with an external device through a wired cable.
- FIG. 4 is a perspective view separately illustrating main components of the touch screen.
- the touch screen 190 has a configuration in which, in order from top to bottom, a first touch panel 410 for detecting a finger input, a display unit 420 for a screen display, and a second touch panel 430 for detecting a stylus input are stacked close to each other or sequentially stacked with an interval therebetween.
- the display unit 420 has a plurality of pixels and displays an image through the pixels.
- a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), an LED for example may be used for the display unit 420.
- the first touch panel 410 includes a window 411 exposed through a front surface of the portable terminal 100 and a sensor layer 412 for detecting information (position, intensity for example) of a finger input, and the sensor layer 412 is deposited on a separate substrate over the window 411 or directly deposited on the window 411.
- the first touch panel 410 may be constructed to provide the touch keys such as the menu button 161b, the back button 161c for example located below the screen exposed to the user.
- An upper surface of the window 411 is included in at least a part of the front surface of the touch screen 190 exposed to the outside.
- the window 411 may be formed with a transparent insulating material for visible light. Examples of the insulating material may include resin such as polyimide and polyethylene terephthalate or plastic.
- a hard coating layer having high hardness is deposited on the upper surface of the window 411 to prevent a scratch and to improve the hardness and provide an antiglare function.
- the hard coating layer may be formed with a material generated by adding light scattering agents to general hard coating agents.
- the sensor layer 412 includes a sensor for detecting a position when a passive user input means contacts a surface of the window 411 and has preset patterns for the detection.
- the sensor layer 412 may have various patterns such as a linear grid pattern, a diamond pattern for example, and the linear grid pattern is described as an example in the present embodiment.
- the sensor layer 412 may be deposited on a lower surface of the window 411, or its lower surface may be attached to an upper surface of the display unit 420.
- FIG. 5 shows a diagram illustrating an example of a pattern of a sensor layer.
- the sensor layer 412 includes first electrode lines 510 and second electrode lines 520.
- a cross-sectional view shown in a lower part of FIG. 5 illustrates the first electrode lines 510 TX1, TX2, and TX3 and the second electrode lines 520 RX.
- Each of the first electrode lines 510 extends in a first direction (for example, an x axis or horizontal direction), and the first electrode lines 510 are disposed at equal or different intervals in a second direction (for example, a y axis or vertical direction) orthogonally crossing the first direction.
- Each of the second electrode lines 520 extends in the second direction orthogonally crossing the first direction, and the second electrode lines 520 are disposed at equal or different intervals in the first direction.
- An insulating layer 530 is disposed between the first electrode lines 510 and the second electrode lines 520 to electrically insulate the first electrode lines 510 and the second electrode lines 520.
- An insulating dielectric material such as SiO2 for example may be used as a material of the insulating layer 530.
- the sensor layer 412 is formed with a conductive material transparent to visible light, and examples of the conductive material include organic materials containing carbon, such as carbon nanotube (CNT) or graphene.
- the sensor layer 412 may be formed through a process of forming a conductive thin film by a vacuum deposition process and then patterning the conductive thin film by a lithography process.
- Examples of the vacuum deposition process include e-beam evaporation and sputtering.
- a scan signal having a predetermined waveform is applied to the sensor layer 412.
- a detection signal waveform is changed due to capacitance between the sensor layer 412 and the first user input means.
- the controller 110 analyzes the detection signal and detects whether the first user input means contacts the surface of the window 411 and determines a contact position in the grid of first and second electrode lines 510 and 520. For example, when the first user input means contacts the touch screen 190, capacitance of a corresponding sensing point 540 increases.
- the controller 110 detects generation of a finger touch event based on a detection signal having a peak value equal to or larger than a threshold (or a minimum value equal to or smaller than the threshold) and also detects a finger input position.
- the threshold is a value by which a noise and a normal signal can be distinguished.
- the threshold is experimentally set, and may be set to have, for example, a voltage equal to or larger than 0 V or a capacitance value equal to or larger than 0 pF.
- a finger is an example of the first user input means, and the first user input means has no limitation as long as it is a means which provides capacitance between the sensor layer 412 and the first user input means. Such means are collectively called passive or first user input means.
- In order to perform the sensor function, voltages (that is, scan signals) having a predetermined waveform are sequentially applied from the touch screen controller 195 to the first electrode lines 510, and detection signals output from the second electrode lines 520 in response to the scan signals are provided to the touch screen controller 195.
- Points where the first and second electrode lines 510 and 520 cross are the sensing points 540, and the sensing points 540 are disposed in a matrix structure in the present embodiment. That is, a finger input position is determined as one of positions of the sensing points 540.
- capacitance of the sensing points 540 is changed due to the capacitance between the sensor layer 412 and the first user input means. Due to the change in the capacitance, voltage waveforms of the detection signals output from the second electrode lines 520 are changed and an input position and/or an input intensity of the first user input means is detected in response to the detected changed voltage waveforms.
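The sensing-point detection described above (scan signals on the first electrode lines, detection signals on the second electrode lines, a capacitance change at the crossing points) can be sketched as follows. This is a minimal illustration with hypothetical names, units and thresholds, not the disclosed implementation:

```python
def detect_finger_position(readings, baseline, threshold):
    """Locate a finger input on the grid of sensing points 540.

    readings and baseline map a sensing point (tx_index, rx_index) to a
    capacitance-derived value; a touch raises the value at nearby points.
    Returns the sensing point with the largest change at or above the
    threshold, or None when no change reaches the threshold.
    """
    best_point, best_delta = None, threshold
    for point, value in readings.items():
        delta = value - baseline[point]  # capacitance change at this crossing
        if delta >= best_delta:
            best_point, best_delta = point, delta
    return best_point
```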
- FIG. 6 shows a diagram of the second touch panel 430, which comprises a touch panel of the Electromagnetic Resonance (EMR) type and includes first and second loop units 610 and 620.
- the second touch panel 430 is operated by a control of the touch screen controller 195 and outputs detected signals to the touch screen controller 195.
- the first loop unit 610 includes a plurality of first loops 611 and the second loop unit 620 includes a plurality of second loops 621.
- the first loop unit 610 and the second loop unit 620 may be disposed to be orthogonal to each other.
- the first loop unit 610 extends relatively long along the y axis in comparison with the x axis and, accordingly, is used to detect an x axis coordinate of a stylus input position.
- the second loop unit 620 extends relatively long along the x axis in comparison with the y axis and, accordingly, is used to detect a y axis coordinate of a stylus input position.
- Each of the first and second loops 611 and 621 outputs a first signal of a fixed first frequency, received from the touch screen controller 195, in the form of an electromagnetic wave. Further, the first and second loops 611 and 621 detect a second signal of a second frequency output in the form of an electromagnetic wave from a stylus corresponding to an active second user input means, and output the detected second signal to the touch screen controller 195.
- the first frequency and the second frequency may be different from each other.
- the stylus located adjacent to the second touch panel 430 receives a first signal output in a form of an electromagnetic wave from the second touch panel 430 and in response generates a second or third signal in a form of an electromagnetic wave according to operation of a resonance circuit within the stylus.
- the stylus resonance circuit emits the generated second or third signal, which is detected by the first and second loop units 610 and 620.
- When the stylus does not contact the touch screen 190, the stylus outputs a second signal of a fixed second frequency. When the stylus contacts the touch screen 190, the stylus outputs a third signal of a third frequency which changes in response to contact pressure. Alternatively, in one embodiment the stylus outputs a second signal of the fixed second frequency regardless of the contact between the stylus and the touch screen 190, and may output a third signal of the fixed second frequency including data indicating whether the stylus contacts the touch screen 190. Further, the stylus is one embodiment, and other means can be used as a stylus if the means can output a second and/or third signal of the second and/or third frequency in response to an input of the first signal of the first frequency. Such means may be collectively called the second user input means.
- the stylus includes a resonance circuit including a coil for detecting a position of the second touch panel 430 in the EMR type and a condenser.
- FIGs. 7 and 8 show diagrams illustrating detection of a stylus input position where each of the first and second loops 611 and 621 is indicated by one line.
- the second loop 621 (hereinafter, referred to as a Y2 loop) emits a first signal in the form of an electromagnetic wave, and the stylus 10, in response to the first signal, generates and emits a second signal in the form of an electromagnetic wave.
- the first loops 611 (hereinafter, referred to as X1, X2, and X3 loops) sequentially detect the second signal.
- the touch screen controller 195 derives an x axis coordinate of a stylus input position in response to a peak (or minimum) value among the output values provided by the first loops 611 in response to the second signal.
- the touch screen controller 195 derives the x axis coordinate of the stylus position in response to comparison of the peak value with a first threshold and comparison of the minimum value with a second threshold.
- a threshold may be set as a voltage equal to or larger than 0 V or an electrical current value equal to or larger than 0 A.
- the first loop 611 (e.g., an X2 loop) emits a first signal in the form of an electromagnetic wave.
- the stylus 10 generates and emits a second signal in a form of an electromagnetic wave in response to the first signal.
- the second loops 621 (hereinafter, referred to as Y1, Y2, and Y3 loops) sequentially detect the second signal.
- the touch screen controller 195 derives a y axis coordinate of a stylus input position in response to a peak (or minimum) value among the output values provided by the second loops 621 in response to the second signal.
- the touch screen controller 195 derives the y axis coordinate of the stylus position in response to comparison of the peak value with a first threshold and comparison of the minimum value with a second threshold.
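The coordinate derivation described above (for either axis) reduces to finding the loop whose detected second-signal amplitude peaks above a threshold. A minimal sketch, with hypothetical names, units and values:

```python
def derive_coordinate(loop_outputs, loop_positions, threshold):
    """Derive one coordinate of the stylus input position.

    loop_outputs holds the second-signal amplitude detected by each loop
    (X1, X2, X3... for the x axis, or Y1, Y2, Y3... for the y axis), and
    loop_positions holds the axis coordinate of each loop.  Returns the
    coordinate of the peak loop, or None if no output reaches the
    threshold.
    """
    peak = max(range(len(loop_outputs)), key=lambda i: loop_outputs[i])
    if loop_outputs[peak] < threshold:
        return None  # no valid stylus signal detected on this axis
    return loop_positions[peak]
```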
- FIG. 9 is a diagram illustrating a hover input.
- An input recognition distance as used herein comprises a maximum distance between a user input means (stylus or finger) and the touch screen 190 within which the controller 110 or the touch screen controller 195 can detect and output an input coordinate.
- An input recognition distance of the second touch panel 430 is larger than an input recognition distance of the first touch panel 410. Since the first touch panel 410 has a relatively small input recognition distance (that is, an input recognition distance of about 0), finger input detection is limited to contact with the touch screen 190.
- the second touch panel 430 in contrast, detects a stylus hover input and a stylus touch (contact) input.
- In response to a distance between the stylus 10 and the touch screen 190 being larger than 0 and within the input recognition distance, the second touch panel 430 detects and outputs a second signal. In response to the distance between the stylus 10 and the touch screen 190 being 0, the second touch panel 430 detects and outputs a third signal. That is, the second touch panel 430 detects and outputs the second signal in response to a hover operation of a user, and detects and outputs the third signal in response to a touch operation of the user.
- a stylus hover input and a stylus touch input are distinguished by the existence or non-existence of pressure applied on the touch screen 190 by the stylus 10. When the pressure is 0, the second touch panel 430 outputs the second signal. When the pressure is larger than 0, the second touch panel 430 outputs the third signal.
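The hover/touch distinction above can be summarized in a small classifier: contact pressure selects the third signal (touch), while a positive distance within the input recognition distance selects the second signal (hover). The function name, value scales and the 20 mm recognition distance are assumptions for illustration only.

```python
def classify_stylus_event(pressure, distance, recognition_distance=20.0):
    """Classify a stylus event from contact pressure and the stylus-to-screen
    distance, mirroring the second/third-signal distinction of the second
    touch panel 430 (values hypothetical)."""
    if pressure > 0:
        return "touch"  # third signal: stylus presses the touch screen
    if 0 < distance <= recognition_distance:
        return "hover"  # second signal: stylus near but not touching
    return None         # stylus outside the input recognition distance
```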
- When a user makes a finger input using a finger 11 while grasping the stylus 10, the finger input may be ignored. Further, when the user makes a stylus input, a touch of a palm 12 may occur.
- the system distinguishes a finger input and a stylus input and an associated intention of the user. Inputs by parts of the body such as a finger, a palm for example are collectively called a finger input herein.
- FIG. 10 shows a flowchart of a method of processing multiple touch inputs.
- a hover input is detected.
- the controller 110 detects a stylus hover input. That is, the controller 110 detects and recognizes a stylus hover event of a user in response to a detection value of the touch screen 190.
- the second touch panel 430 outputs a first signal of a fixed first frequency in a form of an electromagnetic wave and detects a second signal of a fixed second frequency output in a form of an electromagnetic wave from the stylus.
- the controller 110 detects generation of the stylus hover event in response to the second signal having a peak value equal to or larger than a threshold and also detects a stylus input position and/or intensity.
- a finger input is detected.
- the controller 110 detects the finger input in response to a detection value of the touch screen 190. That is, the controller 110 detects the finger touch (or palm touch) event based on the detection value of the touch screen 190.
- a scan signal is applied to the sensor layer 412 of the first touch panel 410 and the sensor layer 412 outputs a detection signal.
- the controller 110 detects generation of the finger touch event based on the detection signal having a peak value equal to or larger than a threshold and also detects a finger input position and/or intensity.
- controller 110 calculates a distance between a hover input position and a finger input position.
- FIGs. 11A to 11C show diagrams for describing a process of calculating a distance between a hover input position and a finger input position.
- FIG. 11A shows a hover input pattern 1110 and a finger input pattern 1120 detected by the controller 110, and the controller 110 calculates a distance between positions of the hover input pattern 1110 and the finger input pattern 1120.
- the position of each of the patterns 1110 and 1120 is recorded using, for example, a center coordinate and an edge coordinate.
- the center coordinate of each of the patterns 1110 and 1120 indicates the position of that pattern.
- the positions of the patterns 1110 and 1120 are determined by fixed coordinates of the objects such as center coordinates of the objects.
- the finger input pattern 1120 may be generated by a touch between the palm and the touch screen during a process in which the user attempts to perform a stylus input.
- the distance D1 between the positions of the hover input pattern 1110 and the finger input pattern 1120 has a value larger than 30 mm.
- FIG. 11B shows a hover input pattern 1112 and a finger input pattern 1122 detected by the controller 110, and the controller 110 calculates a distance D2 between pattern 1112 and pattern 1122 positions.
- the finger input pattern 1122 is generated by a touch between the palm and the touch screen 190 in a state where the user grasps the stylus, for example.
- the distance D2 between the positions of the hover input pattern 1112 and the finger input pattern 1122 has a value equal to or smaller than 30 mm.
- FIG. 11C shows a hover input pattern 1114 and finger input patterns 1124 and 1126 detected by the controller 110, and the controller 110 calculates a distance D3 between positions of the hover input pattern 1114 and the first finger input pattern 1124 and a distance D4 between pattern 1114 and pattern 1126 positions.
- the finger input patterns 1124 and 1126 may be generated by a touch between two fingers (for example, a thumb and a middle finger) and the touch screen 190 in a state where the user grasps a stylus.
- the distance D3 between the positions of the hover input pattern 1114 and the first finger input pattern 1124 has a value equal to or smaller than 30 mm
- the distance D4 between the positions of the hover input pattern 1114 and the second finger input pattern 1126 generally has a value larger than 30 mm.
- the controller 110 compares the distance D3, which is shorter than the distance D4, with a threshold. The controller 110 processes or ignores the detected first finger input pattern 1124 and second finger input pattern 1126 in response to comparison of D3 and D4 with one or more respective thresholds and in response to a relative comparison of D3 and D4.
- controller 110 compares a calculated distance with a threshold.
- the threshold may be experimentally determined, and may be set as, for example, a value ranging from 20 mm to 40 mm or 30 mm to 50 mm.
- controller 110 ignores the finger input when the calculated distance exceeds the threshold.
- the finger input position may correspond to the short-cut icons 191-1, 191-2, and 191-3, the main menu icon 191-4, the home button 161a, the menu button 161b, the back button 161c, and a menu within an application window, or may be associated with a selection of a position within the touch screen 190.
- the controller 110 performs a program operation corresponding to the finger input position.
- Controller 110 selects objects (application, menu, icon for example), executes objects and selects positions for example.
- the controller 110 does not perform the program operation corresponding to the finger input position but may indicate occurrence of the finger input for the user (for example, through a vibration, a sound, an indicator for example).
- controller 110 processes the finger input when the calculated distance is within the threshold.
- the controller 110 performs the program operation corresponding to the finger input position, by selecting objects, executing objects, selecting positions for example.
- controller 110 uses the previously described method to advantageously derive the intention of the user and processes the finger input in accordance with this intention.
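The overall method of FIG. 10 can be sketched as follows. This is a hypothetical illustration, not the claimed implementation; per the description above, a finger input whose distance from the hover input position is within the threshold is processed, and a finger input whose distance exceeds the threshold is ignored.

```python
import math

def process_finger_inputs(hover_position, finger_positions, threshold_mm=30.0):
    """Decide, for each detected finger input, whether to process or ignore
    it while a stylus hover input is detected at hover_position.

    Positions are (x, y) coordinates in millimetres (hypothetical units).
    """
    decisions = {}
    for position in finger_positions:
        distance = math.dist(hover_position, position)
        # Within the threshold: process the finger input; beyond the
        # threshold: ignore it (per the comparison rule described above).
        decisions[position] = "process" if distance <= threshold_mm else "ignore"
    return decisions
```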
- the embodiments of the present invention can be implemented in software, hardware, or a combination thereof. Any such software may be stored, for example, in a volatile or non-volatile storage device such as a ROM, a memory such as a RAM, a memory chip, a memory device, or a memory IC, or a recordable optical or magnetic medium such as a CD, a DVD, a magnetic disk, or a magnetic tape, regardless of its ability to be erased or its ability to be re-recorded. It can be also appreciated that the memory included in the mobile terminal is one example of machine-readable devices suitable for storing a program including instructions that are executed by a processor device to thereby implement embodiments of the present invention.
- embodiments of the present invention provide a program including codes for implementing a system or method claimed in any claim of the accompanying claims and a machine-readable device for storing such a program. Further, this program may be electronically conveyed through any medium such as a communication signal transferred via a wired or wireless connection, and embodiments of the present invention appropriately include equivalents thereto.
- the portable terminal can receive the program from a program providing apparatus connected to the portable terminal wirelessly or through a wire and store the received program.
- the program providing apparatus may include a memory for storing a program containing instructions for allowing the portable terminal to perform a preset content protecting method and information required for the content protecting method, a communication unit for performing wired or wireless communication with the portable terminal, and a controller for transmitting the corresponding program to the portable terminal in response to a request of the portable terminal or automatically.
Description
- The present disclosure relates to a user interface of a portable or other processing device such as a phone, notebook or computer, for processing input commands entered by a finger, a stylus and other input means via a touch screen.
- A portable device (a mobile terminal such as a phone, notebook, computer, or watch, for example) typically includes a touch screen for receiving user input. A touch screen is used to distinguishably detect a finger input or the like and a stylus input. The touch screen typically prioritizes a stylus input so that when there is a stylus input, the touch screen ignores a finger input to prevent malfunction associated with an inadvertent palm touch. However, where a user performs a finger touch while grasping a stylus, since a recognition distance of a stylus is relatively large, the finger touch is sometimes ignored, which may be recognized as a malfunction of the touch screen and cause problems. A system according to invention principles addresses this deficiency and related problems.
- A user interface system according to invention principles supports processing concurrent finger input and stylus input commands in accordance with an intention of the user. The system detects a hover input command, detects a finger input command concurrently with the hover input command, calculates a distance between the positions of the hover input command and the finger input command, compares the calculated distance with a predetermined threshold, and either ignores or processes the finger input command in response to a result of the comparison.
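The decision logic just summarized can be sketched in a few lines. This is an illustrative reconstruction, not the claimed implementation: the function name, pixel coordinate units, and threshold value are all assumptions.

```python
import math

# Illustrative sketch of the comparison step described above. Coordinates
# are assumed to be screen positions in pixels; THRESHOLD is an assumed,
# device-tuned value, not one taken from the disclosure.
THRESHOLD = 150

def handle_finger_input(hover_pos, finger_pos, threshold=THRESHOLD):
    """Ignore a finger touch that lands within the threshold distance of the
    stylus hover position (likely an inadvertent palm or finger contact);
    otherwise process it as an intentional input."""
    distance = math.hypot(finger_pos[0] - hover_pos[0],
                          finger_pos[1] - hover_pos[1])
    return "ignore" if distance <= threshold else "process"
```

Under these assumed values, a touch at (420, 310) while the stylus hovers at (400, 300) would be ignored as accidental, while a touch at (90, 900), far from the hovering stylus, would be processed as intentional.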
- A portable terminal includes a machine-readable storage medium including a program executable by a processor for processing a touch input command. The portable terminal comprises a touch screen that displays input data and detects a hover input command and a finger input command. A controller calculates a distance between positions of the hover input command and the finger input command when the finger input is detected concurrently with the hover input command, compares the calculated distance with a predetermined threshold and ignores or processes the finger input command in response to a result of the comparison.
- The above and other aspects, features, and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a block diagram schematically illustrating a portable terminal according to invention principles; -
FIG. 2 is a front perspective view of a portable terminal according to invention principles; -
FIG. 3 is a rear perspective view of a portable terminal according to invention principles; -
FIG. 4 is a perspective view separately showing main components of a touch screen according to invention principles; -
FIG. 5 is a diagram illustrating an example of a pattern of a sensor layer according to invention principles; -
FIG. 6 is a diagram for describing a second touch panel according to invention principles; -
FIGs. 7 and 8 are diagrams for describing a method of detecting a stylus input position according to invention principles; -
FIG. 9 is a diagram for describing a hovering input according to invention principles; -
FIG. 10 is a flowchart illustrating a method of processing a multi-touch according to invention principles; and -
FIGs. 11A, 11B and 11C are diagrams for describing a process of calculating a distance between a hovering input position and a finger input position according to invention principles. - The present invention may have various modifications and embodiments and thus will be described with reference to specific embodiments in detail. However, the present invention is not limited to the specific embodiments but should be construed as including all modifications, equivalents, and substitutes within the spirit and scope of the present invention.
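FIG. 9 above concerns a hovering input; the description later explains that the touch screen's output value (a current, voltage, or capacitance value, for example) varies with the interval between the screen and the input element, so a contact touch event and a hovering event can be distinguishably detected. A minimal sketch of that distinction, with entirely assumed threshold values:

```python
# Hypothetical classifier separating a contact touch from a hover event
# using the detected value reported by the panel. Both threshold levels
# are assumptions for illustration, not values from the disclosure.
TOUCH_LEVEL = 50   # strong signal: the input element contacts the screen
HOVER_LEVEL = 10   # weaker signal: the element is near but not touching

def classify_event(detected_value):
    """Map a raw detection value to a touch event, a hover event, or none."""
    if detected_value >= TOUCH_LEVEL:
        return "touch"
    if detected_value >= HOVER_LEVEL:
        return "hover"
    return "none"
```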
- While terms including ordinal numbers, such as "first" and "second," etc., may be used to describe various components, such components are not limited by the above terms. The above terms are used only to distinguish one component from another. For example, a first component may be referred to as a second component without departing from the scope of the present invention, and likewise a second component may be referred to as a first component. The term "and/or" encompasses a combination of plural items or any one of the plural items.
- The terms used herein are merely used to describe specific embodiments, and are not intended to limit the present invention. An expression used in the singular encompasses the expression of the plural, unless it has a clearly different meaning in the context. The terms such as "include" and/or "have" may be construed to denote a certain characteristic, number, step, operation, constituent element, component or a combination thereof, but may not be construed to exclude the existence of or a possibility of addition of one or more other characteristics, numbers, steps, operations, constituent elements, components or combinations thereof.
- Unless defined otherwise, all terms used herein have the same meaning as commonly understood by those of skill in the art. Such terms as those defined in a generally used dictionary are to be interpreted to have the meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted to have ideal or excessively formal meanings unless clearly defined in the present specification. No claim element herein is to be construed under the provisions of 35 U.S.C. 112, sixth paragraph, unless the element is expressly recited using the phrase "means for". A stylus as used herein comprises a pointed instrument used as an input device on a touch screen or pressure-sensitive screen and may comprise a pen, writing instrument, or other hand held pointing instrument.
-
FIG. 1 shows a block diagram schematically illustrating a portable terminal according to an embodiment of the present invention, FIG. 2 is a front perspective view of a portable terminal according to an embodiment of the present invention, and FIG. 3 is a rear perspective view of a portable terminal according to an embodiment of the present invention. - Referring to
FIG. 1, a portable terminal 100 may be connected with an external device (not shown) by using an external device connector such as a sub communication module 130, a connector 165, or an earphone connecting jack 167. The external device includes various devices attached to or detached from the portable terminal 100 through a cable, such as an earphone, an external speaker, a Universal Serial Bus (USB) memory, a charger, a cradle/dock, a DMB (Digital Multimedia Broadcasting) antenna, a mobile payment related device, a health management device (a blood sugar tester, for example), a game machine, or a car navigation device, for example. Further, the external device includes a Bluetooth communication device, a short distance communication device such as a Near Field Communication (NFC) device, a WiFi Direct communication device, and a wireless Access Point (AP), which may be wirelessly connected. In addition, the external device may include another device, a mobile phone, a smart phone, a tablet PC, a desktop PC, and a server. - The
portable terminal 100 may comprise a smart phone, a mobile phone, a game machine, a TV, a display device, a head unit for a vehicle, a notebook, a laptop, a tablet PC, a Personal Media Player (PMP), a Personal Digital Assistant (PDA), or a watch, for example. The portable terminal 100 may be implemented as a pocket-size portable mobile terminal having a wireless communication function. - The
portable terminal 100 includes a touch screen 190 and a touch screen controller 195. Further, the portable terminal 100 includes a controller 110, a mobile communication module 120, a sub communication module 130, a multimedia module 140, a camera module 150, a GPS module 155, an input/output module 160, a sensor module 170, a storage unit 175, and a power supplier 180. The sub communication module 130 includes at least one of a wireless LAN module 131 and a short distance communication module 132, and the multimedia module 140 includes at least one of a broadcasting communication module 141, an audio reproduction module 142, and a video reproduction module 143. The camera module 150 includes at least one of a first camera 151 and a second camera 152. The input/output module 160 includes at least one of a button 161, a microphone 162, a speaker 163, a vibration motor 164, a connector 165, a keypad 166, and an earphone connecting jack 167. - The
controller 110 includes a CPU 111, a ROM 112 storing a control program for controlling the portable terminal 100, and a RAM 113 used as a storage area for storing a signal or data input from the outside of the apparatus 100 or for work performed in the portable terminal 100. The CPU 111 includes a single core, a dual core, a triple core, or a quad core, or comprises another architecture. The CPU 111, the ROM 112, and the RAM 113 may be mutually connected to each other through an internal bus. The controller 110 controls the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supplier 180, the touch screen 190, and the touch screen controller 195. - The
mobile communication module 120, the sub communication module 130, and the broadcasting communication module 141 of the multimedia module 140 may be collectively called a communication unit, and the communication unit is provided for a direct connection with an external device or a connection through a network and may be a wired or wireless communication unit. The communication unit can transmit data to the controller 110, the storage unit 175, and the camera module 150 in a wired manner or wirelessly, or receive data from an external communication line or the air and transmit the data to the controller 110 or store the data in the storage unit 175. - The
mobile communication module 120 enables the portable terminal 100 to be connected with the external device through mobile communication by using one antenna or a plurality of antennas according to a control of the controller 110. The mobile communication module 120 transmits or receives a wireless signal for exchanging, unidirectionally transmitting, or receiving data of voice phone communication, video phone communication, a Short Message Service (SMS), or a Multimedia Message Service (MMS) to/from a mobile phone (not shown), a smart phone (not shown), a tablet PC, or another device (not shown) having a phone number input into the apparatus 100. - The
sub communication module 130 may include at least one of the wireless LAN module 131 and the short distance communication module 132. For example, the sub communication module 130 may include just the wireless LAN module 131, just the short distance communication module 132, or both the wireless LAN module 131 and the short distance communication module 132. - The
wireless LAN module 131 may be Internet-connected according to a control of the controller 110 in a place where a wireless Access Point (AP) (not shown) is installed. The wireless LAN module 131 supports a wireless LAN standard (IEEE 802.11x) of the Institute of Electrical and Electronics Engineers. The short distance communication module 132 wirelessly performs near field communication between the portable terminal 100 and an image forming apparatus (not shown) in response to control of the controller 110. A short distance communication method includes Bluetooth, Infrared Data Association (IrDA) communication, WiFi-Direct communication, and Near Field Communication (NFC), for example. - The
portable terminal 100 may include at least one of the mobile communication module 120, the wireless LAN module 131, and the short distance communication module 132. For example, the portable terminal 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the short distance communication module 132. - The
multimedia module 140 includes the broadcasting communication module 141, the audio reproduction module 142, or the video reproduction module 143. The broadcasting communication module 141 receives a broadcasting signal (for example, a TV broadcasting signal, a radio broadcasting signal, or a data broadcasting signal) and broadcasting supplemental information (for example, an Electric Program Guide (EPG) or an Electric Service Guide (ESG)) output from a broadcasting station through a broadcasting communication antenna (not shown) in response to control of the controller 110. The audio reproduction module 142 reproduces a digital audio file (for example, a file having a file extension of mp3, wma, ogg, or wav) stored or received in response to control of the controller 110. The video reproduction module 143 reproduces a digital video file (for example, a file having a file extension of mpeg, mpg, mp4, avi, mov, or mkv) stored or received in response to a control of the controller 110. The video reproduction module 143 can also reproduce digital audio files. - The
multimedia module 140 may include the audio reproduction module 142 or the video reproduction module 143 without the broadcasting communication module 141. Further, the audio reproduction module 142 or the video reproduction module 143 of the multimedia module 140 may be included in the controller 110. - The
camera module 150 includes at least one of the first camera 151 and the second camera 152 for photographing a still image or a video according to a control of the controller 110. Further, the first camera 151 or the second camera 152 includes an auxiliary light source (for example, a flash (not shown)) providing light required for the photographing. The first camera 151 may be disposed on a front surface of the apparatus 100, and the second camera 152 may be disposed on a back surface of the apparatus 100. Alternatively, the first camera 151 and the second camera 152 may be closely located to each other (for example, an interval between the first camera 151 and the second camera 152 is larger than 1 cm and smaller than 8 cm) and acquire a three dimensional still image or a three dimensional video. - The
cameras 151 and 152 each convert an optical signal input through a lens into an electrical image signal and output it to the controller 110. The user acquires a video or a still image through the cameras 151 and 152. - A driver drives the image sensor according to a control of the
controller 110. The driver drives all pixels of the image sensor, or pixels in an area of interest comprising a subset of the entire pixel array, in response to a control signal received from the controller 110, and the image data output from the pixels is output to the controller 110. - The
controller 110 processes the image input from the cameras 151 and 152, stores it in the storage unit 175 as frames, and outputs an image frame converted to be suitable for screen characteristics (size, picture quality, resolution, for example) of the touch screen 190. - The
GPS module 155 receives radio waves from a plurality of GPS satellites (not shown) in Earth's orbit and calculates a position of the portable terminal 100 by using the Time of Arrival from the GPS satellites to the portable terminal 100. - The input/
output module 160 includes at least one of the button 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166. The input/output module 160, except for the connector 165, is used for receiving a user input or informing the user of information. The input/output module 160 is not limited thereto; a mouse, a trackball, a joystick, or a cursor control such as cursor direction keys may be provided for information communication with the controller 110 and for control of the motion of the cursor on the touch screen 190. - The
button 161 may be formed on a front surface 100a, a side surface 100b, or a back surface 100c (Figure 3) of the portable terminal 100, and may include at least one of a power button 161d, volume buttons 161e having a volume increase button 161f and a volume decrease button 161g, a menu button 161b, a home button 161a, a back button 161c, and a search button. - The
microphone 162 receives a voice or a sound to generate an electrical signal in response to a control of the controller 110. - The
speaker 163 outputs sounds corresponding to various signals (for example, a wireless signal, a broadcasting signal, a digital audio file, a digital video file, or taking a picture) of the mobile communication module 120, the sub communication module 130, the multimedia module 140, or the camera module 150 to the outside of the portable terminal 100, in response to a control of the controller 110. The speaker 163 outputs a sound (for example, a button tone corresponding to phone communication, a ringing tone, or a voice of another user) corresponding to a function performed by the portable terminal 100. One speaker 163 or a plurality of speakers 163 may be formed on a suitable position or positions of the housing of the portable terminal 100. - The
vibration motor 164 converts an electrical signal to a mechanical vibration in response to control of the controller 110. For example, when the portable terminal 100 in a vibration mode receives voice or video phone communication from another device (not shown), the vibration motor is operated. One vibration motor 164 or a plurality of vibration motors 164 may be formed within the housing of the portable terminal 100. The vibration motor operates in accordance with a touch action of the user on the touch screen, or with successive touch motions or a gesture on the touch screen 190. - The
connector 165 may be used as an interface for connecting the apparatus with an external device (not shown) or a power source (not shown). The portable terminal 100 transmits or receives data stored in the storage unit 175 of the apparatus 100 to or from an external device (not shown) through a wired cable connected to the connector 165 in response to control of the controller 110. The external device may be a docking station, and the data may be an input signal transmitted from an external input device, for example, a mouse or a keyboard. The portable terminal 100 receives power from the power source through the wired cable connected to the connector 165 and charges a battery (not shown) using the power source. - The
keypad 166 receives a key input from the user for the control of the portable terminal 100. The keypad 166 includes a physical keypad (not shown) formed in the portable terminal 100 or a virtual keypad (not shown) displayed on the display unit 190. The physical keypad (not shown) formed in the portable terminal 100 may be excluded depending on the capability or structure of the portable terminal 100. - An earphone (not shown) is inserted into the
earphone connecting jack 167 to be connected with the portable terminal 100. - The
sensor module 170 includes at least one sensor for detecting a state (position, direction, and motion, for example) of the portable terminal 100. For example, the sensor module 170 includes at least one of a proximity sensor for detecting whether a user approaches the portable terminal 100, an illumination sensor (not shown) for detecting an amount of ambient light of the portable terminal 100, a motion/direction sensor for detecting motions of the portable terminal 100 (for example, rotation, acceleration, retardation, or vibration of the portable terminal 100), and an altimeter for measuring atmospheric pressure to detect an altitude. Further, the motion/direction sensor may include an acceleration sensor, a geo-magnetic sensor (not shown) for detecting a point of the compass by using the Earth's magnetic field, a gravity sensor for detecting a direction of gravity, a gyro sensor, an impact sensor, a GPS, and a compass sensor, for example. At least one sensor detects a state, generates a signal corresponding to the detection, and transmits the signal to the controller 110. The sensors of the sensor module 170 may be present in or omitted from the portable terminal 100. - The
storage unit 175 stores a signal or data input/output in response to the operation of the communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, or the touch screen 190. The storage unit 175 stores a control program and applications for controlling the portable terminal 100 or the controller 110. The term "storage unit" is used to refer to a random data storage device such as the storage unit 175, the ROM 112 and the RAM 113 within the controller 110, or a memory card (for example, an SD card or a memory stick) installed in the portable terminal 100. - The
storage unit 175 stores images for providing applications having various functions (such as navigation, a video phone call, or a game, for example), Graphical User Interfaces (GUIs) related to the applications, databases related to a method of providing user information, a document, and the user interface, data, background images (a menu screen or a standby screen, for example) required for driving the portable terminal 100, operating programs, or images acquired by the camera. The storage unit 175 is a machine-readable storage medium (readable by a computer, for example), and a machine-readable medium is defined herein as a medium for providing data to a machine to perform a specific function. The storage unit 175 includes a non-volatile medium and a volatile medium. Such media are of a type in which the commands transmitted from, or stored by, the media are detectable by a physical device in a machine reading the commands. - The machine-readable medium includes at least one of a floppy disk, a flexible disk, a hard disk, a magnetic tape, a Compact Disk Read-Only Memory (CD-ROM), an optical disk, a punch card, a paper tape, a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), and a flash-EPROM, for example.
- The
power supplier 180 supplies power to a battery or a plurality of batteries (not shown) in the portable terminal 100 in response to a control of the controller 110. The battery or the plurality of batteries (not shown) supply power to the portable terminal 100. Further, the power supplier 180 provides power input from an external power source (not shown) to the portable terminal 100 through a wired cable connected to the connector 165. In addition, the power supplier 180 supplies power wirelessly input from an external power source to the portable terminal 100 through a wireless charging unit. - The
touch screen 190 provides user interface display images corresponding to various services (for example, phone communication, data transmission, broadcasting, and photography) to the user. The touch screen 190 transmits an analog signal corresponding to at least one touch input to the user interface via the touch screen controller 195. The touch screen 190 receives at least one touch through a touch system (for example, a finger or a stylus). Further, the touch screen 190 can receive successive touch motions or a gesture as input commands. The touch screen 190 transmits an analog (or digital) signal corresponding to the successive motions of the input touch to the touch screen controller 195. - Further, a
stylus 168 may be formed in a lower side surface of the portable terminal 100. The stylus 168 may be stored while being inserted into the portable terminal and may be withdrawn and removed from the portable terminal 100 when being used. In addition, a stylus attachment/detachment switch (not shown) operating in accordance with attachment and detachment of the stylus 168 is located in one area within the portable terminal into which the stylus 168 is inserted and provides a signal corresponding to the attachment and detachment of the stylus 168 to the controller 110. - Furthermore, a touch is not limited to contact between the
touch screen 190 and a touch element (a finger or a stylus) and may include a non-contact input (for example, a case where the physical distance interval between the touch screen 190 and the touch element is 1 cm or shorter). A detection threshold interval of the touch screen 190 may be changed in response to the configuration information or structure of the portable terminal 100. Particularly, the touch screen 190 changes an output value in response to the interval between the touch screen 190 and the touch element such that a touch event between the touch screen 190 and the touch element and an input (for example, hovering) event in a non-contact state are distinguishably detected. That is, the touch screen 190 is implemented to process a value (for example, a current value, a voltage value, or a capacitance value) detected by the touch event in a different manner than a value detected by the hovering event. - The
touch screen controller 195 converts an analog signal received from the touch screen 190 to a digital signal (for example, (X,Y) coordinates and a detection value) and transmits the converted digital signal to the controller 110. The controller 110 controls the touch screen 190 using the digital signal received from the touch screen controller 195. For example, the controller 110 allows a short-cut icon displayed on the touch screen 190 to be executed in response to a touch event or a hovering event. In one embodiment, the touch screen controller 195 is included in the controller 110 or the touch screen 190. - Further, the
touch screen controller 195 calculates the distance between the touch element and the touch screen 190 based on a value output from the touch screen 190, converts the calculated distance value to a digital signal (for example, a Z coordinate), and provides the converted digital signal to the controller 110. - Moreover, the
touch screen controller 195 determines whether the user input element (e.g., a stylus) and the touch screen 190 contact each other based on the value output from the touch screen 190, converts the value indicating whether the user input element and the touch screen 190 contact each other to a digital signal, and provides the digital signal to the controller 110. In addition, in order to distinguishably detect an input by a finger and an input by a stylus, the touch screen 190 includes at least two touch screen panels which detect the input by the finger and the input by the stylus, respectively. The at least two touch screen panels provide different output values to the touch screen controller 195, and the touch screen controller 195 recognizes and distinguishes the values input from the at least two touch screen panels to determine whether the input from the touch screen 190 is the input by the finger or by the stylus. For example, the touch screen 190 in an embodiment has a structure in which one touch screen panel is a capacitive type and another touch screen panel is an Electromagnetic Resonance (EMR) type, used in combination. Further, as described above, the touch screen may include touch keys such as the menu button 161b and the back button 161c, and accordingly, a finger input includes a touch input on a touch key as well as a finger input on the touch screen 190. - Referring to
FIG. 2, the touch screen 190 is disposed on a center of the front surface 100a of the portable terminal 100. The touch screen 190 has a large size to occupy most of the front surface 100a of the portable terminal 100. FIG. 2 shows an example where a main home screen is displayed on the touch screen 190; this is the first screen displayed on the touch screen 190 when power of the portable terminal 100 is turned on. Further, when the portable terminal 100 has different home screens of several pages, the main home screen may be the first home screen of the home screens of several pages. Short-cut icons 191-1, 191-2, and 191-3 are used for executing frequently used applications, and a main menu icon 191-4, time, and weather, for example, may be displayed on the home screen. Further, a status bar 192 displays the status of the portable terminal 100 such as a battery charging status, a received signal intensity, and a current time. - The touch keys such as the
home button 161a, the menu button 161b, and the back button 161c, for example, may alternatively comprise mechanical keys, or a combination of touch keys and mechanical keys may be formed below the touch screen 190. Further, the touch keys may be a part of the touch screen 190. When the home button 161a is selected, the touch screen 190 displays a main home screen. For example, when the home button 161a is pressed in a state where a menu screen or an application screen is displayed on the touch screen 190, the main home screen is displayed on the touch screen 190. That is, when the home button 161a is touched while applications are executed on the touch screen 190, the main home screen shown in FIG. 2 may be displayed on the touch screen 190. In addition, the home button 161a may be used to display recently used applications or a task manager on the touch screen 190. The menu button 161b provides a connection menu which can be used on the touch screen 190. The connection menu includes a widget addition menu, a background changing menu, a search menu, an editing menu, or an environment setup menu, for example. The back button 161c can be used for displaying the screen which was executed just before the currently executed screen or for terminating the most recently used application. - The
first camera 151, the illumination sensor 170a, and the proximity sensor 170b may be disposed on edges of the front surface 100a of the portable terminal 100. The second camera 152, the flash 153, and the speaker 163 may be disposed on a rear surface 100c of the portable terminal 100. For example, the power button 161d and the volume buttons 161e may be disposed on left and right side surfaces of the portable terminal 100, and a terrestrial DMB antenna 141a for broadcasting reception and the earphone connecting jack 167 may be disposed on an upper side surface. Further, one or a plurality of microphones 162 may be disposed on upper and lower side surfaces 100b of the portable terminal 100. The DMB antenna 141a may be fixed to the portable terminal 100 or may be formed to be detachable from the portable terminal 100. An earphone may be inserted into the earphone connecting jack 167. Further, the connector 165 is formed in a lower side surface of the portable terminal 100. A plurality of electrodes are formed in the connector 165 and may be connected with an external device through a wired cable. -
FIG. 4 is a perspective view separately illustrating main components of the touch screen. As illustrated in FIG. 4, the touch screen 190 has a configuration in which, from top to bottom, a first touch panel 410 for detecting a finger input, a display unit 420 for a screen display, and a second touch panel 430 for detecting a stylus input are stacked close to each other or sequentially stacked with an interval therebetween. The display unit 420 has a plurality of pixels and displays an image through the pixels. A Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED), or an LED, for example, may be used for the display unit 420. - The
first touch panel 410 includes a window 411 exposed through a front surface of the portable terminal 100 and a sensor layer 412 for detecting information (position and intensity, for example) of a finger input, and the sensor layer 412 is deposited on a separate substrate over the window 411 or directly deposited on the window 411. The first touch panel 410 may be constructed to provide the touch keys, such as the menu button 161b and the back button 161c, for example, located below the screen exposed to the user. An upper surface of the window 411 is included in at least a part of the front surface of the touch screen 190 exposed to the outside. The window 411 may be formed with an insulating material transparent to visible light. Examples of the insulating material may include a resin such as polyimide or polyethylene terephthalate, or plastic. - A hard coating layer having high hardness is deposited on the upper surface of the window 411 to prevent scratches, to improve the hardness, and to provide an antiglare function. For example, the hard coating layer may be formed with a material generated by adding light scattering agents to general hard coating agents. The sensor layer 412 includes a sensor for detecting a position when a passive user input means contacts a surface of the window 411 and has preset patterns for the detection. The sensor layer 412 may have various patterns such as a linear grid pattern or a diamond pattern, for example, and the linear grid pattern is described as an example in the present embodiment. The sensor layer 412 may be deposited on a lower surface of the window 411, or its lower end (or lower surface) may be attached to an upper end (upper surface) of the display unit 420. -
FIG. 5 shows a diagram illustrating an example of a pattern of a sensor layer. The sensor layer 412 includes first electrode lines 510 and second electrode lines 520. A cross-sectional view shown in a lower part of FIG. 5 illustrates the first electrode lines 510 (TX1, TX2, and TX3) and the second electrode lines 520 (RX). Each of the first electrode lines 510 extends in a first direction (for example, an x axis or a horizontal direction), and the first electrode lines are disposed with an equal interval or different intervals in a second direction (for example, a y axis or a vertical direction) orthogonally crossing the first direction. Each of the second electrode lines 520 extends in the second direction orthogonally crossing the first direction, and the second electrode lines are disposed with an equal interval or different intervals in the first direction. - An insulating
layer 530 is disposed between the first electrode lines 510 and the second electrode lines 520 to electrically insulate them from each other. An insulating dielectric material such as SiO2 may be used for the insulating layer 530. The sensor layer 412 is formed of a conductive material transparent to visible light; examples of the conductive material include carbon-containing organic materials such as carbon nanotubes (CNT) or graphene. The sensor layer 412 may be formed by depositing a conductive thin film through a vacuum deposition process, such as e-beam evaporation or sputtering, and then patterning the thin film through a lithography process. - In order to perform a sensor function, a scan signal having a predetermined waveform is applied to the
sensor layer 412. When a first user input means contacts the surface of the window 411, a detection signal waveform is changed due to capacitance between the sensor layer 412 and the first user input means. The controller 110 analyzes the detection signal, detects whether the first user input means contacts the surface of the window 411, and determines a contact position within the grid of first and second electrode lines 510 and 520. When a finger contacts the touch screen 190, the capacitance of a corresponding sensing point 540 increases. The controller 110 detects generation of a finger touch event based on a detection signal having a peak value equal to or larger than a threshold (or a minimum value equal to or smaller than the threshold) and also detects a finger input position. The threshold is a value by which noise and a normal signal can be distinguished. The threshold is experimentally set, and may be set to, for example, a voltage equal to or larger than 0 V or a capacitance value equal to or larger than 0 pF. A finger is merely one example of the first user input means; any means that provides capacitance with respect to the sensor layer 412 may be used. Such means are collectively called passive or first user input means. - In order to perform the sensor function, voltages (that is, scan signals) having a predetermined waveform from the
touch screen controller 195 are sequentially applied to the first electrode lines 510, and the second electrode lines 520 output detection signals in response to the scan signals, which are provided to the touch screen controller 195. Points where the first and second electrode lines 510 and 520 cross each other form the sensing points 540. When the first user input means contacts the surface of the window 411, the capacitance of nearby sensing points 540 is changed due to the capacitance between the sensor layer 412 and the first user input means. Due to the change in capacitance, the voltage waveforms of the detection signals output from the second electrode lines 520 change, and an input position and/or an input intensity of the first user input means is detected from the changed voltage waveforms. -
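The capacitive detection described above (scan the electrode grid, find the peak detection value, and compare it with the experimentally set threshold) can be sketched as follows. This is an illustrative sketch rather than the patent's firmware; the grid layout, units, and threshold value are assumptions.

```python
# Illustrative sketch: detecting a finger touch event from a grid of
# capacitance readings at the sensing points 540. The threshold separating
# noise from a real touch is an assumed value.

THRESHOLD = 0.5  # experimentally set value distinguishing noise from a touch

def detect_finger_touch(readings):
    """readings: 2-D list of capacitance deltas, one per sensing point 540.

    Returns the (row, col) of the sensing point with the peak value if that
    peak reaches the threshold, otherwise None (no touch event generated).
    """
    peak_pos, peak_val = None, float("-inf")
    for y, row in enumerate(readings):
        for x, value in enumerate(row):
            if value > peak_val:
                peak_pos, peak_val = (y, x), value
    return peak_pos if peak_val >= THRESHOLD else None

grid = [
    [0.01, 0.02, 0.01],
    [0.03, 0.92, 0.04],  # finger over the center sensing point
    [0.02, 0.05, 0.02],
]
print(detect_finger_touch(grid))  # → (1, 1)
print(detect_finger_touch([[0.01, 0.02], [0.03, 0.01]]))  # → None (noise only)
```

The same peak-plus-threshold rule applies whether the detection signal is read as a voltage or as a capacitance value.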
FIG. 6 is a diagram of the second touch panel 430, which is a touch panel of the Electromagnetic Resonance (EMR) type and includes a first loop unit 610 and a second loop unit 620. The second touch panel 430 is operated under control of the touch screen controller 195 and outputs detected signals to the touch screen controller 195. The first loop unit 610 includes a plurality of first loops 611 and the second loop unit 620 includes a plurality of second loops 621. - The
first loop unit 610 and the second loop unit 620 may be disposed orthogonally to each other. The first loop unit 610 extends relatively long in the y-axis direction in comparison with the x-axis direction and, accordingly, is used to detect an x-axis coordinate of a stylus input position. The second loop unit 620 extends relatively long in the x-axis direction in comparison with the y-axis direction and, accordingly, is used to detect a y-axis coordinate of a stylus input position. - Each of the first and
second loops 611 and 621 emits a first signal of a first frequency under control of the touch screen controller 195. Further, the first and second loops 611 and 621 detect a second or third signal output from the stylus and provide corresponding detection signals to the touch screen controller 195. The first frequency and the second frequency may be different from each other. The stylus located adjacent to the second touch panel 430 receives the first signal, output in the form of an electromagnetic wave from the second touch panel 430, and in response generates a second or third signal in the form of an electromagnetic wave through operation of a resonance circuit within the stylus. The stylus resonance circuit emits the generated second or third signal, which is detected by the first and second loops 611 and 621. - When the stylus does not contact the
touch screen 190, the stylus outputs a second signal of a fixed second frequency. When the stylus contacts the touch screen 190, the stylus outputs a third signal of a third frequency which changes in response to contact pressure. Alternatively, in one embodiment the stylus outputs a second signal of the fixed second frequency regardless of contact between the stylus and the touch screen 190, and may output a third signal of the fixed second frequency including data indicating whether the stylus contacts the touch screen 190. Further, the stylus is only one embodiment; other means can be used in place of a stylus as long as they output a second and/or third signal of the second and/or third frequency in response to an input of the first signal of the first frequency. Such means may be collectively called the second user input means. For position detection by the second touch panel 430 in the EMR type, the stylus includes a resonance circuit comprising a coil and a condenser. -
FIGs. 7 and 8 are diagrams illustrating detection of a stylus input position. Referring to FIG. 7, a second loop 621 emits a first signal in the form of an electromagnetic wave, and the stylus 10, in response to the first signal, generates and emits a second signal in the form of an electromagnetic wave. The first loops 611 (hereinafter referred to as the X1, X2, and X3 loops) sequentially detect the second signal. The touch screen controller 195 derives an x-axis coordinate of the stylus position in response to a peak or minimum value among the multiple output values provided by the loops 611 in response to the second signal. Specifically, the controller 195 derives the x-axis coordinate in response to comparison of the peak value with a first threshold and comparison of the minimum value with a second threshold. For example, a threshold may be set as a voltage equal to or larger than 0 V or an electric current value equal to or larger than 0 A. - Referring to
FIG. 8, the first loop 611 (e.g., the X2 loop) emits a first signal in the form of an electromagnetic wave, and the stylus 10 generates and emits a second signal in the form of an electromagnetic wave in response to the first signal. The second loops 621 (hereinafter referred to as the Y1, Y2, and Y3 loops) sequentially detect the second signal. The touch screen controller 195 derives a y-axis coordinate of the stylus input position in response to a peak or minimum value among the multiple output values provided by the loops 621 in response to the second signal. Specifically, the controller 195 derives the y-axis coordinate in response to comparison of the peak value with a first threshold and comparison of the minimum value with a second threshold. -
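The coordinate derivation described for FIGs. 7 and 8 (find the loop with the peak output and compare that peak with a threshold) can be sketched as follows; the loop pitch and threshold here are assumed values, not figures from the patent.

```python
# Illustrative sketch: deriving one axis coordinate of the stylus position
# from the per-loop response amplitudes (X1..X3 or Y1..Y3 loops above).

LOOP_PITCH_MM = 10.0   # assumed spacing between adjacent loops
FIRST_THRESHOLD = 0.2  # minimum peak amplitude for a valid detection

def derive_coordinate(loop_outputs):
    """loop_outputs: amplitudes detected sequentially on each loop of one axis.

    Returns the axis coordinate (mm) of the loop with the peak response, or
    None when the peak does not reach the threshold (no stylus detected).
    """
    peak_index = max(range(len(loop_outputs)), key=lambda i: loop_outputs[i])
    if loop_outputs[peak_index] < FIRST_THRESHOLD:
        return None
    return peak_index * LOOP_PITCH_MM

# Stylus nearest the second loop of the axis (index 1): coordinate 10 mm
print(derive_coordinate([0.3, 0.9, 0.4]))    # → 10.0
print(derive_coordinate([0.01, 0.05, 0.02]))  # → None
```

Running the same routine once on the X-loop outputs and once on the Y-loop outputs yields the (x, y) stylus position.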
FIG. 9 is a diagram illustrating a hover input. An input recognition distance, as used herein, is the maximum distance between a user input means (stylus or finger) and the touch screen 190 within which the controller 110 or the touch screen controller 195 can detect and output an input coordinate. The input recognition distance of the second touch panel 430 is larger than that of the first touch panel 410. Since the first touch panel 410 has a relatively small input recognition distance (that is, approximately 0), finger input detection by the system 100 is limited to contact with the touch screen 190. The second touch panel 430, in contrast, detects both a stylus hover input and a stylus touch (contact) input. - In response to a distance between the
stylus 10 and the touch screen 190 being larger than 0 and within the input recognition distance, the second touch panel 430 detects and outputs a second signal. In response to the distance between the stylus 10 and the touch screen 190 being 0, the second touch panel 430 detects and outputs a third signal. That is, the second touch panel 430 detects and outputs the second signal in response to a hover operation of a user, and detects and outputs the third signal in response to a touch operation of the user. For example, a stylus hover input and a stylus touch input are distinguished by the existence or non-existence of pressure applied to the touch screen 190 by the stylus 10. When the pressure is 0, the second touch panel 430 outputs the second signal; when the pressure is larger than 0, it outputs the third signal. - When a user makes a finger input using a
finger 11 while grasping the stylus 10, the finger input may be ignored. Further, when the user makes a stylus input, a touch by a palm 12 may occur. The system distinguishes between a finger input and a stylus input and determines the associated intention of the user. Inputs by parts of the body, such as a finger or a palm, are collectively called a finger input herein. -
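The distinction drawn above between the second signal (hover, no contact pressure) and the third signal (touch, contact pressure) can be sketched as a small classifier; the recognition distance value and the return labels are assumptions for illustration.

```python
# Illustrative sketch: classifying the stylus state from its distance to the
# touch screen and its contact pressure, per the hover/touch rule above.

RECOGNITION_DISTANCE_MM = 20.0  # assumed input recognition distance

def classify_stylus(distance_mm, pressure):
    if distance_mm > RECOGNITION_DISTANCE_MM:
        return None            # stylus out of range: nothing detected
    if distance_mm > 0 and pressure == 0:
        return "second"        # hover operation: second signal output
    return "third"             # touch operation: third signal output

print(classify_stylus(10.0, 0))   # → second (hovering within range)
print(classify_stylus(0.0, 0.4))  # → third (contact with pressure)
print(classify_stylus(35.0, 0))   # → None (beyond recognition distance)
```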
FIG. 10 shows a flowchart of a method of processing multiple touch inputs. In step S1010, a hover input is detected. When the stylus enters within the input recognition distance of the second touch panel 430 over the surface of the touch screen 190 of the portable terminal 100, the controller 110 detects a stylus hover input. That is, the controller 110 detects and recognizes a stylus hover event of a user in response to a detection value of the touch screen 190. Specifically, the second touch panel 430 outputs a first signal of a fixed first frequency in the form of an electromagnetic wave and detects a second signal of a fixed second frequency output in the form of an electromagnetic wave from the stylus. The controller 110 detects generation of the stylus hover event in response to the second signal having a peak value equal to or larger than a threshold, and also detects a stylus input position and/or intensity. - In step S1020, a finger input is detected. When a user performs a finger touch while the stylus hover input is maintained, the
controller 110 detects the finger input in response to a detection value of the touch screen 190. That is, the controller 110 detects the finger touch (or palm touch) event based on the detection value of the touch screen 190. Specifically, a scan signal is applied to the sensor layer 412 of the first touch panel 410, and the sensor layer 412 outputs a detection signal. The controller 110 detects generation of the finger touch event based on the detection signal having a peak value equal to or larger than a threshold and also detects a finger input position and/or intensity. In step S1030, the controller 110 calculates a distance between the hover input position and the finger input position. -
FIGs. 11A to 11C are diagrams for describing a process of calculating a distance between a hover input position and a finger input position. FIG. 11A shows a hover input pattern 1110 and a finger input pattern 1120 detected by the controller 110, and the controller 110 calculates a distance between the positions of the hover input pattern 1110 and the finger input pattern 1120. The position of each of the patterns 1110 and 1120 may be determined, for example, as a center of the corresponding pattern or as a position of a peak detection value within the pattern. The finger input pattern 1120 may be generated by a touch between the palm and the touch screen during a process in which the user attempts to perform a stylus input. In this example, the distance D1 between the positions of the hover input pattern 1110 and the finger input pattern 1120 has a value larger than 30 mm. -
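The position and distance computation of step S1030 can be sketched as follows, taking each pattern's position as the center of its detected points; the point format and millimeter units are assumptions for illustration.

```python
# Illustrative sketch: compute each input pattern's position as the center of
# its detected points, then take the Euclidean distance between the hover
# position and the finger (or palm) position.
import math

def pattern_center(points):
    """points: iterable of (x, y) sensing coordinates in mm forming a pattern."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def input_distance(hover_points, finger_points):
    (hx, hy) = pattern_center(hover_points)
    (fx, fy) = pattern_center(finger_points)
    return math.hypot(fx - hx, fy - hy)

hover = [(10.0, 10.0)]                               # small pattern under the pen tip
palm = [(20.0, 30.0), (24.0, 34.0), (22.0, 32.0)]    # wider palm contact area
print(round(input_distance(hover, palm), 1))  # → 25.1
```

The resulting distance is what the controller compares with the threshold (around 30 mm in the examples above).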
FIG. 11B shows a hover input pattern 1112 and a finger input pattern 1122 detected by the controller 110, and the controller 110 calculates a distance D2 between the positions of the patterns 1112 and 1122. The finger input pattern 1122 is generated, for example, by a touch between the palm and the touch screen 190 in a state where the user grasps the stylus. In this case, the distance D2 between the positions of the hover input pattern 1112 and the finger input pattern 1122 has a value equal to or smaller than 30 mm. -
FIG. 11C shows a hover input pattern 1114 and first and second finger input patterns 1124 and 1126 detected by the controller 110, and the controller 110 calculates a distance D3 between the positions of the hover input pattern 1114 and the first finger input pattern 1124 and a distance D4 between the positions of the patterns 1114 and 1126. In this example, the finger input patterns 1124 and 1126 are generated by touches between the palm and the touch screen 190 in a state where the user grasps a stylus. In this case, the distance D3 between the positions of the hover input pattern 1114 and the first finger input pattern 1124 has a value equal to or smaller than 30 mm, and the distance D4 between the positions of the hover input pattern 1114 and the second finger input pattern 1126 generally has a value larger than 30 mm. In such a multiple-touch occurrence, the controller 110 compares the shorter distance D3 with a threshold. The controller 110 processes or ignores the detected finger input patterns 1124 and 1126 in response to comparison of D3 and D4 with one or more respective thresholds and in response to a relative comparison of D3 and D4. - In
step S1040, the controller 110 compares the calculated distance with a threshold. The threshold may be experimentally determined and may be set as, for example, a value ranging from 20 mm to 40 mm or from 30 mm to 50 mm. In step S1050, the controller 110 ignores the finger input when the calculated distance exceeds the threshold. For example, the finger input position may correspond to the short-cut icons 191-1, 191-2, and 191-3, the main menu icon 191-4, the home button 161a, the menu button 161b, the back button 161c, or a menu within an application window, or may be associated with a selection of a position within the touch screen 190. When the finger input is effective, the controller 110 performs a program operation corresponding to the finger input position, for example, selecting an object (an application, menu, or icon), executing an object, or selecting a position. When the finger input is ignored, the controller 110 does not perform the program operation corresponding to the finger input position but may indicate the occurrence of the finger input to the user (for example, through a vibration, a sound, or an indicator). - In
step S1060, the controller 110 processes the finger input when the calculated distance is within the threshold. When the finger input is effective, the controller 110 performs the program operation corresponding to the finger input position, for example, by selecting an object, executing an object, or selecting a position. In response to concurrently receiving the finger input and the stylus input, the controller 110 uses the previously described method to derive the intention of the user and processes the finger input in accordance with this intention. - It may be appreciated that the embodiments of the present invention can be implemented in software, hardware, or a combination thereof. Any such software may be stored, for example, in a volatile or non-volatile storage device such as a ROM, a memory such as a RAM, a memory chip, a memory device, or a memory IC, or a recordable optical or magnetic medium such as a CD, a DVD, a magnetic disk, or a magnetic tape, regardless of its ability to be erased or its ability to be re-recorded. It can also be appreciated that the memory included in the mobile terminal is one example of machine-readable devices suitable for storing a program including instructions that are executed by a processor device to thereby implement embodiments of the present invention. Therefore, embodiments of the present invention provide a program including codes for implementing a system or method claimed in any claim of the accompanying claims and a machine-readable device for storing such a program. Further, this program may be electronically conveyed through any medium such as a communication signal transferred via a wired or wireless connection, and embodiments of the present invention appropriately include equivalents thereto.
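The overall flow of FIG. 10 (steps S1030 to S1060), extended to the multiple-touch case of FIG. 11C, can be sketched as follows; the function names and the per-pattern decision rule are illustrative assumptions rather than the patent's implementation.

```python
# Illustrative sketch: for each concurrently detected finger pattern, compute
# its distance to the hover position (S1030), compare with the threshold
# (S1040), and ignore (S1050) or process (S1060) the finger input.
import math

THRESHOLD_MM = 30.0  # example value within the 20-50 mm ranges given above

def handle_finger_inputs(hover_pos, finger_positions):
    """Return a list of (position, decision) pairs, one per finger pattern."""
    results = []
    for pos in finger_positions:
        # S1030: distance between the hover input and this finger input
        d = math.hypot(pos[0] - hover_pos[0], pos[1] - hover_pos[1])
        # S1040-S1060: process when within the threshold, otherwise ignore
        results.append((pos, "processed" if d <= THRESHOLD_MM else "ignored"))
    return results

# FIG. 11C-like case: one nearby pattern (within 30 mm), one farther pattern
decisions = handle_finger_inputs((0.0, 0.0), [(10.0, 15.0), (40.0, 30.0)])
print(decisions)  # → [((10.0, 15.0), 'processed'), ((40.0, 30.0), 'ignored')]
```

An ignored pattern could additionally trigger the user feedback mentioned for step S1050 (vibration, sound, or an indicator).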
- Further, the portable terminal can receive the program from a program providing apparatus connected to the portable terminal wirelessly or through a wire and store the received program. The program providing apparatus may include a memory for storing a program containing instructions for allowing the portable terminal to perform a preset content protecting method and information required for the content protecting method, a communication unit for performing wired or wireless communication with the portable terminal, and a controller for transmitting the corresponding program to the portable terminal in response to a request of the portable terminal or automatically.
- While the present invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the present invention as defined by the appended claims.
Claims (12)
- A method of processing a touch input of a portable terminal, the method characterized by:
detecting a hover input of at least one of a stylus and a portion of a hand;
detecting a finger input concurrently with receiving the hover input;
calculating a distance between positions of the hover input and the finger input;
comparing the calculated distance with a predetermined threshold; and
performing one of (a) ignoring and (b) processing the finger input in response to a result of the comparison.
- The method of claim 1, characterized in that, when the calculated distance exceeds the threshold, the finger input is ignored.
- The method of claim 1 or 2, characterized in that when the calculated distance is within the threshold, the finger input is processed.
- The method of claim 3, characterized in that processing the finger input comprises selecting or executing an object corresponding to the position of the finger input.
- The method of any one of claims 1 to 4, characterized in that the hover input is performed by a stylus spaced apart from a touch screen of the portable terminal.
- A machine-readable storage medium characterized by recording a program for executing the method of one of claims 1 to 5.
- A portable terminal characterized by comprising the machine-readable storage medium of claim 6.
- A portable terminal (100) for processing a touch input, the portable terminal (100) characterized by:
a touch screen (190) that displays input data and detects a hover input concurrent with a finger input; and
a controller (110) that,
calculates a distance between positions of the hover input and the finger input in response to detection of the finger input concurrent with the hover input,
compares the calculated distance with a predetermined threshold, and
one of (a) ignores and (b) processes the finger input in response to a result of the comparison. - The portable terminal of claim 8, characterized in that the controller (110) ignores the finger input when the calculated distance exceeds the threshold.
- The portable terminal of claim 8 or 9, characterized in that the controller (110) processes the finger input when the calculated distance is within the threshold.
- The portable terminal of claim 10, characterized in that the processing of the finger input comprises at least one of, selection and execution, of an object corresponding to the position of the finger input.
- The portable terminal of any one of claims 8 to 11, characterized in that the hover input is performed by a stylus physically separated from the touch screen (190) of the portable terminal (100) and without physical contact with the touch screen (190).
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120106793A KR101987098B1 (en) | 2012-09-25 | 2012-09-25 | Method for processing touch input, machine-readable storage medium and portable terminal |
Publications (3)
Publication Number | Publication Date |
---|---|
EP2711825A2 true EP2711825A2 (en) | 2014-03-26 |
EP2711825A3 EP2711825A3 (en) | 2016-03-16 |
EP2711825B1 EP2711825B1 (en) | 2020-11-11 |
Family
ID=49231288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13185053.9A Active EP2711825B1 (en) | 2012-09-25 | 2013-09-18 | System for providing a user interface for use by portable and other devices |
Country Status (5)
Country | Link |
---|---|
US (1) | US9195357B2 (en) |
EP (1) | EP2711825B1 (en) |
KR (1) | KR101987098B1 (en) |
CN (1) | CN103677561B (en) |
AU (1) | AU2013228012B2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016169151A1 (en) * | 2015-04-21 | 2016-10-27 | 中兴通讯股份有限公司 | Method and apparatus for connection between terminals, and storage medium |
EP3125080A4 (en) * | 2014-03-28 | 2017-03-01 | Panasonic Intellectual Property Management Co., Ltd. | Display device |
US9983695B2 (en) | 2015-03-19 | 2018-05-29 | Lenovo (Singapore)Pte. Ltd. | Apparatus, method, and program product for setting a cursor position |
WO2018140289A1 (en) * | 2017-01-25 | 2018-08-02 | Microsoft Technology Licensing, Llc | Redrawing a user interface based on pen proximity |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103412678B (en) * | 2013-08-12 | 2016-04-06 | 北京京东方光电科技有限公司 | A kind of display device |
KR20150103455A (en) * | 2014-03-03 | 2015-09-11 | 삼성전기주식회사 | Touchscreen apparatus and method for sensing touch input |
CN103941913B (en) * | 2014-03-28 | 2016-10-05 | 上海天马微电子有限公司 | Inductive touch screen, driving detection method thereof and coordinate input device |
CN103926614A (en) * | 2014-04-22 | 2014-07-16 | 上海遥薇(集团)有限公司 | Multifunctional card with video function |
KR102311221B1 (en) * | 2014-04-28 | 2021-10-13 | 삼성전자주식회사 | operating method and electronic device for object |
CN104199587B (en) | 2014-07-22 | 2018-09-07 | 上海天马微电子有限公司 | Inductive touch screen, driving detection method thereof and coordinate input device |
US9626020B2 (en) * | 2014-09-12 | 2017-04-18 | Microsoft Corporation | Handedness detection from touch input |
US9430085B2 (en) * | 2014-09-12 | 2016-08-30 | Microsoft Technology Licensing, Llc | Classification of touch input as being unintended or intended |
KR20160034135A (en) | 2014-09-19 | 2016-03-29 | 삼성전자주식회사 | Device for Handling Touch Input and Method Thereof |
JP6278889B2 (en) * | 2014-12-22 | 2018-02-14 | アルプス電気株式会社 | INPUT DEVICE, ITS CONTROL METHOD, AND PROGRAM |
CN105159539B (en) * | 2015-09-10 | 2018-06-01 | 京东方科技集团股份有限公司 | Touch-control response method, device and the wearable device of wearable device |
JP2017083973A (en) * | 2015-10-23 | 2017-05-18 | 富士通株式会社 | Terminal display device, display control method and display control program |
US10216405B2 (en) * | 2015-10-24 | 2019-02-26 | Microsoft Technology Licensing, Llc | Presenting control interface based on multi-input command |
KR102553493B1 (en) * | 2016-07-01 | 2023-07-10 | 삼성전자주식회사 | Touch sensing device and pen and method for measuring position |
EP3521989B1 (en) * | 2016-09-30 | 2020-11-18 | Toppan Printing Co., Ltd. | Light adjustment apparatus |
CN106476480B (en) * | 2016-10-29 | 2018-09-21 | 合肥职业技术学院 | A kind of intelligent and safe learning pen for promoting parent-offspring to link up |
CN106775003B (en) * | 2016-11-23 | 2020-11-10 | 广州视源电子科技股份有限公司 | Interaction device, color control method and device |
CN110462568A (en) * | 2017-12-14 | 2019-11-15 | 深圳市汇顶科技股份有限公司 | Coordinate determination method, device, electronic equipment and the storage medium of stylus |
CN108762653B (en) * | 2018-04-26 | 2020-10-30 | 北京集创北方科技股份有限公司 | Touch positioning method and device and electronic equipment |
DE102018120760B4 (en) * | 2018-07-12 | 2022-11-17 | Tdk Electronics Ag | Pen-type input and/or output device and method for generating a haptic signal |
US11886656B2 (en) * | 2019-04-10 | 2024-01-30 | Hideep Inc. | Electronic device and control method therefor |
JP6568331B1 (en) * | 2019-04-17 | 2019-08-28 | 京セラ株式会社 | Electronic device, control method, and program |
CN115495055B (en) * | 2022-11-03 | 2023-09-08 | 杭州实在智能科技有限公司 | RPA element matching method and system based on interface region identification technology |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7847789B2 (en) * | 2004-11-23 | 2010-12-07 | Microsoft Corporation | Reducing accidental touch-sensitive device activation |
US8446374B2 (en) * | 2008-11-12 | 2013-05-21 | Apple Inc. | Detecting a palm touch on a surface |
US8797280B2 (en) * | 2010-05-26 | 2014-08-05 | Atmel Corporation | Systems and methods for improved touch screen response |
US9244545B2 (en) * | 2010-12-17 | 2016-01-26 | Microsoft Technology Licensing, Llc | Touch and stylus discrimination and rejection for contact sensitive computing devices |
-
2012
- 2012-09-25 KR KR1020120106793A patent/KR101987098B1/en active IP Right Grant
-
2013
- 2013-09-12 AU AU2013228012A patent/AU2013228012B2/en not_active Ceased
- 2013-09-13 US US14/026,404 patent/US9195357B2/en active Active
- 2013-09-18 EP EP13185053.9A patent/EP2711825B1/en active Active
- 2013-09-25 CN CN201310446383.0A patent/CN103677561B/en active Active
Non-Patent Citations (1)
Title |
---|
None |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3125080A4 (en) * | 2014-03-28 | 2017-03-01 | Panasonic Intellectual Property Management Co., Ltd. | Display device |
US9772713B2 (en) | 2014-03-28 | 2017-09-26 | Panasonic Intellectual Property Management Co., Ltd. | Display device |
US9983695B2 (en) | 2015-03-19 | 2018-05-29 | Lenovo (Singapore)Pte. Ltd. | Apparatus, method, and program product for setting a cursor position |
EP3070582B1 (en) * | 2015-03-19 | 2019-06-05 | Lenovo (Singapore) Pte. Ltd. | Apparatus, method, and program product for setting a cursor position |
WO2016169151A1 (en) * | 2015-04-21 | 2016-10-27 | 中兴通讯股份有限公司 | Method and apparatus for connection between terminals, and storage medium |
WO2018140289A1 (en) * | 2017-01-25 | 2018-08-02 | Microsoft Technology Licensing, Llc | Redrawing a user interface based on pen proximity |
US10496190B2 (en) | 2017-01-25 | 2019-12-03 | Microsoft Technology Licensing, Llc | Redrawing a user interface based on pen proximity |
Also Published As
Publication number | Publication date |
---|---|
US9195357B2 (en) | 2015-11-24 |
EP2711825A3 (en) | 2016-03-16 |
AU2013228012A1 (en) | 2014-04-10 |
US20140085259A1 (en) | 2014-03-27 |
CN103677561B (en) | 2018-07-31 |
KR20140039924A (en) | 2014-04-02 |
KR101987098B1 (en) | 2019-09-30 |
AU2013228012B2 (en) | 2018-10-04 |
EP2711825B1 (en) | 2020-11-11 |
CN103677561A (en) | 2014-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2013228012B2 (en) | System for providing a user interface for use by portable and other devices | |
US10401964B2 (en) | Mobile terminal and method for controlling haptic feedback | |
US9977529B2 (en) | Method for switching digitizer mode | |
US10162512B2 (en) | Mobile terminal and method for detecting a gesture to control functions | |
KR102129374B1 (en) | Method for providing user interface, machine-readable storage medium and portable terminal | |
US10387014B2 (en) | Mobile terminal for controlling icons displayed on touch screen and method therefor | |
AU2014200250B2 (en) | Method for providing haptic effect in portable terminal, machine-readable storage medium, and portable terminal | |
US20140210758A1 (en) | Mobile terminal for generating haptic pattern and method therefor | |
US20140160045A1 (en) | Terminal and method for providing user interface using a pen | |
US20140285453A1 (en) | Portable terminal and method for providing haptic effect | |
US20140317499A1 (en) | Apparatus and method for controlling locking and unlocking of portable terminal | |
US9658762B2 (en) | Mobile terminal and method for controlling display of object on touch screen | |
US20140340336A1 (en) | Portable terminal and method for controlling touch screen and system thereof | |
KR20140143052A (en) | Input device having multi-level driver and user device including the same | |
US10114496B2 (en) | Apparatus for measuring coordinates and control method thereof | |
US20140348334A1 (en) | Portable terminal and method for detecting earphone connection | |
US20150002420A1 (en) | Mobile terminal and method for controlling screen | |
US10101830B2 (en) | Electronic device and method for controlling operation according to floating input | |
KR20140092106A (en) | Apparatus and method for processing user input on touch screen and machine-readable storage medium | |
KR102129319B1 (en) | Method for processing touch input, machine-readable storage medium and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
AK | Designated contracting states |
Kind code of ref document: A3 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/0488 20130101AFI20160210BHEP Ipc: G06F 3/0354 20130101ALI20160210BHEP |
|
17P | Request for examination filed |
Effective date: 20160909 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20180115 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/0488 20130101AFI20200603BHEP Ipc: G06F 3/044 20060101ALI20200603BHEP Ipc: G06F 3/0354 20130101ALI20200603BHEP |
|
INTG | Intention to grant announced |
Effective date: 20200618 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1334126 Country of ref document: AT Kind code of ref document: T Effective date: 20201115 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013073944 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1334126 Country of ref document: AT Kind code of ref document: T Effective date: 20201111 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210212
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210211
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210311
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210311
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210211
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602013073944 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20210812 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20210824 Year of fee payment: 9 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MM Effective date: 20211001 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20210930 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210311
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211001 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210918
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210918
Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210930
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210930
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210930
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20210930
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20220918 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20130918 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220918 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20201111 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240822 Year of fee payment: 12 |