CN101529368B - Methods for determining a cursor position from a finger contact with a touch screen display - Google Patents


Info

Publication number
CN101529368B
Authority
CN
China
Prior art keywords
user interface
first position
interface object
number
touch
Legal status
Active
Application number
CN2007800405082A
Other languages
Chinese (zh)
Other versions
CN101529368A
Inventor
B·奥丁
S·福斯塔
G·克里斯蒂
S·O·勒梅
I·乔德里
Current Assignee
Apple Inc
Original Assignee
Apple Inc
Priority claimed from US 11/850,015 (US7843427B2)
Application filed by Apple Inc
Publication of CN101529368A
Application granted
Publication of CN101529368B

Abstract

A portable device with a touch screen display detects a contact area of a finger with the touch screen display and then determines a first position associated with the contact area. The cursor position of the finger contact is determined, at least in part, based on: the first position; one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects. If the cursor position falls into the hidden hit region of a virtual push button on the touch screen display, the portable device is activated to perform operations associated with the virtual push button.
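To make the hidden-hit-region behavior described in the abstract concrete, here is a minimal Python sketch of one way such a check could look. The rectangular hit region, the extra margin, and all names are illustrative assumptions, not the implementation claimed in the patent.

```python
from dataclasses import dataclass

@dataclass
class VirtualButton:
    # Visible bounds of the button on the touch screen (illustrative fields).
    x: float
    y: float
    width: float
    height: float
    hit_margin: float  # extra, invisible border that also activates the button

    def in_hidden_hit_region(self, cursor_x: float, cursor_y: float) -> bool:
        """Return True if the cursor falls inside the button's visible area
        plus its hidden hit region (the enlarged, invisible border)."""
        return (self.x - self.hit_margin <= cursor_x <= self.x + self.width + self.hit_margin
                and self.y - self.hit_margin <= cursor_y <= self.y + self.height + self.hit_margin)

# Example: a 40x20 button whose hit region extends 10 units beyond its visible edge.
button = VirtualButton(x=100, y=200, width=40, height=20, hit_margin=10)
if button.in_hidden_hit_region(95.0, 215.0):
    print("cursor is inside the hidden hit region -> perform the button's operation")
```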

Description

Methods for determining a cursor position from a finger contact with a touch screen display
Technical field
The disclosed embodiments relate generally to portable electronic devices, and more particularly, to portable devices that adaptively determine a cursor position from a finger contact with a touch screen display and then perform operations in accordance with the cursor position.
Background
As portable electronic devices become more compact and the number of functions performed by a given device increases, it has become a significant challenge to design a user interface that allows users to easily interact with a multifunction device. This challenge is particularly significant for handheld portable devices, which have much smaller screens than desktop or laptop computers. This situation is unfortunate because the user interface is the gateway through which users receive not only content but also responses to user actions or behaviors, including user attempts to access a device's features, tools, and functions. Some portable communication devices (for example, mobile telephones, sometimes called mobile phones, cell phones, and the like) have resorted to adding more pushbuttons, increasing the density of pushbuttons, overloading the functions of pushbuttons, or using complex menu systems to allow a user to access, store, and manipulate data. These conventional user interfaces often result in complicated key sequences and menu hierarchies that the user must memorize.
Many conventional user interfaces, such as those that include physical pushbuttons, are also inflexible. This is unfortunate because it may prevent the user interface from being configured and/or adapted by an application running on the portable device or by the user. When coupled with the time-consuming requirement to memorize multiple key sequences and menu hierarchies, and the difficulty in activating a desired pushbutton, such inflexibility is frustrating to most users.
To improve usability, some portable electronic devices use a touch screen to render virtual pushbuttons such as soft keyboards and dial pads. From the contact between a user's finger and a virtual pushbutton, the portable device determines the service or services requested by the user and acts accordingly. However, because different users often have different fingerprint shapes, it is a challenge for these portable devices to accurately and adaptively recognize the virtual pushbutton a user intends, given the different finger contact shapes and the different contexts associated with the different services supported by the portable device.
Accordingly, there is a need for portable multifunction devices that are configured to adaptively determine a cursor position from a finger contact with the touch screen and then perform operations in accordance with the cursor position. Such a configuration increases the effectiveness, efficiency, and user satisfaction with portable multifunction devices.
Summary of the invention
The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the portable multifunction device disclosed herein. In some embodiments, the device has a touch-sensitive display (also known as a "touch screen") with a graphical user interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display. In some embodiments, the functions may include telephoning, video conferencing, e-mailing, instant messaging, blogging, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be included in a computer readable storage medium or other computer program product configured for execution by one or more processors.
One aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display. The portable device detects a contact area of a finger with the touch screen display and then determines a first position associated with the contact area. The cursor position of the finger contact is determined, at least in part, based on: the first position; one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects.
Another aspect of the invention involves a graphical user interface on a portable multifunction device with a touch screen display. The graphical user interface includes a cursor and a plurality of user interface objects. The position of the cursor is determined, at least in part, based on: a first position associated with a contact area of a finger with the touch screen display; one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects.
Another aspect of the invention involves a portable electronic device with a touch screen display that has a plurality of user interface objects. The device includes one or more processors, memory, and a program stored in the memory and configured to be executed by the one or more processors. The program includes: instructions for detecting a contact area of a finger with the touch screen display; instructions for determining a first position associated with the contact area; and instructions for determining a cursor position, at least in part, based on: the first position; one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects.
Another aspect of the invention involves a computer program product that includes a computer readable storage medium and a computer program mechanism (e.g., one or more computer programs) embedded therein. The computer program mechanism includes instructions, which when executed by a portable electronic device, cause the device to: detect a contact area of a finger with the touch screen display; determine a first position associated with the contact area; and determine a cursor position, at least in part, based on: the first position; one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects.
Another aspect of the invention involves a portable electronic device with a touch screen display. The device includes: means for detecting a contact area of a finger with the touch screen display; means for determining a first position associated with the contact area; and means for determining a cursor position, at least in part, based on: the first position; one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects.
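As a rough illustration of the kind of determination described in the aspects above, the following Python sketch offsets the first position toward nearby user interface objects, weighting each object by its activation susceptibility number and by an inverse-square distance term. The specific weighting, the object representation, and all names are assumptions made for this sketch; the patent leaves the exact formula to the detailed embodiments.

```python
import math
from typing import List, Tuple

def determine_cursor_position(
    first_position: Tuple[float, float],
    objects: List[Tuple[Tuple[float, float], float]],  # (object center, activation susceptibility number)
) -> Tuple[float, float]:
    """Offset the first position toward nearby user interface objects.

    Each object pulls the cursor toward itself with a weight proportional to its
    activation susceptibility number and inversely proportional to the square of
    its distance from the first position (an illustrative choice of weighting).
    """
    px, py = first_position
    dx_total, dy_total = 0.0, 0.0
    for (ox, oy), susceptibility in objects:
        dx, dy = ox - px, oy - py
        distance = math.hypot(dx, dy)
        if distance < 1e-6:
            continue  # cursor already on the object; no offset needed
        weight = susceptibility / (distance ** 2)
        dx_total += weight * dx
        dy_total += weight * dy
    return px + dx_total, py + dy_total

# Example: two soft keys near the detected first position; the more "susceptible"
# key pulls the cursor further toward itself.
print(determine_cursor_position((10.0, 10.0), [((14.0, 10.0), 8.0), ((10.0, 16.0), 2.0)]))
```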
Description of drawings
For a better understanding of the aforementioned embodiments of the invention, as well as additional embodiments thereof, reference should be made to the description of embodiments below, in conjunction with the accompanying drawings, in which like reference numerals refer to corresponding parts throughout the figures.
Figures 1A and 1B are block diagrams illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.
Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments.
Figures 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments.
Figure 5 is a flow diagram illustrating a process for determining a cursor position from a finger contact with a touch screen in accordance with some embodiments.
Figures 6A-6L illustrate exemplary methods for determining a cursor position on a touch screen display in accordance with some embodiments.
Figures 6M-6O illustrate an exemplary method for dynamically adjusting the susceptibility numbers associated with soft keyboard keys as a word is typed with the soft keyboard keys in accordance with some embodiments.
Description of embodiments
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Embodiments of a portable multifunction device, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions.
The user interface may include a physical click wheel in addition to a touch screen, or a virtual click wheel displayed on the touch screen. A click wheel is a user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel. Alternatively, breaking contact with a click wheel image on a touch screen surface may indicate a user command corresponding to selection. For simplicity, in the discussion that follows, a portable multifunction device that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that some of the user interfaces and associated processes may be applied to other devices, such as personal computers and laptop computers, which may include one or more other physical user-interface devices, such as a physical click wheel, a physical keyboard, a mouse, and/or a joystick.
The device supports a variety of applications, such as one or more of the following: a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen. One or more functions of the touch screen, as well as corresponding information displayed on the device, may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture of the device (such as the touch screen) may support the variety of applications with user interfaces that are intuitive and transparent.
The user interfaces may include one or more soft keyboard embodiments. The soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. Patent Application 11/459,606, "Keyboard For Portable Electronic Devices," filed July 24, 2006, and U.S. Patent Application 11/459,615, "Touch Screen Keyboards For Portable Electronic Devices," filed July 24, 2006, the contents of which are hereby incorporated by reference. The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that of a typewriter. This may make it easier for a user to select one or more icons in the keyboard, and thus one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, the displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user, for example based on the respective user's word usage history (lexicography, slang, individual usage). Some of the keyboard embodiments may be adjusted to reduce the probability of a user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments.
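In the spirit of the adaptive keyboard embodiments just described (and of the dynamic susceptibility adjustment illustrated later in Figures 6M-6O), the sketch below shows one hypothetical way susceptibility numbers for letter keys could be boosted as a word is typed. The lexicon lookup, the boost values, and the function name are assumptions for illustration only.

```python
from typing import Dict, List

def adjust_key_susceptibilities(
    typed_prefix: str,
    lexicon: List[str],
    base_susceptibility: float = 1.0,
    boost: float = 3.0,
) -> Dict[str, float]:
    """Return a susceptibility number for each letter key given what has
    already been typed: letters that continue some word in the lexicon get a
    boosted number, so their keys become easier to hit (illustrative values)."""
    likely_next = {
        word[len(typed_prefix)]
        for word in lexicon
        if word.startswith(typed_prefix) and len(word) > len(typed_prefix)
    }
    return {
        letter: boost if letter in likely_next else base_susceptibility
        for letter in "abcdefghijklmnopqrstuvwxyz"
    }

# Example: after typing "th", keys like "e" (the) and "a" (that) are boosted.
susceptibilities = adjust_key_susceptibilities("th", ["the", "that", "this", "dog"])
print(susceptibilities["e"], susceptibilities["q"])  # 3.0 1.0
```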
Attention is now directed towards embodiments of the device. Figures 1A and 1B are block diagrams illustrating a portable multifunction device 100 with a touch-sensitive display 112 in accordance with some embodiments. The touch-sensitive display 112 is sometimes called a "touch screen" for convenience, and may also be known as or called a touch-sensitive display system. The device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPUs) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.
It should be appreciated that the device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in Figures 1A and 1B may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.
The peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
In some embodiments, the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as chip 104. In some other embodiments, they may be implemented on separate chips.
The RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate by wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and with other devices. The wireless communication may use any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (e.g., 212, Figure 2). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
The I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, the input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, Figure 2) may include an up/down button for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button (e.g., 206, Figure 2). A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. Patent Application 11/322,549, "Unlocking a Device by Performing Gestures on an Unlock Image," filed December 23, 2005, which is hereby incorporated by reference. A longer press of the push button (e.g., 206) may turn power to the device 100 on or off. The user may be able to customize the functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
The touch-sensitive touch screen 112 provides an input interface and an output interface between the device and the user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user interface objects, further details of which are described below.
The touch screen 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and convert the detected contact into interaction with user interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on the touch screen 112. In an exemplary embodiment, the point of contact between the touch screen 112 and the user corresponds to a finger of the user.
The touch screen 112 may use LCD (liquid crystal display) technology or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch screen 112.
A touch-sensitive display in some embodiments of the touch screen 112 may be analogous to the multi-touch sensitive tablets described in the following U.S. Patents: 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or in U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference. However, the touch screen 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output.
A touch-sensitive display in some embodiments of the touch screen 112 may be as described in the following applications: (1) U.S. Patent Application No. 11/381,313, "Multipoint Touch Surface Controller," filed May 2, 2006; (2) U.S. Patent Application No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (3) U.S. Patent Application No. 10/903,964, "Gestures For Touch Sensitive Input Devices," filed July 30, 2004; (4) U.S. Patent Application No. 11/048,264, "Gestures For Touch Sensitive Input Devices," filed January 31, 2005; (5) U.S. Patent Application No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed January 18, 2005; (6) U.S. Patent Application No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User Interface," filed September 16, 2005; (7) U.S. Patent Application No. 11/228,700, "Operation Of A Computer With A Touch Screen Interface," filed September 16, 2005; (8) U.S. Patent Application No. 11/228,737, "Activating Virtual Keys Of A Touch-Screen Virtual Keyboard," filed September 16, 2005; and (9) U.S. Patent Application No. 11/367,749, "Multi-Functional Hand-Held Device," filed March 3, 2006. All of these applications are incorporated by reference herein.
The touch screen 112 may have a resolution in excess of 100 dpi. In one exemplary embodiment, the touch screen 112 has a resolution of approximately 160 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen 112. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or a command for performing the actions desired by the user.
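One plausible way to translate such a rough, finger-sized contact into a single point, assuming the touch sensor reports per-cell signal strengths, is a weighted centroid. The sketch below is illustrative only; the data layout and names are assumptions rather than the device's actual algorithm.

```python
from typing import List, Tuple

def first_position_from_contact(
    touched_cells: List[Tuple[int, int, float]],  # (row, column, measured signal strength)
) -> Tuple[float, float]:
    """Reduce a rough, multi-cell finger contact to a single point by taking the
    signal-weighted centroid of the touched sensor cells (one illustrative choice)."""
    total = sum(strength for _, _, strength in touched_cells)
    if total == 0:
        raise ValueError("no contact detected")
    x = sum(col * strength for _, col, strength in touched_cells) / total
    y = sum(row * strength for row, _, strength in touched_cells) / total
    return x, y

# Example: three neighbouring cells pressed with different strengths.
print(first_position_from_contact([(10, 4, 0.2), (10, 5, 0.7), (11, 5, 0.1)]))
```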
In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112, or an extension of the touch-sensitive surface formed by the touch screen.
In some embodiments, the device 100 may include a physical or virtual click wheel as an input control device 116. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed on the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons, for example when the user presses down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as by one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and the click wheel controller may be part of the touch screen 112 and the display controller 156, respectively. For a virtual click wheel, the click wheel may be either a transparent or a semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.
The device 100 may also include a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., a battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of power in portable devices.
The device 100 may also include one or more optical sensors 164. Figures 1A and 1B show an optical sensor coupled to the optical sensor controller 158 in the I/O subsystem 106. The optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video. In some embodiments, an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display 112 for both video conferencing and still and/or video image acquisition.
The device 100 may also include one or more proximity sensors 166. Figures 1A and 1B show a proximity sensor 166 coupled to the peripherals interface 118. Alternately, the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106. The proximity sensor 166 may perform as described in U.S. Patent Application Serial No. 11/241,839, "Proximity Detector In Handheld Device," filed September 30, 2005; Serial No. 11/240,788, "Proximity Detector In Handheld Device," filed September 30, 2005; "Using Ambient Light Sensor To Augment Proximity Sensor Output," filed January 7, 2007, attorney docket no. 04860.P4851US1; "Automated Response To And Sensing Of User Activity In Portable Device," filed October 24, 2006, attorney docket no. 04860.P4293; and "Method And System For Automatic Configuration Of Peripherals," filed December 12, 2006, attorney docket no. 04860.P4634, which are hereby incorporated by reference. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the touch screen off when the device is in the user's pocket, purse, or other dark area, to prevent unnecessary battery drainage while the device is in a locked state.
The device 100 may also include one or more accelerometers 168. Figure 1 shows an accelerometer 168 coupled to the peripherals interface 118. Alternately, the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106. The accelerometer 168 may perform as described in U.S. Patent Publication No. 20050190059, "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. Patent Publication No. 20060017692, "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated by reference herein. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.
In some embodiments, the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136.
The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components.
The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, a wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
The contact/motion module 130 may detect contact with the touch screen 112 (in conjunction with the display controller 156) and with other touch-sensitive devices (e.g., a touchpad or a physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to the detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to a single contact (e.g., one finger contact) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel.
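As a hedged sketch of the kind of motion bookkeeping attributed to the contact/motion module, the following code derives speed, direction, and a change in velocity from successive sampled contact points using simple finite differences; the sampling format and names are assumptions for illustration.

```python
import math
from typing import List, Optional, Tuple

Sample = Tuple[float, float, float]  # (timestamp in seconds, x, y)

def track_contact_motion(samples: List[Sample]) -> Optional[Tuple[float, float, Optional[float]]]:
    """From successive contact-point samples, compute speed and direction between
    the last two samples and the change in velocity across the last three samples."""
    if len(samples) < 2:
        return None
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    vx, vy = (x1 - x0) / (t1 - t0), (y1 - y0) / (t1 - t0)
    speed = math.hypot(vx, vy)       # magnitude only
    direction = math.atan2(vy, vx)   # direction in radians
    acceleration = None
    if len(samples) >= 3:
        tp, xp, yp = samples[-3]
        vx_prev, vy_prev = (x0 - xp) / (t0 - tp), (y0 - yp) / (t0 - tp)
        acceleration = math.hypot(vx - vx_prev, vy - vy_prev) / (t1 - t0)
    return speed, direction, acceleration

# Example: a contact moving right and slightly up over three samples.
print(track_contact_motion([(0.00, 10, 10), (0.02, 12, 10), (0.04, 15, 11)]))
```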
The graphics module 132 may include various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of the graphics that are displayed. As used herein, the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user interface objects including soft keys), digital images, videos, animations, and the like.
The text input module 134, which may be a component of the graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, blogging 142, browser 147, and any other application that needs text input).
The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to the telephone 138 for use in location-based dialing, to the camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services, such as a weather widget, a local yellow pages widget, and map/navigation widgets).
The applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
● a contacts module 137 (sometimes called an address book or contact list);
● a telephone module 138;
● a video conferencing module 139;
● an e-mail client module 140;
● an instant messaging (IM) module 141;
● a blogging module 142;
● a camera module 143 for still and/or video images;
● an image management module 144;
● a video player module 145;
● a music player module 146;
● a browser module 147;
● a calendar module 148;
● widget modules 149, which may include a weather widget 149-1, a stocks widget 149-2, a calculator widget 149-3, an alarm clock widget 149-4, a dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
● a widget creator module 150 for making user-created widgets 149-6;
● a search module 151;
● a video and music player module 152, which merges the video player module 145 and the music player module 146;
● a notes module 153; and/or
● a map module 154.
Examples of other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the contacts module 137 may be used to manage an address book or contact list, including: adding one or more names to the address book; deleting one or more names from the address book; associating one or more telephone numbers, one or more e-mail addresses, one or more physical addresses, or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by the telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
In conjunction with the RF circuitry 108, the audio circuitry 110, the speaker 111, the microphone 113, the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication may use any of a plurality of communications standards, protocols, and technologies.
In conjunction with the RF circuitry 108, the audio circuitry 110, the speaker 111, the microphone 113, the touch screen 112, the display controller 156, the optical sensor 164, the optical sensor controller 158, the contact module 130, the graphics module 132, the text input module 134, the contact list 137, and the telephone module 138, the video conferencing module 139 may be used to initiate, conduct, and terminate a video conference between the user and one or more other participants.
In conjunction with the RF circuitry 108, the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the e-mail client module 140 may be used to create, send, receive, and manage e-mail. In conjunction with the image management module 144, the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with the camera module 143.
In conjunction with the RF circuitry 108, the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the instant messaging module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages, or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages may include graphics, photos, audio files, video files, and/or other attachments as are supported in MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
In conjunction with the RF circuitry 108, the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, the text input module 134, the image management module 144, and the browsing module 147, the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog).
In conjunction with the touch screen 112, the display controller 156, the optical sensor(s) 164, the optical sensor controller 158, the contact module 130, the graphics module 132, and the image management module 144, the camera module 143 may be used to capture still images or video (including a video stream) and store them in memory 102, to modify characteristics of a still image or video, or to delete a still image or video from memory 102.
In conjunction with the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, the text input module 134, and the camera module 143, the image management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, the audio circuitry 110, and the speaker 111, the video player module 145 may be used to display, present, or otherwise play back videos (e.g., on the touch screen or on an external display connected via the external port 124).
In conjunction with the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, the audio circuitry 110, the speaker 111, the RF circuitry 108, and the browser module 147, the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).
In conjunction with the RF circuitry 108, the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, and the text input module 134, the browser module 147 may be used to browse the Internet, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with the RF circuitry 108, the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, the e-mail module 140, and the browser module 147, the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.).
In conjunction with the RF circuitry 108, the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, and the browser module 147, the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., the weather widget 149-1, the stocks widget 149-2, the calculator widget 149-3, the alarm clock widget 149-4, and the dictionary widget 149-5) or created by the user (e.g., the user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file.
In conjunction with the RF circuitry 108, the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, and the browser module 147, the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, and the text input module 134, the search module 151 may be used to search for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms).
In conjunction with the touch screen 112, the display controller 156, the contact module 130, the graphics module 132, and the text input module 134, the notes module 153 may be used to create and manage notes, to-do lists, and the like.
In conjunction with the RF circuitry 108, the touch screen 112, the display system controller 156, the contact module 130, the graphics module 132, the text input module 134, the GPS module 135, and the browser module 147, the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data).
Each of the above-identified modules and applications corresponds to a set of instructions for performing one or more of the functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments. For example, the video player module 145 may be combined with the music player module 146 into a single module (e.g., the video and music player module 152, Figure 1B). In some embodiments, memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above.
In some embodiments, the device 100 is a device on which the operation of a predefined set of functions is performed exclusively through the touch screen 112 and/or a touchpad. By using the touch screen and/or touchpad as the primary input/control device for operating the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
The predefined set of functions that may be performed exclusively through the touch screen and/or touchpad includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button." In some other embodiments, the menu button may be a physical push button or another physical input/control device instead of a touchpad.
Figure 2 illustrates a portable multifunction device 100 having a touch screen 112 in accordance with some embodiments. The touch screen may display one or more graphics within a user interface (UI) 200. In this embodiment, as well as in the other embodiments described below, a user may select one or more of the graphics by making contact with or touching the graphics, for example with one or more fingers 202 (not drawn to scale in the figure). In some embodiments, selection of the one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may include a gesture that has made contact with the device 100, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward), and/or a rolling of a finger (from right to left, left to right, upward and/or downward). In some embodiments, inadvertent contact with a graphic may not select the graphic. For example, a swipe gesture that sweeps over an application icon may not select the corresponding application when the gesture corresponding to selection is a tap.
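To illustrate the distinction between selection taps and swipe gestures described above, here is a hypothetical classifier based only on a contact's duration and net movement; the thresholds and names are assumptions and are not taken from the patent.

```python
import math
from typing import List, Tuple

def classify_gesture(
    points: List[Tuple[float, float, float]],  # (timestamp, x, y) from touch-down to lift-off
    tap_max_movement: float = 10.0,            # assumed threshold in pixels
    tap_max_duration: float = 0.3,             # assumed threshold in seconds
) -> str:
    """Classify a completed contact as a 'tap' (short, nearly stationary) or a
    swipe (clear net movement), reporting the swipe's dominant direction."""
    (t0, x0, y0), (t1, x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) <= tap_max_movement and (t1 - t0) <= tap_max_duration:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe right" if dx > 0 else "swipe left"
    return "swipe down" if dy > 0 else "swipe up"  # y grows downward in screen coordinates

# Examples: a quick stationary touch vs. a horizontal drag.
print(classify_gesture([(0.00, 50, 80), (0.15, 52, 81)]))    # tap
print(classify_gesture([(0.00, 50, 80), (0.25, 160, 85)]))   # swipe right
```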
The device 100 may also include one or more physical buttons, such as a "home" or menu button 204. As described previously, the menu button 204 may be used to navigate to any application 136 in the set of applications that may be executed on the device 100. Alternatively, in some embodiments, the menu button may be implemented as a soft key in a GUI on the touch screen 112.
In one embodiment, the device 100 includes a touch screen 112, a menu button 204, a push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124. The push button 206 may be used to turn the power to the device 100 on or off by depressing the button and holding it in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing it before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, the device 100 may also accept verbal input, through the microphone 113, for activation or deactivation of some functions.
Attention is now directed towards embodiments of user interfaces ("UI") and associated processes that may be implemented on a portable multifunction device 100.
Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments. In some embodiments, user interface 300 includes the following elements, or a subset or superset thereof:
● an unlock image 302 that is moved with a finger gesture to unlock the device;
● an arrow 304 that provides a visual cue to the unlock gesture;
● a channel 306 that provides additional cues to the unlock gesture;
● a time 308;
● a day 310;
● a date 312; and
● a wallpaper image 314.
In some embodiments, while the device is in a user-interface lock state, the device detects contact with the touch-sensitive display (e.g., a user's finger making contact on or near the unlock image 302). The device moves the unlock image 302 in accordance with the contact. If the detected contact corresponds to a predefined gesture, such as moving the unlock image along the channel 306, the device transitions to a user-interface unlock state. Conversely, if the detected contact does not correspond to the predefined gesture, the device maintains the user-interface lock state. As noted above, processes that use gestures on the touch screen to unlock the device are described in U.S. Patent Application 11/322,549, "Unlocking A Device By Performing Gestures On An Unlock Image," filed December 23, 2005, and U.S. Patent Application 11/322,550, "Indication Of Progress Towards Satisfaction Of A User Input Condition," filed December 23, 2005, which are hereby incorporated by reference.
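A minimal sketch of the unlock decision described above might look like the following, assuming a horizontal channel and a simple test that the contact stayed inside it and reached its far end; the geometry and names are illustrative assumptions.

```python
from typing import List, Tuple

def unlock_gesture_completed(
    contact_path: List[Tuple[float, float]],  # successive (x, y) contact points
    channel_y: float,                         # vertical center of the unlock channel (assumed)
    channel_height: float,
    channel_end_x: float,                     # x position the unlock image must reach
) -> bool:
    """Return True only if the contact stayed inside the channel and dragged the
    unlock image past the channel's far end; otherwise the device stays locked."""
    for x, y in contact_path:
        if abs(y - channel_y) > channel_height / 2:
            return False  # finger left the channel, so this is not the predefined gesture
    return contact_path[-1][0] >= channel_end_x

# Example: a drag along the channel that reaches its far end unlocks the device.
path = [(20.0, 400.0), (120.0, 402.0), (260.0, 399.0)]
print(unlock_gesture_completed(path, channel_y=400.0, channel_height=30.0, channel_end_x=250.0))
```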
Figures 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments. In some embodiments, user interface 400A includes the following elements, or a subset or superset thereof:
● be used for one or more S meters 402 of one or more radio communications, for example honeycomb and Wi-Fi signal;
● the time 404;
Battery Status Indicator 406;
● have the pallet 408 of the icon that is used for the frequent application of using, in for example following one or multinomial:
ο phone 138, it can comprise the indicator 414 about missed call or voice mail message quantity;
ο email client 140, it can comprise the indicator 410 about not reading Email quantity;
ο browser 147; And
ο music player 146; And
● Icons for other applications, such as one or more of the following:
ο IM 141;
ο Image management 144;
ο Camera 143;
ο Video player 145;
ο Weather 149-1;
ο Stocks 149-2;
ο Blog 149-3;
ο Calendar 148;
ο Calculator 149-3;
ο Alarm clock 149-4;
ο Dictionary 149-5; and
ο User-created widget 149-6.
In some embodiments, user interface 400B includes the following elements, or a subset or superset thereof:
● 402, 404, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414, 138, 140, and 147, as described above;
● Map 154;
● Notepad 143;
● Settings 412, which provides access to settings for the device 100 and its various applications 136; and
● Video and music player module 152, also referred to as iPod (trademark of Apple Computer, Inc.) module 152.
In some embodiments, UI 400A or 400B displays all of the available applications 136 on one screen, so that there is no need to scroll through a list of applications (e.g., with a scroll bar). In some embodiments, as the number of applications increases, the icons corresponding to the applications may decrease in size so that all applications can still be displayed on a single screen without scrolling. In some embodiments, having all applications on one screen together with a menu button enables a user to access any desired application with at most two finger inputs, such as activating the menu button 204 and then activating the desired application (e.g., by a tap or other finger gesture on the icon corresponding to the application).
In some embodiments, UI 400A or 400B provides access to both widget-based applications and non-widget-based applications. In some embodiments, all of the widgets, whether user-created or not, are displayed in UI 400A or 400B. In other embodiments, activating the icon for the user-created widget 149-6 may lead to another UI that contains the user-created widgets, or icons corresponding to the user-created widgets.
In some embodiments, a user may rearrange the icons in UI 400A or 400B, e.g., using the processes described in U.S. Patent Application 11/459,602, "Portable Electronic Device With Interface Reconfiguration Mode," filed July 24, 2006, which is hereby incorporated by reference. For example, a user may move application icons into and out of the tray 408 using finger gestures.
In some embodiments, UI 400A or 400B includes a gauge (not shown) that displays an updated account usage metric for an account associated with usage of the device (e.g., a cellular phone account), as described in U.S. Patent Application 11/322,552, "Account Information Display For Portable Communication Device," filed December 23, 2005, which is hereby incorporated by reference.
As noted in the Background section above, one challenge for a portable electronic device with a touch screen is how to accurately translate the two-dimensional finger contact area information on the touch screen into a unique 1-D cursor position.
A finger contact with the touch screen display (e.g., a finger tap) is a process involving multiple actions: the finger approaching the display, the finger being in contact with the display, and the finger leaving the display. During this process, the finger's contact area increases from zero to a maximum contact area and then decreases back to zero. In some embodiments, for a stationary contact between the finger and the display, the finger contact area is defined as the maximum contact area of the finger with the touch screen during the time period corresponding to the stationary contact.
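As a rough illustration of the maximum-area definition, the following sketch assumes the touch controller reports a per-frame contact area for the duration of one touch; the representation and function name are invented for this example.

```python
# Sketch under assumptions: for a stationary finger contact, use the maximum
# contact area reported over the frames of the touch (the area grows from zero
# to a maximum and then shrinks back to zero).
def stationary_contact_area(area_samples):
    """area_samples: per-frame contact areas (e.g., in mm^2) during one touch."""
    return max(area_samples, default=0.0)

print(stationary_contact_area([0.0, 12.5, 38.0, 41.2, 30.0, 0.0]))  # -> 41.2
```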
Figs. 5 and 6A-6L illustrate exemplary methods for determining a cursor position from a finger contact with the touch screen in accordance with some embodiments.
As shown in Fig. 6A, the touch screen display displays a plurality of user interface objects 5602-5608. Exemplary user interface objects include an open icon, a close icon, a delete icon, an exit icon, or soft keyboard key icons. Some of these icons may be deployed in a very small region of the touch screen display, such that one icon is in close proximity to another icon.
When there is a finger contact with the touch screen display, unlike a conventional mouse click, the finger has a certain contact area (e.g., 5610 in Fig. 6A) on the touch screen display. In some embodiments, a cursor position corresponding to the finger's contact area 5610 with the touch screen display needs to be determined before a user interface object can be activated to perform a predefined operation.
After the finger contact area has been determined (501), the portable device determines a first position associated with the contact area 5610 (503). As described below, the first position may or may not be the cursor position corresponding to the finger contact, but it is used to determine the cursor position. In some embodiments, as shown in Fig. 6B, the first position P1 is the centroid of the contact area 5610.
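For illustration only, the following sketch shows one way a centroid-based first position could be computed from a set of touch sensor samples. The sample representation (a list of (x, y, signal) tuples) and the function name are assumptions for this sketch; real touch controllers report contact data in hardware-specific form.

```python
# Minimal sketch (not from the patent text): computing the first position P1 as
# the signal-weighted centroid of the sensor samples that make up the contact area.

def contact_centroid(samples):
    """samples: iterable of (x, y, signal) tuples for one finger contact.
    Returns the signal-weighted centroid (P1) of the contact area."""
    total = sum(s for _, _, s in samples)
    if total == 0:
        raise ValueError("empty contact area")
    cx = sum(x * s for x, _, s in samples) / total
    cy = sum(y * s for _, y, s in samples) / total
    return cx, cy

# Example: a small three-sample contact patch
p1 = contact_centroid([(100, 200, 0.5), (102, 201, 1.0), (101, 203, 0.8)])
```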
In some other embodiments (Fig. 6H), when the finger is in physical contact with the touch screen display, the finger has to press the display, and the pressure may vary from one position to another within the contact area. Sometimes, the position P2 at which a user applies the maximum pressure is not the centroid P1 of the contact area, although the maximum pressure position P2 may be closer to the object the user intends to select.
As shown in Fig. 6H, the contact area 5610 is treated as an ellipse with a major axis and a minor axis perpendicular to the major axis. It is assumed that there is a fixed distance Δd' between the centroid of the contact area 5610 and the corresponding maximum pressure position P2. In this case, the first position, i.e., the maximum pressure position P2, can be determined from P1 and Δd'.
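A minimal sketch of that offset step follows, assuming the contact ellipse's major-axis angle and the fixed distance Δd' are already known; the specific angle and distance below are illustrative values, not values from the patent.

```python
import math

def offset_along_major_axis(p1, major_axis_angle, delta_d):
    """Offset the centroid P1 by a fixed distance delta_d along the contact
    ellipse's major axis to estimate the maximum pressure position P2 (Fig. 6H).
    p1: (x, y) centroid; major_axis_angle: radians; delta_d: fixed offset."""
    x, y = p1
    return (x + delta_d * math.cos(major_axis_angle),
            y + delta_d * math.sin(major_axis_angle))

# Example with an assumed 75-degree major axis and a 4-pixel offset
p2 = offset_along_major_axis((101.0, 201.3), math.radians(75), 4.0)
```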
The cursor position P of the finger contact is determined (505) based on one or more factors, including: the first position, i.e., P1 in Fig. 6B or P2 in Fig. 6H; one or more distances between the first position and one or more of the user interface objects near the first position; and, in some embodiments, one or more activation susceptibility numbers associated with the user interface objects (e.g., W1-W4 in Fig. 6C or Fig. 6I).
In some embodiments, as shown in Figs. 6C and 6I, the distance between the first position (P1 in Fig. 6C or P2 in Fig. 6I) and a respective user interface object (5602, 5604, 5606, or 5608) is the distance between the first position and the point on the user interface object that is closest to the first position.
In some embodiments, as shown in Figs. 6D and 6J, the distance between the first position (P1 in Fig. 6D or P2 in Fig. 6J) and a user interface object (5602, 5604, 5606, or 5608) is the distance between the first position and the center of the user interface object.
In some embodiments, the offset between the cursor position and the first position (e.g., Δd in Figs. 6E and 6F) is given by the formula:

$$\vec{\Delta d} = \sum_i \vec{\Delta d}_i = \sum_i \frac{W_i}{d_i^{\,n}}\,\vec{u}_i$$

where:
● $\vec{\Delta d}$ is the offset between the cursor position P and the first position P1,
● $\vec{\Delta d}_i$ is an offset component associated with a user interface object i, along the direction between the first position and the user interface object i,
● $W_i$ is the activation susceptibility number associated with the user interface object i,
● $d_i$ is the distance between the first position and the user interface object i,
● n is a real number (e.g., 1), and
● $\vec{u}_i$ is a unit vector along the direction of $\vec{\Delta d}_i$.
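For illustration, the following sketch applies this formula with n = 1, assuming each user interface object is summarized by the point used for its distance measurement and by its activation susceptibility number. The data layout and function name are invented for this example only.

```python
import math

def cursor_position(first_pos, objects, n=1.0):
    """first_pos: (x, y); objects: list of dicts with keys
    'point' (the point on the object used for the distance) and 'w' (susceptibility number).
    Returns P = first_pos + sum_i (W_i / d_i**n) * u_i, per the formula above."""
    px, py = first_pos
    dx_total, dy_total = 0.0, 0.0
    for obj in objects:
        ox, oy = obj['point']
        dx, dy = ox - px, oy - py
        d = math.hypot(dx, dy)
        if d == 0:           # first position already on the object: no offset contribution
            continue
        ux, uy = dx / d, dy / d          # unit vector from the first position toward the object
        scale = obj['w'] / (d ** n)      # positive W pulls the cursor closer, negative W pushes it away
        dx_total += scale * ux
        dy_total += scale * uy
    return px + dx_total, py + dy_total

# Example: a non-destructive icon that attracts the cursor and a destructive icon that repels it
P = cursor_position((50, 50), [
    {'point': (60, 50), 'w': +30.0},   # reversible / non-destructive operation
    {'point': (50, 62), 'w': -10.0},   # irreversible / destructive operation
])
```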
If the determined cursor position P falls within a particular user interface object (e.g., 5602 in Fig. 6E), the user interface object is activated to perform a predefined operation, such as playing a song, deleting an e-mail message, or entering a character into an input field.
In some embodiments, the activation susceptibility numbers associated with different user interface objects have different values and signs, depending on the operations associated with the respective objects.
For example, as shown in Fig. 6E, if the operation associated with the user interface object 5602 is reversible or otherwise non-destructive (e.g., the user interface object 5602 is the play icon of the music and video player module 146), an activation susceptibility number W1' with a first sign (e.g., "+") is assigned to the object 5602, so that the determined cursor position P is closer to the object 5602 than the first position P1, making the object 5602 easier to activate. In this context, "non-destructive" means an operation that will not cause a permanent loss of information.
In contrast, as shown in Fig. 6F, if the operation associated with the user interface object 5602 is irreversible or destructive of user information (e.g., the user interface object 5602 is the delete icon of the e-mail module 140), an activation susceptibility number W1" with a second sign (e.g., "-") opposite to the first sign is assigned to the object 5602, so that the determined cursor position P may be farther from the object 5602 than the first position P1, making the object 5602 harder to activate. Thus, when the activation susceptibility number associated with an object has the second sign, the contact must be positioned relatively precisely over the object in order to activate it, with larger values of the activation susceptibility number corresponding to higher required precision.
In some embodiments, the cursor position P is determined based on: the first position, the activation susceptibility number associated with the user interface object that is closest to the first position, and the distance between the first position and that closest user interface object. In these embodiments, the cursor position P is not affected by the parameters associated with the other neighboring user interface objects. For example, as shown in Fig. 6K, the first position P1 is closest to the user interface object 5602, which has an associated activation susceptibility number W1. The distance between the first position P1 and the object 5602 is d1. The cursor position P to be determined is affected only by these parameters, not by the neighboring user interface objects 5604, 5606, or 5608.
In some embodiments, when one or more user interface objects fall within a predefined distance of the first position, the cursor position P is determined based on the first position, the activation susceptibility number associated with each of those user interface objects, and the distance between the first position and each of those user interface objects. Alternatively, in some embodiments, when one or more user interface objects fall within the contact area of the user's finger with the touch screen display (or within a predefined distance of the contact area), the cursor position P is determined based on the first position, the activation susceptibility number associated with each of those user interface objects, and the distance between the first position and each of those user interface objects.
In some embodiments, as shown in Fig. 6L, if the first position is within a particular user interface object on the display (e.g., 5604), the cursor position is the same as the first position, where the first position may be P1 in Fig. 6B or P2 in Fig. 6H. In this case, the cursor position does not need to be offset further from the first position.
In some embodiments, as shown in Fig. 6E, the finger contact does not need to occur exactly at the object in order to activate the object. Rather, a user interface object is activated as long as the determined cursor position falls within the user interface object. In some embodiments, a user interface object is activated if the determined cursor position falls within the user interface object's "hit region." The hit region of a user interface object may be the same size as the user interface object itself, or it may be larger or smaller. User interface objects that cause irreversible or destructive changes to data typically have a hit region that is the same size as, or smaller than, the user interface object itself. In some embodiments, at least some of the user interface objects that do not cause irreversible or destructive changes to data have hit regions that are larger than those user interface objects. For such an object, the portion of the hit region that is larger than the corresponding user interface object may be referred to as a hidden hit region.
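As a rough illustration of the hit-region idea, the following sketch tests a cursor position against a rectangular object whose hit region is enlarged by a hidden margin; the rectangle representation, the margin value, and the function name are assumptions made for this example.

```python
def in_hit_region(cursor, rect, margin=0.0):
    """cursor: (x, y); rect: (left, top, width, height); margin: extra hit border.
    A destructive object could use margin = 0 (or a negative margin for a
    smaller-than-visible hit region)."""
    x, y = cursor
    left, top, w, h = rect
    return (left - margin <= x <= left + w + margin and
            top - margin <= y <= top + h + margin)

# A non-destructive button with an 8-pixel hidden hit region around a 30x30 icon
activated = in_hit_region((72.0, 51.5), rect=(40, 40, 30, 30), margin=8.0)
```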
In some embodiments, at least some of the user interface objects involved in determining the cursor position with the formula above are visible on the touch screen display.
In some embodiments, the activation susceptibility numbers associated with the user interface objects (e.g., W1-W4) are context-dependent within a particular application module and change from one context to another within that application module. For example, an object may have a first activation susceptibility number that is attractive to the cursor position at a first moment (in a first context within a particular application module), but a second activation susceptibility number that is less attractive, or even repulsive, to the cursor position (e.g., if the second activation susceptibility number has the opposite sign) at a second moment (in a second context within the application module).
Figs. 6M-6O illustrate an exemplary method for dynamically adjusting the activation susceptibility numbers associated with soft keyboard keys as a word is typed with the soft keyboard keys in accordance with some embodiments. The user interface includes an input field 5620 and a soft keyboard 5640. A user selection of any key icon of the soft keyboard 5640 enters the corresponding user-selected character into the input field 5620. For illustrative purposes, as shown in Fig. 6M, all of the key icons initially have the same activation susceptibility number, 5.
Fig. 6N shows the activation susceptibility numbers associated with different key icons after two characters "Go" have been entered into the input field 5620. The activation susceptibility numbers associated with the key icons are adjusted in accordance with the previously entered characters. For example, because "God" is a common English word, the activation susceptibility number of the key icon "D" increases from 5 to 10. Thus, the key icon "D" may be activated even if the next finger contact is closer to the key icon "F" than to the key icon "D" itself. Similarly, the activation susceptibility numbers associated with the key icons "A" and "O" are also increased, because each of the strings "Goa" and "Goo" leads to one or more legitimate English words such as "Goal", "Good", or "Goad". In contrast, the activation susceptibility number of the key icon "K" drops to 3, because the string "Gok" is not found at the beginning of any common English words.
Fig. 6O shows the updated activation susceptibility numbers associated with different key icons after another character "a" has been entered into the input field 5620. Given the string "Goa" that has already been entered, the user may be typing the word "Goal". Accordingly, the activation susceptibility number associated with the key icon "L" increases to 9, whereas the activation susceptibility number associated with the key icon "O" drops to 2, because the string "Goao" is not found at the beginning of any common English words.
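A minimal sketch of this dynamic adjustment follows, under stated assumptions: key weights start at a base value and are raised when the typed prefix plus the key still begins a dictionary word, and lowered otherwise. The tiny word list and the base/boost/penalty values are illustrative stand-ins, not values prescribed by the patent.

```python
WORDS = ["goal", "good", "goad", "god", "go"]   # stand-in for a real dictionary

def adjust_key_weights(prefix, keys, base=5, boost=5, penalty=2):
    """Return {key: activation susceptibility number} for the given typed prefix."""
    prefix = prefix.lower()
    weights = {}
    for key in keys:
        candidate = prefix + key.lower()
        if any(word.startswith(candidate) for word in WORDS):
            weights[key] = base + boost      # plausible continuation: easier to hit
        else:
            weights[key] = base - penalty    # implausible continuation: harder to hit
    return weights

print(adjust_key_weights("Go", ["A", "D", "K", "O"]))   # A/D/O raised, K lowered
```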
In general, the cursor position for a finger contact with the touch screen display is adjusted at least in part according to the activation susceptibility numbers (or weights) assigned to the user interface objects. Such cursor position adjustment helps to reduce the chance of selecting a user interface object by mistake.
The foregoing description, for purposes of explanation, has been given with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best use the invention and the various embodiments with various modifications as are suited to the particular use contemplated.

Claims (34)

1. A method for determining a cursor position from a finger contact, comprising the steps of:
detecting a contact area of a finger with a touch screen display;
determining a first position corresponding to a centroid of the contact area;
determining a second position that is offset from the first position, based on a shape of the contact area; and
determining a cursor position based, at least in part, on:
the second position,
one or more distances between the second position and one or more user interface objects of a plurality of user interface objects, and
one or more activation susceptibility numbers, each associated with a respective user interface object of the plurality of user interface objects.
2. A method for determining a cursor position from a finger contact, comprising the steps of:
detecting a contact area of a finger with a touch screen display;
determining a first position associated with the contact area; and
determining a cursor position based, at least in part, on:
the first position,
one or more distances between the first position and one or more user interface objects of a plurality of user interface objects, and
one or more activation susceptibility numbers, each associated with a respective user interface object of the plurality of user interface objects.
3. The method of claim 2, wherein the plurality of user interface objects comprise one or more open icons, one or more close icons, one or more delete icons, one or more exit icons, or soft keyboard key icons.
4. The method of claim 2, wherein, for a stationary contact of the finger with the display, the detected contact area corresponds to the maximum contact area of the finger with the display during a time period corresponding to the stationary contact.
5. The method of claim 2, wherein the first position is the centroid of the contact area.
6. The method of claim 2, wherein the contact area is an ellipse with a major axis, and the first position is offset from the centroid of the contact area along the major axis.
7. The method of claim 2, wherein a user interface object associated with an activation susceptibility number is easier to activate if the activation susceptibility number has a first sign, and the user interface object associated with the activation susceptibility number is harder to activate if the activation susceptibility number has a second sign opposite to the first sign.
8. The method of claim 2, wherein the distance between the first position and a user interface object is the distance between the first position and the point on the user interface object that is closest to the first position.
9. The method of claim 2, wherein the distance between the first position and a user interface object is the distance between the first position and the center of the user interface object.
10. The method of claim 2, wherein the cursor position is determined based on:
the first position,
the activation susceptibility number associated with the user interface object that is closest to the first position, and
the distance between the first position and the user interface object that is closest to the first position.
11. The method of claim 2, wherein the amount by which the cursor position is offset from the first position is given by the formula:

$$\vec{\Delta d} = \sum_i \vec{\Delta d}_i = \sum_i \frac{W_i}{d_i^{\,n}}\,\vec{u}_i$$

wherein:
$\vec{\Delta d}$ is the offset,
$\vec{\Delta d}_i$ is an offset component associated with a user interface object i, along the direction between the first position and the user interface object i,
$W_i$ is the activation susceptibility number associated with the user interface object i,
$d_i$ is the distance between the first position and the user interface object i,
n is a real number, and
$\vec{u}_i$ is a unit vector along the direction of $\vec{\Delta d}_i$.
12. The method of claim 11, wherein the real number n is 1.
13. The method of claim 2, wherein, if the first position is within one of the plurality of user interface objects on the display, the cursor position is the first position.
14. The method of claim 2, further comprising: activating the user interface object at the cursor position.
15. The method of claim 2, wherein the plurality of user interface objects are visible on the display.
16. The method of claim 2, wherein the activation susceptibility numbers are context-dependent within an application.
17. The method of claim 2, wherein activation susceptibility numbers are associated with soft keyboard keys, and the activation susceptibility numbers associated with the soft keyboard keys change as a word is typed with the soft keyboard keys.
18. A device for determining a cursor position from a finger contact, comprising:
means for detecting a contact area of a finger with a touch screen display;
means for determining a first position corresponding to a centroid of the contact area;
means for determining a second position that is offset from the first position, based on a shape of the contact area; and
means for determining a cursor position based, at least in part, on:
the second position,
one or more distances between the second position and one or more user interface objects of a plurality of user interface objects, and
one or more activation susceptibility numbers, each associated with a respective user interface object of the plurality of user interface objects.
19. A device for determining a cursor position from a finger contact, comprising:
means for detecting a contact area of a finger with a touch screen display;
means for determining a first position associated with the contact area; and
means for determining a cursor position based, at least in part, on:
the first position,
one or more distances between the first position and one or more user interface objects of a plurality of user interface objects, and
one or more activation susceptibility numbers, each associated with a respective user interface object of the plurality of user interface objects.
20. The device of claim 19, wherein the plurality of user interface objects comprise one or more open icons, one or more close icons, one or more delete icons, one or more exit icons, or soft keyboard key icons.
21. The device of claim 19, wherein, for a stationary contact of the finger with the display, the detected contact area corresponds to the maximum contact area of the finger with the display during a time period corresponding to the stationary contact.
22. The device of claim 19, wherein the first position is the centroid of the contact area.
23. The device of claim 19, wherein the contact area is an ellipse with a major axis, and the first position is offset from the centroid of the contact area along the major axis.
24. The device of claim 19, wherein a user interface object associated with an activation susceptibility number is easier to activate if the activation susceptibility number has a first sign, and the user interface object associated with the activation susceptibility number is harder to activate if the activation susceptibility number has a second sign opposite to the first sign.
25. The device of claim 19, wherein the distance between the first position and a user interface object is the distance between the first position and the point on the user interface object that is closest to the first position.
26. The device of claim 19, wherein the distance between the first position and a user interface object is the distance between the first position and the center of the user interface object.
27. The device of claim 19, wherein the cursor position is determined based on:
the first position,
the activation susceptibility number associated with the user interface object that is closest to the first position, and
the distance between the first position and the user interface object that is closest to the first position.
28. The device of claim 19, wherein the amount by which the cursor position is offset from the first position is given by the formula:

$$\vec{\Delta d} = \sum_i \vec{\Delta d}_i = \sum_i \frac{W_i}{d_i^{\,n}}\,\vec{u}_i$$

wherein:
$\vec{\Delta d}$ is the offset,
$\vec{\Delta d}_i$ is an offset component associated with a user interface object i, along the direction between the first position and the user interface object i,
$W_i$ is the activation susceptibility number associated with the user interface object i,
$d_i$ is the distance between the first position and the user interface object i,
n is a real number, and
$\vec{u}_i$ is a unit vector along the direction of $\vec{\Delta d}_i$.
29. The device of claim 28, wherein the real number n is 1.
30. The device of claim 19, wherein, if the first position is within one of the plurality of user interface objects on the display, the cursor position is the first position.
31. The device of claim 19, further comprising: means for activating the user interface object at the cursor position.
32. The device of claim 19, wherein the plurality of user interface objects are visible on the display.
33. The device of claim 19, wherein the activation susceptibility numbers are context-dependent within an application.
34. The device of claim 19, wherein activation susceptibility numbers are associated with soft keyboard keys, and the activation susceptibility numbers associated with the soft keyboard keys change as a word is typed with the soft keyboard keys.
CN2007800405082A 2006-09-06 2007-09-05 Methods for determining a cursor position from a finger contact with a touch screen display Active CN101529368B (en)

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
US82476906P 2006-09-06 2006-09-06
US60/824,769 2006-09-06
US87925307P 2007-01-07 2007-01-07
US60/879,253 2007-01-07
US87946907P 2007-01-08 2007-01-08
US60/879,469 2007-01-08
US94671607P 2007-06-27 2007-06-27
US60/946,716 2007-06-27
US93799307P 2007-06-29 2007-06-29
US60/937,993 2007-06-29
US11/850,015 2007-09-04
US11/850,015 US7843427B2 (en) 2006-09-06 2007-09-04 Methods for determining a cursor position from a finger contact with a touch screen display
PCT/US2007/077645 WO2008030880A1 (en) 2006-09-06 2007-09-05 Methods for determining a cursor position from a finger contact with a touch screen display

Publications (2)

Publication Number Publication Date
CN101529368A CN101529368A (en) 2009-09-09
CN101529368B true CN101529368B (en) 2012-07-04

Family

ID=40308482

Family Applications (7)

Application Number Title Priority Date Filing Date
CNA2007800011428A Pending CN101356493A (en) 2006-09-06 2007-08-31 Portable electronic device for photo management
CN200780040362.1A Active CN101529367B (en) 2006-09-06 2007-08-31 For the voicemail manager of portable multifunction device
CNA2007800404728A Pending CN101529874A (en) 2006-09-06 2007-08-31 Incoming telephone call management for a portable multifunction device with touch screen display
CN2007800413515A Active CN101535940B (en) 2006-09-06 2007-08-31 Portable electronic device for instant messaging
CN2007800412226A Active CN101535938B (en) 2006-09-06 2007-09-05 Portable electronic device, method, and graphical user interface for displaying structured electronic documents
CN2007800405082A Active CN101529368B (en) 2006-09-06 2007-09-05 Methods for determining a cursor position from a finger contact with a touch screen display
CN2007800011409A Active CN101356492B (en) 2006-09-06 2007-09-06 Portable electonic device performing similar oprations for different gestures

Family Applications Before (5)

Application Number Title Priority Date Filing Date
CNA2007800011428A Pending CN101356493A (en) 2006-09-06 2007-08-31 Portable electronic device for photo management
CN200780040362.1A Active CN101529367B (en) 2006-09-06 2007-08-31 For the voicemail manager of portable multifunction device
CNA2007800404728A Pending CN101529874A (en) 2006-09-06 2007-08-31 Incoming telephone call management for a portable multifunction device with touch screen display
CN2007800413515A Active CN101535940B (en) 2006-09-06 2007-08-31 Portable electronic device for instant messaging
CN2007800412226A Active CN101535938B (en) 2006-09-06 2007-09-05 Portable electronic device, method, and graphical user interface for displaying structured electronic documents

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN2007800011409A Active CN101356492B (en) 2006-09-06 2007-09-06 Portable electonic device performing similar oprations for different gestures

Country Status (3)

Country Link
CN (7) CN101356493A (en)
AU (2) AU2022201622B2 (en)
ES (1) ES2361784T3 (en)

Families Citing this family (151)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US20080168402A1 (en) 2007-01-07 2008-07-10 Christopher Blumenberg Application Programming Interfaces for Gesture Operations
US20080168478A1 (en) 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US8717305B2 (en) 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US8416196B2 (en) 2008-03-04 2013-04-09 Apple Inc. Touch event model programming interface
US8645827B2 (en) 2008-03-04 2014-02-04 Apple Inc. Touch event model
US8285499B2 (en) 2009-03-16 2012-10-09 Apple Inc. Event recognition
US9684521B2 (en) 2010-01-26 2017-06-20 Apple Inc. Systems having discrete and continuous gesture recognizers
US8566045B2 (en) 2009-03-16 2013-10-22 Apple Inc. Event recognition
US9311112B2 (en) 2009-03-16 2016-04-12 Apple Inc. Event recognition
JP5370754B2 (en) * 2009-06-30 2013-12-18 ソニー株式会社 Input device and input method
TWI412963B (en) 2009-07-01 2013-10-21 Htc Corp Data display and movement methods and systems, and computer program products thereof
CN101650633B (en) * 2009-07-03 2011-10-05 苏州佳世达电通有限公司 Manipulating method of electronic device
CN101996028B (en) * 2009-08-21 2013-04-24 宏达国际电子股份有限公司 Data display and move method and system
JP5436975B2 (en) * 2009-08-21 2014-03-05 オリンパスイメージング株式会社 CAMERA, CAMERA DISPLAY CONTROL METHOD, DISPLAY DEVICE, AND DISPLAY METHOD
JP5333068B2 (en) * 2009-08-31 2013-11-06 ソニー株式会社 Information processing apparatus, display method, and display program
KR101390957B1 (en) * 2009-09-04 2014-05-02 나이키 인터내셔널 엘티디. Monitoring and tracking athletic activity
CN102023790B (en) * 2009-09-22 2013-06-12 宏碁股份有限公司 Method for dynamic operation on interactive objects and system thereof
US8823743B2 (en) * 2009-10-02 2014-09-02 Sony Corporation Image processing device and method, and program
KR20110037298A (en) * 2009-10-06 2011-04-13 삼성전자주식회사 Edit method of list and portable device using the same
KR20110037657A (en) * 2009-10-07 2011-04-13 삼성전자주식회사 Method for providing gui by using motion and display apparatus applying the same
EP2320312A1 (en) * 2009-11-10 2011-05-11 Research In Motion Limited Portable electronic device and method of controlling same
CN101702111B (en) * 2009-11-13 2013-07-03 宇龙计算机通信科技(深圳)有限公司 Method for realizing content scaling of touch screen and terminal
US8381125B2 (en) * 2009-12-16 2013-02-19 Apple Inc. Device and method for resizing user interface content while maintaining an aspect ratio via snapping a perimeter to a gridline
US8736561B2 (en) * 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US9052894B2 (en) 2010-01-15 2015-06-09 Apple Inc. API to replace a keyboard with custom controls
EP3882750A1 (en) 2010-01-20 2021-09-22 Nokia Technologies Oy User input
CN101763270B (en) 2010-01-28 2011-06-15 华为终端有限公司 Method for displaying and processing assembly and user equipment
US20110191719A1 (en) * 2010-02-04 2011-08-04 Microsoft Corporation Cut, Punch-Out, and Rip Gestures
RU2556079C2 (en) * 2010-02-04 2015-07-10 Нокиа Корпорейшн User data input
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US20110209101A1 (en) * 2010-02-25 2011-08-25 Hinckley Kenneth P Multi-screen pinch-to-pocket gesture
EP2367097B1 (en) * 2010-03-19 2017-11-22 BlackBerry Limited Portable electronic device and method of controlling same
US8756522B2 (en) 2010-03-19 2014-06-17 Blackberry Limited Portable electronic device and method of controlling same
WO2011130919A1 (en) 2010-04-23 2011-10-27 Motorola Mobility, Inc. Electronic device and method using touch-detecting surface
KR101673925B1 (en) * 2010-05-26 2016-11-09 삼성전자주식회사 Portable Device having the touch lock status and Operation system thereof
US8131898B2 (en) * 2010-05-27 2012-03-06 Adobe Systems Incorporated Event handling in an integrated execution environment
CN102270081B (en) * 2010-06-03 2015-09-23 腾讯科技(深圳)有限公司 A kind of method and device adjusting size of list element
US8552999B2 (en) * 2010-06-14 2013-10-08 Apple Inc. Control selection approximation
JP2012008686A (en) * 2010-06-23 2012-01-12 Sony Corp Information processor and method, and program
EP2405337B1 (en) * 2010-07-06 2015-09-16 HTC Corporation Method for presenting human machine interface, handheld device using the same, and computer readable medium therefor
JP5659586B2 (en) * 2010-07-09 2015-01-28 ソニー株式会社 Display control device, display control method, display control program, and recording medium
CN102209141A (en) * 2010-07-15 2011-10-05 优视科技有限公司 Page scrollbar display method and device for mobile communication equipment terminal
US9304591B2 (en) * 2010-08-10 2016-04-05 Lenovo (Singapore) Pte. Ltd. Gesture control
CN102385475B (en) * 2010-09-06 2017-04-19 联想(北京)有限公司 Electronic device and interactive method thereof
JP2012058921A (en) * 2010-09-07 2012-03-22 Sony Corp Information processor, information processing method and program
JP5389757B2 (en) 2010-09-15 2014-01-15 株式会社ソニー・コンピュータエンタテインメント Image processing apparatus, content creation apparatus, image processing method, and data structure of content file
EP2641145A4 (en) * 2010-11-20 2017-05-03 Nuance Communications, Inc. Systems and methods for using entered text to access and process contextual information
KR101749529B1 (en) * 2010-11-25 2017-06-21 엘지전자 주식회사 Mobile terminal and operation control method thereof
CN102169383A (en) * 2010-11-26 2011-08-31 苏州瀚瑞微电子有限公司 Identification method for rotating gestures of touch screen
US8866735B2 (en) * 2010-12-16 2014-10-21 Motorla Mobility LLC Method and apparatus for activating a function of an electronic device
US9363579B2 (en) * 2010-12-22 2016-06-07 Google Inc. Video player with assisted seek
CN102053754A (en) * 2010-12-31 2011-05-11 东莞宇龙通信科技有限公司 Method and device for processing key area on touch screen
US20120192118A1 (en) * 2011-01-24 2012-07-26 Migos Charles J Device, Method, and Graphical User Interface for Navigating through an Electronic Document
CN102185828B (en) * 2011-01-30 2013-10-09 广东佳和通信技术有限公司 Method for binding and controlling personal computer (PC) software and session initiation protocol user agent (SIP UA)
US9298363B2 (en) 2011-04-11 2016-03-29 Apple Inc. Region activation for touch sensitive surface
JP5254399B2 (en) * 2011-05-13 2013-08-07 株式会社エヌ・ティ・ティ・ドコモ Display device, user interface method and program
US8890823B2 (en) * 2012-01-09 2014-11-18 Motorola Mobility Llc System and method for reducing occurrences of unintended operations in an electronic device
GB2492789B (en) * 2011-07-12 2018-01-03 Denso Corp Displays
US9256361B2 (en) 2011-08-03 2016-02-09 Ebay Inc. Control of search results with multipoint pinch gestures
EP2740052A4 (en) * 2011-08-05 2015-04-08 Blackberry Ltd System and method for searching for text and displaying found text in augmented reality
US10140011B2 (en) 2011-08-12 2018-11-27 Microsoft Technology Licensing, Llc Touch intelligent targeting
US9280274B2 (en) * 2011-09-13 2016-03-08 Sony Corporation Information processing device, display control method, program and information storage medium
US9710048B2 (en) 2011-10-03 2017-07-18 Google Technology Holdings LLC Method for detecting false wake conditions of a portable electronic device
US10684768B2 (en) * 2011-10-14 2020-06-16 Autodesk, Inc. Enhanced target selection for a touch-based input enabled user interface
US20130125066A1 (en) * 2011-11-14 2013-05-16 Microsoft Corporation Adaptive Area Cursor
KR20130093720A (en) * 2011-12-23 2013-08-23 삼성전자주식회사 Display apparatus for releasing lock status and method thereof
CN102591853B (en) * 2011-12-29 2015-04-01 优视科技有限公司 Webpage rearrangement method, webpage rearrangement device and mobile terminal
CN103246457B (en) * 2012-02-09 2016-05-04 宇龙计算机通信科技(深圳)有限公司 The starting method of terminal and application program
US9106762B2 (en) * 2012-04-04 2015-08-11 Google Inc. Associating content with a graphical interface window using a fling gesture
WO2013169853A1 (en) 2012-05-09 2013-11-14 Industries Llc Yknots Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
CN105260049B (en) * 2012-05-09 2018-10-23 苹果公司 For contacting the equipment for carrying out display additional information, method and graphic user interface in response to user
JP5467123B2 (en) 2012-05-30 2014-04-09 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus and information processing method
US20130342468A1 (en) * 2012-06-20 2013-12-26 Chimei Innolux Corporation Method for determining touch location on a touch panel and touch panel module
KR101942308B1 (en) 2012-08-08 2019-01-25 삼성전자주식회사 Method for providing message function and an electronic device thereof
US9811216B2 (en) * 2012-09-14 2017-11-07 Sharp Kabushiki Kaisha Display device, portable terminal, monitor, television, and method for controlling display device
KR102096581B1 (en) * 2012-09-14 2020-05-29 삼성전자주식회사 Method for editing display information and an electronic device thereof
KR102102438B1 (en) * 2012-12-06 2020-04-20 삼성전자주식회사 Display apparatus and method for controlling thereof
US10585553B2 (en) 2012-12-06 2020-03-10 Samsung Electronics Co., Ltd. Display device and method of controlling the same
US9104371B2 (en) * 2012-12-07 2015-08-11 Apple Inc. Integrated visual notification system in an accessory device
CN103135903B (en) * 2013-02-22 2016-04-27 小米科技有限责任公司 A kind of chart gallery display method and device
US9471200B2 (en) * 2013-03-15 2016-10-18 Apple Inc. Device, method, and graphical user interface for organizing and presenting a collection of media items
CN104113682B (en) * 2013-04-22 2018-08-31 联想(北京)有限公司 A kind of image acquiring method and electronic equipment
US9807145B2 (en) * 2013-05-10 2017-10-31 Successfactors, Inc. Adaptive tile framework
US9477331B2 (en) 2013-06-07 2016-10-25 Apple Inc. Touch detection at bezel edge
US9733716B2 (en) 2013-06-09 2017-08-15 Apple Inc. Proxy gesture recognizer
US10168882B2 (en) * 2013-06-09 2019-01-01 Apple Inc. Device, method, and graphical user interface for switching between camera interfaces
US9419935B2 (en) * 2013-08-02 2016-08-16 Whatsapp Inc. Voice communications with real-time status notifications
JP5505550B1 (en) * 2013-08-06 2014-05-28 富士ゼロックス株式会社 Image display apparatus and program
US11068128B2 (en) 2013-09-03 2021-07-20 Apple Inc. User interface object manipulations in a user interface
US10545657B2 (en) 2013-09-03 2020-01-28 Apple Inc. User interface for manipulating user interface objects
KR20180128091A (en) * 2013-09-03 2018-11-30 애플 인크. User interface for manipulating user interface objects with magnetic properties
CN103472975A (en) * 2013-09-11 2013-12-25 江苏中科梦兰电子科技有限公司 Intelligent terminal human-computer interaction method with instant messaging style
CN103699297B (en) * 2013-12-13 2018-02-09 乐视网信息技术(北京)股份有限公司 A kind of intelligent terminal and collection of drama broadcast time reminding method
CN105916720B (en) * 2014-01-20 2019-06-14 大众汽车有限公司 User interface and for the method by touch-sensitive display unit control volume
US11914419B2 (en) 2014-01-23 2024-02-27 Apple Inc. Systems and methods for prompting a log-in to an electronic device based on biometric information received from a user
CN111488112B (en) * 2014-01-23 2024-03-08 苹果公司 Virtual computer keyboard
KR20150091607A (en) * 2014-02-03 2015-08-12 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6328797B2 (en) 2014-05-30 2018-05-23 アップル インコーポレイテッド Transition from using one device to using another device
US20150350141A1 (en) * 2014-05-31 2015-12-03 Apple Inc. Message user interfaces for capture and transmittal of media and location content
WO2015200889A1 (en) 2014-06-27 2015-12-30 Apple Inc. Electronic device with rotatable input mechanism for navigating calendar application
CN104133624B (en) * 2014-07-10 2015-10-28 腾讯科技(深圳)有限公司 Web animation display packing, device and terminal
JP6399834B2 (en) 2014-07-10 2018-10-03 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
CN106605201B (en) 2014-08-06 2021-11-23 苹果公司 Reduced size user interface for battery management
KR101610880B1 (en) * 2014-08-12 2016-04-08 네이버 주식회사 Method and apparatus of controlling display, and computer program for executing the method
US20160048319A1 (en) * 2014-08-18 2016-02-18 Microsoft Technology Licensing, Llc Gesture-based Access to a Mix View
WO2016032806A1 (en) * 2014-08-26 2016-03-03 Apple Inc. User interface for limiting notifications and alerts
CN104216617B (en) * 2014-08-27 2017-05-24 小米科技有限责任公司 Cursor position determination method and device
TWI582641B (en) 2014-09-02 2017-05-11 蘋果公司 Button functionality
DE202015006066U1 (en) 2014-09-02 2015-12-14 Apple Inc. Smaller interfaces for handling notifications
WO2016036509A1 (en) 2014-09-02 2016-03-10 Apple Inc. Electronic mail user interface
WO2016036541A2 (en) 2014-09-02 2016-03-10 Apple Inc. Phone user interface
US20160062571A1 (en) 2014-09-02 2016-03-03 Apple Inc. Reduced size user interface
TWI613582B (en) 2014-09-02 2018-02-01 蘋果公司 Method for reconfiguring user interface objects,touch-sensitive electronic device and non-transitorycomputer-readable storage medium
US11567626B2 (en) * 2014-12-17 2023-01-31 Datalogic Usa, Inc. Gesture configurable floating soft trigger for touch displays on data-capture electronic devices
US10586398B2 (en) * 2014-12-18 2020-03-10 Koninklijke Philips N.V. Medical image editing
KR20160088603A (en) * 2015-01-16 2016-07-26 삼성전자주식회사 Apparatus and method for controlling screen
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
EP3274870A1 (en) * 2015-03-27 2018-01-31 Google LLC Navigating event information
US9785487B1 (en) * 2015-05-12 2017-10-10 Google Inc. Managing device functions based on physical interaction with device modules
US9860451B2 (en) 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9658704B2 (en) * 2015-06-10 2017-05-23 Apple Inc. Devices and methods for manipulating user interfaces with a stylus
JP6518141B2 (en) * 2015-06-16 2019-05-22 株式会社ディスコ Touch panel device
CN104978146B (en) * 2015-06-30 2017-11-24 广东欧珀移动通信有限公司 A kind of picture operation method and mobile terminal
US9880735B2 (en) 2015-08-10 2018-01-30 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN108028869B (en) * 2015-09-25 2020-09-18 华为技术有限公司 Terminal equipment and method for processing incoming call
KR102553886B1 (en) * 2015-12-24 2023-07-11 삼성전자주식회사 Electronic device and method for image control thereof
US10637986B2 (en) 2016-06-10 2020-04-28 Apple Inc. Displaying and updating a set of application views
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
KR20180058097A (en) * 2016-11-23 2018-05-31 삼성전자주식회사 Electronic device for displaying image and method for controlling thereof
CN108266960B (en) * 2017-01-03 2020-09-22 三星电子株式会社 Food storage device and control method thereof
CN107247694A (en) * 2017-07-06 2017-10-13 福建中金在线信息科技有限公司 Information query method, device and electronic equipment based on portable electric appts
CN109429091A (en) * 2017-08-31 2019-03-05 武汉斗鱼网络科技有限公司 Promote method, storage medium, electronic equipment and the system of live streaming viewing experience
CN108197560B (en) * 2017-12-28 2022-06-07 努比亚技术有限公司 Face image recognition method, mobile terminal and computer-readable storage medium
CN115086736A (en) * 2018-05-08 2022-09-20 日本聚逸株式会社 Moving image distribution system, method thereof, and recording medium
US11435830B2 (en) 2018-09-11 2022-09-06 Apple Inc. Content-based tactile outputs
WO2020094214A1 (en) * 2018-11-06 2020-05-14 Volvo Truck Corporation A finger-position sensitive human machine interface for handling a user input of a user and a method for handling a user input of a user of a finger-position sensitive human machine interface
US11477609B2 (en) 2019-06-01 2022-10-18 Apple Inc. User interfaces for location-related communications
US11481094B2 (en) 2019-06-01 2022-10-25 Apple Inc. User interfaces for location-related communications
US11152100B2 (en) 2019-06-01 2021-10-19 Apple Inc. Health application user interfaces
CN110297002B (en) 2019-06-27 2022-05-24 上海联影医疗科技股份有限公司 Energy imaging method, device, equipment and storage medium
CN110968238A (en) * 2019-11-20 2020-04-07 四川商通实业有限公司 Image fast editing method and system based on ios system
CN112835575A (en) * 2019-11-23 2021-05-25 西安诺瓦星云科技股份有限公司 Multi-layer display control method and device
CN113518261B (en) * 2020-12-25 2023-09-22 腾讯科技(深圳)有限公司 Guiding video playing method, guiding video playing device, computer equipment and storage medium
US11907605B2 (en) 2021-05-15 2024-02-20 Apple Inc. Shared-content session user interfaces
US11449188B1 (en) 2021-05-15 2022-09-20 Apple Inc. Shared-content session user interfaces
CN113721818B (en) * 2021-09-02 2022-08-09 北京城市网邻信息技术有限公司 Image processing method, device, equipment and computer readable storage medium
CN113703653A (en) * 2021-09-02 2021-11-26 北京城市网邻信息技术有限公司 Image processing method, device, equipment and computer readable storage medium
CN114866641B (en) * 2022-07-07 2022-11-11 荣耀终端有限公司 Icon processing method, terminal equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6049326A (en) * 1997-05-12 2000-04-11 Siemens Information And Communication Networks, Inc. System and method for dual browser modes
GB2351639A (en) * 1999-01-15 2001-01-03 Ibm Touch screen region assist for hypertext links
CN1652072A (en) * 2004-02-06 2005-08-10 乐金电子(中国)研究开发中心有限公司 Portable terminal cursor shifting device and its method
EP1674976A2 (en) * 2004-12-22 2006-06-28 Microsoft Corporation Improving touch screen accuracy

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US6486895B1 (en) * 1995-09-08 2002-11-26 Xerox Corporation Display system for displaying lists of linked documents
US5847709A (en) * 1996-09-26 1998-12-08 Xerox Corporation 3-D document workspace with focus, immediate and tertiary spaces
US6069626A (en) * 1997-02-27 2000-05-30 International Business Machines Corporation Method and apparatus for improved scrolling functionality in a graphical user interface utilizing a transparent scroll bar icon
US6169911B1 (en) * 1997-09-26 2001-01-02 Sun Microsystems, Inc. Graphical user interface for a portable telephone
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US6181316B1 (en) * 1998-06-04 2001-01-30 International Business Machines Corporation Graphical user interface inline scroll control
US20020018051A1 (en) * 1998-09-15 2002-02-14 Mona Singh Apparatus and method for moving objects on a touchscreen display
AU4186000A (en) * 1999-03-30 2000-11-14 Tivo, Inc. Television viewer interface system
US6631186B1 (en) * 1999-04-09 2003-10-07 Sbc Technology Resources, Inc. System and method for implementing and accessing call forwarding services
US6262724B1 (en) * 1999-04-15 2001-07-17 Apple Computer, Inc. User interface for presenting media information
US7007239B1 (en) * 2000-09-21 2006-02-28 Palm, Inc. Method and apparatus for accessing a contacts database and telephone services
JP2001265481A (en) * 2000-03-21 2001-09-28 Nec Corp Method and device for displaying page information and storage medium with program for displaying page information stored
CN1392476A (en) * 2001-06-19 2003-01-22 神基科技股份有限公司 Method for matching multi-medium program to execute analogue jog dial function
US6690387B2 (en) * 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
TW591488B (en) * 2002-08-01 2004-06-11 Tatung Co Window scrolling method and device thereof
CN100483403C (en) * 2002-12-17 2009-04-29 汤姆森许可公司 Method for tagging and displaying songs in a digital audio player
US6990637B2 (en) * 2003-10-23 2006-01-24 Microsoft Corporation Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data
CN100346274C (en) * 2004-03-25 2007-10-31 升达科技股份有限公司 Inputtig method, control module and product with starting location and moving direction as definition
EP2000894B1 (en) * 2004-07-30 2016-10-19 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
KR100958490B1 (en) * 2004-07-30 2010-05-17 애플 인크. Mode-based graphical user interfaces for touch sensitive input devices
WO2006055675A1 (en) * 2004-11-16 2006-05-26 Waters Investments Limited Device for performing separations and methods of making and using same
US8341541B2 (en) * 2005-01-18 2012-12-25 Microsoft Corporation System and method for visually browsing of open windows
US8819569B2 (en) * 2005-02-18 2014-08-26 Zumobi, Inc Single-handed approach for navigation of application tiles using panning and zooming


Also Published As

Publication number Publication date
CN101535940B (en) 2013-06-12
CN101529367B (en) 2016-02-17
CN101535938A (en) 2009-09-16
ES2361784T3 (en) 2011-06-22
CN101529368A (en) 2009-09-09
AU2022201622B2 (en) 2023-05-18
CN101356492A (en) 2009-01-28
CN101529874A (en) 2009-09-09
CN101535938B (en) 2013-05-08
AU2023216869A1 (en) 2023-09-07
CN101356492B (en) 2012-06-27
CN101535940A (en) 2009-09-16
CN101529367A (en) 2009-09-09
AU2022201622A1 (en) 2022-03-31
CN101356493A (en) 2009-01-28

Similar Documents

Publication Publication Date Title
CN101529368B (en) Methods for determining a cursor position from a finger contact with a touch screen display
CN101627361B (en) Portable multifunction device, method, and graphical user interface for interpreting a finger gesture on a touch screen display
CN101526881B (en) Text selection by gesture
CN101542424B (en) List scrolling and document translation, scaling, and rotation on touch-screen display
KR101085732B1 (en) Methods for determining a cursor position from a finger contact with a touch screen display
CN102541434B (en) Deletion gestures on a portable multifunction device
KR101152582B1 (en) Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US9207855B2 (en) Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
CN101627359B (en) System and method for moving lists on touch screen
EP2095214B1 (en) Portable multifunction device, method, and graphical user interface for adjusting an insertion point marker
CN102414755A (en) Device, method, and graphical user interface for editing an audio or video attachment in an electronic message

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant