US20120019453A1 - Motion continuation of touch input - Google Patents

Motion continuation of touch input

Info

Publication number
US20120019453A1
US20120019453A1 (application US12/891,655)
Authority
US
United States
Prior art keywords
motion
input
speed
decay rate
speed range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/891,655
Other languages
English (en)
Inventor
Wayne Carl Westerman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to US12/891,655 (published as US20120019453A1)
Assigned to APPLE INC. (ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: WESTERMAN, WAYNE CARL)
Priority to KR1020137004715A (published as KR20130024989A)
Priority to CN2011800415150A (published as CN103069379A)
Priority to AU2011282997A (published as AU2011282997B2)
Priority to JP2013521860A (published as JP2013532871A)
Priority to EP11741040.7A (published as EP2598977A1)
Priority to PCT/US2011/045109 (published as WO2012015701A1)
Publication of US20120019453A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • G06F 3/0446 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means using a grid-like structure of electrodes in at least two directions, e.g. using row and column electrodes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 Indexing scheme relating to G06F3/048
    • G06F 2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • This relates generally to motion continuation of touch input, and more particularly, to motion continuation based on a plurality of decay rates.
  • Touch screens are becoming increasingly popular because of their ease and versatility of operation as well as their declining price.
  • Touch screens can include a transparent touch sensor panel positioned in front of a display device such as a liquid crystal display (LCD), or an integrated touch screen in which touch sensing circuitry is partially or fully integrated into a display, etc.
  • Touch screens can allow a user to perform various functions by touching the touch screen using a finger, stylus or other object at a location that may be dictated by a user interface (UI) being displayed by the display device.
  • Touch screens can recognize a touch event and the position of the touch event on the touch sensor panel, and the computing system can then interpret the touch event in accordance with the display appearing at the time of the touch event, and thereafter can perform one or more actions based on the touch event.
  • Mutual capacitance touch sensor panels can be formed from a matrix of drive and sense lines of a substantially transparent conductive material such as Indium Tin Oxide (ITO), often arranged in rows and columns in horizontal and vertical directions on a substantially transparent substrate.
  • Drive signals can be transmitted through the drive lines, which can make it possible to measure the static mutual capacitance at the crossover points or adjacent areas (sensing pixels) of the drive lines and the sense lines.
  • The static mutual capacitance, and any changes to the static mutual capacitance due to a touch event, can be determined from sense signals that can be generated in the sense lines due to the drive signals.
  • A touch input may be performed, for example, by one or more contacts on or near a touch sensing surface of a computing system.
  • A touch input can include, for example, a cursor motion, a scrolling motion, a dragging motion, etc.
  • The motion of the input can be tracked based on the one or more contacts on or near the touch sensing surface.
  • The motion of the input can be continued when one or more of the contacts lifts off from the surface by, for example, determining the liftoff of the one or more of the contacts during the input, determining a speed at the liftoff, selecting, based on the speed at liftoff, one of a plurality of decay rates corresponding to a plurality of ranges of speed, and continuing the motion of the input based on the selected decay rate.
  • When the system determines that the continued motion reaches a next-lower range of speed, the decay rate can be reset based on the decay rate of the next-lower range of speed, and the motion can be continued based on the reset decay rate.
  • The plurality of ranges of speed can include a high-speed range, a low-speed range, and a medium-speed range in between the high-speed range and the low-speed range.
  • The decay rates corresponding to different ranges of speed can be selected such that, for example, the decay rate corresponding to the high-speed range is greater than the decay rate corresponding to the medium-speed range, and/or the decay rate corresponding to the low-speed range is greater than the decay rate corresponding to the medium-speed range. In this way, for example, motion continuation may be made more efficient by allowing easier visual tracking of an input motion.
  • FIGS. 1A-1D illustrate an example mobile telephone, an example digital media player, an example personal computer, and an example wireless trackpad that each include functionality according to embodiments of the disclosure.
  • FIG. 2 illustrates an example computer system that includes functionality according to embodiments of the disclosure.
  • FIG. 3 illustrates an example method of transitioning from an unspecified resting state according to embodiments of the disclosure.
  • FIG. 4 illustrates an example method of transitioning after a point input has been selected but not locked according to embodiments of the disclosure.
  • FIG. 5 illustrates one example method of transitioning from a locked point input according to embodiments of the disclosure.
  • FIG. 6 illustrates an example method of transitioning from a currently selected scroll input that is not locked according to embodiments of the disclosure.
  • FIG. 7 illustrates an example method of transitioning from a drag input according to embodiments of the disclosure.
  • FIGS. 8-9 illustrate an example drag continuation input according to embodiments of the disclosure.
  • FIG. 10 illustrates an example method of transitioning based on lifting and dropping a subset of fingers according to embodiments of the disclosure.
  • The disclosed example embodiments relate to continuing the motion of an input of a computing system based on a plurality of decay rates.
  • A user may perform a touch input, for example, by contacting a touch sensing surface of a computing system with one or more fingers. For example, the user may move a particular number of fingers across the touch sensing surface to move a cursor, scroll a document, select text with a dragging motion of a cursor, etc.
  • The motion of input of the user's fingers can be tracked based on the one or more finger contacts on the touch sensing surface.
  • The motion of the input can be continued when one or more of the contacts lifts off from the surface by, for example, determining the liftoff of the one or more of the contacts during the input, determining a speed at the liftoff, selecting, based on the speed at liftoff, one of a plurality of decay rates corresponding to a plurality of ranges of speed, and continuing the motion of the input based on the selected decay rate.
  • When the system determines that the continued motion reaches a next-lower range of speed, the decay rate can be reset based on the decay rate of the next-lower range of speed, and the motion can be continued based on the reset decay rate.
  • The plurality of ranges of speed can include a high-speed range, a low-speed range, and a medium-speed range in between the high-speed range and the low-speed range.
  • A high decay rate can be selected for the high-speed range, such that motion continuation initiated in the high-speed range decays quickly. In this way, for example, motion continuation that is too fast for a user to follow can quickly be slowed to a more reasonable speed, e.g., the medium-speed range.
  • A lower decay rate can be selected for the medium-speed range, such that once the continued motion reaches the medium-speed range, the motion decays more slowly. In this way, for example, a motion continuation can be maintained in the medium-speed range, e.g., at a speed the user can more easily track visually, for a longer period of time.
  • For the low-speed range, the decay rate can be set so that the motion decays quickly, so that the continued motion does not remain at a slow speed for too long before coming to a stop.
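The multi-rate decay described in the bullets above can be sketched as follows. The speed boundaries, per-range decay constants, and stop threshold here are illustrative assumptions, not values from the disclosure; only the shape of the algorithm (select a rate from the liftoff speed's range, reset it when the motion falls into the next-lower range) follows the description.

```python
import math

# Illustrative speed ranges (arbitrary units/s) and per-range decay rates:
# the high- and low-speed ranges decay faster than the medium-speed range.
SPEED_RANGES = [
    ("low",    0.0,    200.0,        5.0),  # quick decay to a full stop
    ("medium", 200.0,  1500.0,       0.8),  # slow decay, easy to track visually
    ("high",   1500.0, float("inf"), 4.0),  # quick decay down toward medium speeds
]

def select_range(speed):
    """Return the (name, decay_rate) of the range containing `speed`."""
    for name, lo, hi, rate in SPEED_RANGES:
        if lo <= speed < hi:
            return name, rate
    raise ValueError(f"negative speed: {speed}")

def continue_motion(liftoff_speed, dt=1.0 / 60.0, stop_speed=1.0):
    """Yield continued-motion speeds after liftoff. The decay rate is
    selected from the range of the liftoff speed and is reset whenever
    the continued motion crosses into the next-lower range."""
    speed = liftoff_speed
    name, rate = select_range(speed)
    while speed > stop_speed:
        speed *= math.exp(-rate * dt)         # exponential decay per frame
        new_name, new_rate = select_range(speed)
        if new_name != name:                  # reached the next-lower range:
            name, rate = new_name, new_rate   # reset the decay rate
        yield speed
```

Under these assumed constants, a liftoff in the high-speed range decays quickly until the medium range is reached, coasts there at the lower rate, and then decays quickly again through the low-speed range to a stop.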
  • Although embodiments may be described and illustrated herein in terms of mutual capacitance touch sensing surfaces, it should be understood that the embodiments are not so limited, but can additionally be applicable to, for example, self-capacitance, optical, resistive, and other touch sensing surfaces and technologies that can detect single and/or multiple touches on or near the surface.
  • FIGS. 1A-1D show example systems in which embodiments of the disclosure may be implemented.
  • FIG. 1A illustrates an example mobile telephone 136 with a touch screen 124.
  • FIG. 1B illustrates an example digital media player 140 with a touch screen 126.
  • FIG. 1C illustrates an example personal computer 144 with a touch screen 128 and a trackpad 130.
  • FIG. 1D illustrates an example wireless trackpad 150, which can be wirelessly connected to a personal computer, such as personal computer 144, for example.
  • FIG. 2 is a block diagram of an example computing system 200 that illustrates one implementation of an example touch screen 220 according to embodiments of the disclosure.
  • Computing system 200 could be included in, for example, mobile telephone 136, digital media player 140, personal computer 144, or any mobile or non-mobile computing device that includes a touch screen.
  • A similar computing system, with similar touch sensing functionality but without the need for display functionality, can be included in, for example, trackpad 150.
  • Computing system 200 can include a touch sensing system including one or more touch processors 202, peripherals 204, a touch controller 206, and touch sensing circuitry.
  • Peripherals 204 can include, but are not limited to, random access memory (RAM) or other types of memory or storage, watchdog timers, and the like.
  • Touch controller 206 can include, but is not limited to, one or more sense channels 208, channel scan logic 210, and driver logic 214.
  • Channel scan logic 210 can access RAM 212, autonomously read data from the sense channels, and provide control for the sense channels.
  • Channel scan logic 210 can also control driver logic 214 to generate stimulation signals 216 at various frequencies and phases that can be selectively applied to drive regions of the touch sensing circuitry of touch screen 220.
  • Touch controller 206, touch processor 202, and peripherals 204 can be integrated into a single application specific integrated circuit (ASIC).
  • Touch screen 220 can include touch sensing circuitry that can include a capacitive sensing medium having a plurality of drive lines 222 and a plurality of sense lines 223.
  • Drive lines 222 can be driven by stimulation signals 216 from driver logic 214 through a drive interface 224, and resulting sense signals 217 generated in sense lines 223 can be transmitted through a sense interface 225 to sense channels 208 (also referred to as an event detection and demodulation circuit) in touch controller 206.
  • The drive lines and sense lines can be part of the touch sensing circuitry that can interact to form capacitive sensing nodes, which can be thought of as touch picture elements (touch pixels), such as touch pixels 226 and 227.
  • In this way, touch screen 220 can be viewed as capturing an “image” of touch: after touch controller 206 has determined whether a touch has been detected at each touch pixel in the touch screen, the pattern of touch pixels at which a touch occurred can be thought of as an “image” of touch (e.g., a pattern of fingers touching the touch screen).
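The “image” of touch can be illustrated with a small sketch; the grid of capacitance deltas and the detection threshold below are hypothetical values, not taken from the disclosure.

```python
TOUCH_THRESHOLD = 10  # hypothetical minimum capacitance change for a touch

def touch_image(cap_deltas):
    """Threshold per-pixel mutual-capacitance changes into a binary
    "image" of touch (1 = touch detected at that touch pixel)."""
    return [[1 if delta >= TOUCH_THRESHOLD else 0 for delta in row]
            for row in cap_deltas]

# Hypothetical sensed deltas with a fingertip over four touch pixels:
deltas = [
    [0,  2,  1, 0],
    [1, 25, 30, 2],
    [0, 28, 26, 1],
    [0,  3,  2, 0],
]
```

Here `touch_image(deltas)` yields a binary pattern with a 2x2 block of 1s where the fingertip contacts, which is the kind of touch image later stages can interpret as contacts.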
  • Computing system 200 can also include a host processor 228 for receiving outputs from touch processor 202 and performing actions based on the outputs.
  • Host processor 228 can be connected to program storage 232 and a display controller, such as an LCD driver 234.
  • Host processor 228 can use LCD driver 234 to generate an image on touch screen 220, such as an image of a user interface (UI), and can use touch processor 202 and touch controller 206 to detect a touch on or near touch screen 220, such as a touch input to the displayed UI.
  • The touch input can be used by computer programs stored in program storage 232 to perform actions that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like.
  • Host processor 228 can also perform additional functions that may not be related to touch processing.
  • Computing system 200 can allow a user to enter inputs by, for example, tapping, sliding, etc., one or more touch devices, such as fingers, thumbs, etc., on a touch sensing surface, such as touch screen 220 .
  • A particular input may be selected, for example, based on a number of contacts on or near the touch sensing surface and a motion of the contacts.
  • One finger down on the touch sensing surface and moving may correspond to a point input that can cause a mouse cursor to move in the direction of the one-finger motion.
  • Two fingers down on the touch sensing surface and moving may correspond to a scroll input that can cause a document displayed on a touch screen or display to scroll in the direction of the two-finger motion.
  • To switch between inputs, some systems may require that the user lift all fingers from the touch surface and then drop the number of fingers required for the new input. In other words, some systems may simply retain the currently selected input, even though the number of fingers changes, until all fingers are lifted off. For example, to switch from a two-finger scroll input to a one-finger point input, a system may require that the user lift the two fingers and drop one finger back down. In such a system, the scroll input can remain selected even after lifting one of the two fingers. On the other hand, some systems may simply select the input that matches the current number of fingers down. In these systems, for example, each new finger liftoff/touchdown can cause the selected input to switch to the one corresponding to the number of fingers down. In other words, these two kinds of systems either always allow or never allow switching between inputs while some fingers remain down.
  • As used herein, a “finger” can include a finger or a thumb, unless otherwise noted.
  • Example embodiments are described below using an example set of inputs that correspond to various combinations of contact numbers/arrangements and activities, as summarized in Table 1.
  • For each combination of contact number/arrangement and activity (also referred to herein as a “base gesture”), a corresponding input can be made to a computing system, such as computing system 200.
  • The touch system can enter a state of “no current input” when no touches are detected by the touch system, e.g., the user is not touching the touch surface. While in the no input state, if one of the base gestures is detected, its corresponding input can be selected without requiring further evaluation.
  • A user can select one of the inputs in Table 1 directly by lifting all fingers, dropping the number of fingers corresponding to the desired input, and performing the corresponding activity (e.g., tapping, motion). The user can lift all fingers when the desired input is complete, returning the system to the no input state.
  • The example touch system can allow for more complex interactions that can include lifting, dropping, resting, and/or moving one or more additional fingers while a current input is selected.
  • The touch system can decide whether or not to switch the currently selected input to a new input based on touch information determined from various characteristics of the contacts, such as the number of contacts down, the number of contacts lifted off, motion and/or resting of all or a subset of the contacts, the arrangement of the contacts, and whether contacts are being added or removed, as well as other information, such as the currently selected input, the input to be selected, and whether the input has been locked.
  • FIGS. 3-10 illustrate example methods of determining when to allow and when to prevent switching from a currently selected input/state to a new input/state.
  • The example methods described below do not necessarily cover all possible switching scenarios that could occur, but provide examples in which various touch information and other information can be compared to predefined criteria to determine whether or not to switch inputs.
  • FIG. 3 illustrates an example method of transitioning from an unspecified resting state.
  • An unspecified resting state can be, for example, a state in which all fingers are resting and no input has been selected.
  • The unspecified resting state can be entered by, for example, dropping one or more fingers onto the touch surface while keeping the fingers substantially stationary on the surface. While in the resting state, certain transitions to selected inputs can be made, while other transitions may be prevented.
  • Starting from unspecified resting state 301, if one-finger motion is detected (303), then the number of fingers down (i.e., the number of fingers currently touching the touch surface) can be counted (305).
  • Depending on the count, the unspecified resting state can be maintained (307) or a point input can be selected (309).
  • The determination can be based on touch information such as the total number of fingers down and whether the other fingers down are in a resting state (e.g., substantially stationary).
  • The determination can also be based on other information, such as whether the current state is an unspecified resting state.
  • If two-finger motion is detected, the number of fingers down can be determined (313), and if the number of fingers down equals three, then the unspecified resting state can be maintained (315). On the other hand, if the number of fingers down is greater than three, then a scroll input can be selected (317). In other words, starting from an unspecified resting state, a user can move two fingers and initiate a scroll input so long as at least four fingers are down.
  • If five-finger motion is detected, a pointing input can be selected (323).
  • Otherwise, the unspecified resting state can be maintained (321). In other words, starting from the unspecified resting state, the user can move all five fingers to initiate a pointing input. It is noted that neither three-finger nor four-finger motion can initiate input from the unspecified resting state. That is, if the user moves three or four fingers, the unspecified resting state is maintained. On the other hand, if the user moves one, two, or five fingers, a new input state can be selected.
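The transitions above could be summarized as a sketch. The two-, three-, four-, and five-finger cases follow the description directly; the one-finger case is simplified here (the disclosure also gates it on the total finger count, which is elided), and the label strings are illustrative.

```python
def transition_from_resting(moving_fingers, fingers_down):
    """Sketch of FIG. 3: which input (if any) is selected from the
    unspecified resting state. `moving_fingers` is how many fingers
    are in motion; `fingers_down` is the total touching the surface."""
    if moving_fingers == 2:
        # Two-finger motion selects scroll only with at least four fingers down.
        return "scroll" if fingers_down >= 4 else "resting"
    if moving_fingers == 5:
        return "point"            # moving all five fingers initiates pointing
    if moving_fingers in (3, 4):
        return "resting"          # three- or four-finger motion is ignored
    if moving_fingers == 1:
        return "point"            # simplified; the disclosure checks the count
    return "resting"
```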
  • Once an input has been selected, a subsequent resting of the fingers will not re-enter the unspecified resting state, unless there is a liftoff of all fingers and a subsequent touchdown in the resting state. In other words, the currently selected input will remain selected even though fingers subsequently rest.
  • FIG. 4 illustrates an example method of transitioning after a point input has been selected but not locked. Locking point input is described in more detail below.
  • Starting from point input selected 401, if the number of fingers down is one (403), then the user can be pointing with a single finger.
  • Other events that can occur when the user is pointing with a single finger down can include additional finger drops (405). If additional fingers are not dropped, the point input can be maintained (407). On the other hand, if additional fingers drop during single-finger pointing, then further testing can take place to determine whether to switch inputs. If four fingers drop (409), then pointing input can be maintained (407).
  • The touch system can include a pair of autoregressive filters: a slow filter, which can provide an indication of the average speed of contact motion over a longer period of time, and a fast filter, which can provide an indication of the average speed of contact motion over a shorter period of time. Comparing the outputs of the two filters can provide an indication of whether the contact motion is decelerating, for example.
  • A rolling stop can be determined if the output of the fast filter falls below a predetermined fraction of the output of the slow filter, for example.
  • The system can determine that an additional finger drop occurred at a rolling stop of the single finger if the drop occurred within a predetermined time before or after a rolling stop of the single-finger motion, as determined by comparison of the autoregressive filters, for example.
  • An additional finger drop of a certain number of fingers can also be allowed to select a new input if the drop occurs within, for example, 125 milliseconds of the initial touchdown of the single finger. If the additional finger drop meets neither criterion, i.e., the first finger is not at a rolling stop and the drop is not soon after initial touchdown, then the point input selection can be maintained (407).
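The filter pair described above might be sketched with two exponential moving averages of contact speed; the smoothing coefficients and the stop fraction below are illustrative assumptions, not values from the disclosure.

```python
class RollingStopDetector:
    """Rolling-stop detection from a slow and a fast autoregressive
    (exponential moving average) speed filter, per the description above."""

    def __init__(self, slow_alpha=0.05, fast_alpha=0.4, stop_fraction=0.5):
        self.slow_alpha = slow_alpha      # long-horizon average (small alpha)
        self.fast_alpha = fast_alpha      # short-horizon average (large alpha)
        self.stop_fraction = stop_fraction
        self.slow = None
        self.fast = None

    def update(self, speed):
        """Feed the latest contact speed sample; return True at a rolling stop."""
        if self.slow is None:
            self.slow = self.fast = speed
        else:
            self.slow += self.slow_alpha * (speed - self.slow)
            self.fast += self.fast_alpha * (speed - self.fast)
        # Rolling stop: recent speed has fallen well below the long-term average.
        return self.slow > 0 and self.fast < self.stop_fraction * self.slow
```

During steady motion the two averages agree and no stop is reported; during a sharp deceleration the fast filter drops below the chosen fraction of the slow filter, which is the rolling-stop condition.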
  • If three additional fingers drop, then a four-finger input can be selected (415). If two additional fingers drop (417), then a three-finger input (e.g., swipe or drag) can be selected (419). If a one-finger tap is detected (421), then a one-finger tap input can be selected (423). If a one-finger drop is detected and the dropped finger remains down (425), then a scroll input can be selected (427). Otherwise, the point input can remain selected (407).
  • Thus, the user can change the selected input by dropping one, two, or three fingers, so long as the one finger down comes to a rolling stop before dropping the additional fingers. Additional finger drops of four fingers, and finger drops that do not occur at a rolling stop or soon after initial single-finger touchdown, may not switch the input from the selected pointing input.
  • Here, the determination of whether or not to select a new input can be based on touch information such as the total number of fingers down, the number of additional fingers dropped, and whether the single finger is at a rolling stop.
  • The user can also select another input by moving more than one finger.
  • Again, the touch system can require that certain criteria be satisfied in order for a new input to be selected. If one or zero fingers are moving (429), then pointing input can be maintained (407). On the other hand, if more than one finger is moving (429), then it can be determined whether all of the fingers currently down are in motion (431). If all fingers down are not moving, then the point input can be locked (433). In other words, if point input is currently selected, and more than one but not all of the currently down fingers are moving, then the selection of point input can be locked. Locked point input is described in more detail with reference to FIG. 5.
  • If a four-finger vertical motion is detected, a view all windows input can be selected (445). If a four-finger horizontal motion is detected (447), then an application-switching input can be selected (449). Otherwise, the pointing input can be locked (433).
  • Thus, the user can continue pointing by moving just one finger.
  • Alternatively, the user can switch to another input by making certain multi-finger motions that meet certain criteria. Specifically, if the user is pointing with one of three fingers down, comes to a rolling stop with the single finger, and initiates a three-finger motion, then a three-finger input can be selected.
  • Similarly, the user can enter one of the four-finger inputs by coming to a rolling stop with the single finger and either initiating a four-finger vertical motion to switch to a view all windows input or initiating a four-finger horizontal motion to switch to an application switching input. All other multi-finger motion while pointing is selected can lock the pointing input selection.
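The finger-drop half of this FIG. 4 logic might be sketched as follows. The 125 ms drop window comes from the description; the label strings and the function shape itself are illustrative assumptions.

```python
def point_input_after_drop(dropped, at_rolling_stop, since_touchdown_ms,
                           tapped=False, drop_window_ms=125):
    """Sketch of FIG. 4 finger drops: which input follows an additional
    finger drop while one-finger point input is selected. A drop is
    "timely" at a rolling stop or soon after the initial touchdown."""
    timely = at_rolling_stop or since_touchdown_ms <= drop_window_ms
    if dropped == 4 or not timely:
        return "point"                    # maintain point input (407)
    if dropped == 3:
        return "four_finger"              # four-finger input selected (415)
    if dropped == 2:
        return "three_finger"             # e.g., swipe or drag (419)
    if dropped == 1:
        # A one-finger tap selects tap input (423); a finger that stays
        # down selects scroll (427).
        return "one_finger_tap" if tapped else "scroll"
    return "point"
```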
  • Here, the determination of whether or not to select a new input can be based on touch information such as the total number of fingers down, whether all or only a subset of the fingers move, and whether the fingers move at substantially the same time.
  • The determination can also be based on other information, such as whether the currently selected input is locked, as will now be described in more detail.
  • FIG. 5 illustrates one example method of transitioning from a locked point input.
  • A user can unlock the point input selection and select another input by lifting and tapping more than one finger (503) or by lifting and touching down more than one finger (505).
  • In either case, the point selection can be unlocked (507), and the input corresponding to the number of fingers in the lift-and-tap or lift-and-touchdown can be selected (509).
  • Another way in which a user can unlock the point selection can be to lift all but one finger (511).
  • In that case, the point input can be unlocked (513), and as long as the one finger remains touching, the selection of pointing input can be maintained (515) in an unlocked state. Otherwise, the selected input can remain locked in point input (517).
  • Thus, the user can point using a wide range of finger combinations and motions. This can allow the user freedom in pointing, which can be a common task. While maintaining at least one finger down, the user can still change inputs by initiating lifts and taps or lifts and touchdowns of a subset of the total number of touched-down fingers. In addition, the user can simply lift all but one finger to unlock the selection of pointing, and therefore may select other inputs through additional actions once only one finger remains on the surface.
  • FIG. 6 illustrates an example method of transitioning from a currently selected scroll input that is not locked.
  • the scroll input selection 601 if the number of fingers down equals two ( 603 ) then certain inputs may be selected by dropping additional fingers. If the user drops additional fingers ( 605 ), and the number of additional fingers dropped equals three ( 607 ), then the scroll input can be maintained ( 609 ). However, if the user drops one or two additional fingers, then an input may be changed if the additional finger drop occurs at a rolling stop of the two scrolling fingers or if the additional fingers drop soon after the touchdown of the original two scrolling fingers ( 611 ).
  • the scroll input can be maintained ( 609 ). However, if the two scrolling fingers are at a rolling stop, or touchdown soon before, a drop of an additional one finger ( 613 ), then selection of a new input can depend on whether the user has set the three-finger input to a swipe input or a drag input. If the user has set the three-finger input to a swipe input ( 615 ) then the swipe input can be selected ( 617 ). However, if the user has selected the three-finger input as drag input, then the scroll input can be maintained ( 609 ), i.e., the system can prevent switching to drag input.
  • the determination of whether or not to switch to a new input can depend on information such as the function of the new input, e.g., given the same touch information, such as number of contacts, motion, etc., switching to a new input a may depend on the function a user has chosen to correspond to a particular base gesture, for example.
  • For example, a user may typically use quick motions for swipe inputs and slower motions for drag inputs. Allowing a user to switch to a swipe input at a rolling stop of a two-finger scroll may therefore better match the typically quick swipe motion than the typically slower drag motion.
  • Using different information, such as touch information and other information, to determine whether or not to allow switching to a new input can make the design of a touch sensing system more natural and easy to use.
  • A four-finger input can be selected ( 621 ).
  • The four-finger input can depend, for example, on the particular direction of motion of the four fingers.
  • Additional motion testing may be required to generate the four-finger input, e.g., a view-all-windows input or an application-switching input.
  • Dropping an additional two fingers can switch to a new input event regardless of the function of the input event being switched to.
  • A scroll input selection in this example embodiment does not include additional ways to transition to different inputs using motion of additional fingers alone, without requiring additional finger drops. This may be because a user performing a two-finger scroll is more likely to move additional fingers unintentionally than a user performing a one-finger point, for example.
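As a rough illustration, the FIG. 6 decision flow above might be expressed as a small decision function. Everything here, the function name, the boolean flags, and the string labels, is a hypothetical sketch, not code from the disclosure:

```python
def scroll_transition(additional_dropped, at_rolling_stop,
                      soon_after_touchdown, three_finger_setting="swipe"):
    """Next input after extra fingers drop during a two-finger scroll.

    Hypothetical sketch of the FIG. 6 flow; argument names are invented.
    """
    # Dropping three additional fingers (607) keeps the scroll (609).
    if additional_dropped == 3:
        return "scroll"
    # One- or two-finger drops only change input at a rolling stop or
    # soon after the original touchdown (611).
    eligible = at_rolling_stop or soon_after_touchdown
    if additional_dropped == 1 and eligible:
        # Swipe is allowed (615/617); switching to drag is prevented (609).
        return "swipe" if three_finger_setting == "swipe" else "scroll"
    if additional_dropped == 2 and eligible:
        return "four-finger"  # 621
    return "scroll"  # 609
```

Note how the user-chosen three-finger function, not just the touch data, decides whether the transition is taken, mirroring the swipe/drag distinction above.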
  • FIG. 7 illustrates an example method of transitioning from a drag input according to embodiments of the disclosure.
  • Starting from a selected drag input, the system can determine whether the number of fingers down equals three ( 703 ). If not, then if all fingers but one lift off ( 705 ), the input can be switched to point input ( 707 ); otherwise, the drag input can be maintained ( 709 ). If the number of fingers down equals three ( 703 ), then if a two-finger liftoff occurs ( 711 ) and the three-finger drag is not at a rolling stop ( 713 ), a drag continuation input can be selected ( 715 ).
  • If the three-finger drag is at a rolling stop ( 713 ) when the two fingers lift off, the input can be switched to point input ( 717 ). If two fingers do not lift off ( 711 ), the drag input can be maintained ( 709 ).
  • A drag continuation input can thus be selected by lifting off two of the three fingers while not at a rolling stop.
  • The drag continuation input is described in more detail with regard to FIGS. 8-9 .
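The FIG. 7 branching can be sketched as a small decision function; the function name, argument names, and string labels below are illustrative, not taken from the disclosure:

```python
def drag_transition(fingers_down, fingers_lifted, at_rolling_stop):
    """Next input when leaving a three-finger drag.

    Hypothetical sketch of the FIG. 7 flow; names are illustrative.
    """
    if fingers_down == 3:
        if fingers_lifted == 2:                  # two-finger liftoff (711)
            if not at_rolling_stop:              # not at a rolling stop (713)
                return "drag continuation"       # 715
            return "point"                       # 717
        return "drag"                            # maintain drag (709)
    if fingers_down - fingers_lifted == 1:       # all but one lift off (705)
        return "point"                           # 707
    return "drag"                                # 709
```

The key design point is that the same two-finger liftoff produces different results depending on whether the drag was still moving: mid-motion it continues the drag, while at a rolling stop it falls back to pointing.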
  • FIGS. 8-9 illustrate an example drag continuation input according to embodiments of the disclosure.
  • The velocity of the two-finger subset lifting off during a three-finger drag is obtained ( 801 ).
  • An initial decay rate is determined and set ( 803 ) based on the liftoff velocity.
  • FIG. 9 illustrates an example graph showing three velocity ranges, a high-speed range 901 , a medium-speed range 903 , and a low-speed range 905 , corresponding to a first decay rate 907 , a second decay rate 909 , and a third decay rate 911 .
  • If the liftoff velocity falls within high-speed range 901 , first decay rate 907 can be selected ( 803 ) as the initial decay rate, and motion of the drag input can initially continue at the liftoff velocity and subsequently be reduced based on the initial decay rate ( 805 ).
  • When the decaying speed enters the next-lower range, the decay rate can be reset to the decay rate of that range ( 807 ).
  • For example, when the continued motion slows into medium-speed range 903 , second decay rate 909 can be selected and the continued drag motion can be reduced based on the second decay rate.
  • Likewise, third decay rate 911 can be selected when the decaying drag continuation motion reaches low-speed range 905 .
  • The ranges and associated decay rates can be selected such that motion continuation initiated in a high-speed range, which may be too fast for a typical user's vision to easily track, decays quickly.
  • In this way, motion continuation that is too fast for a user to follow can quickly be slowed to a more reasonable speed, e.g., into the medium-speed range.
  • In the medium-speed range, the decay rate can be set so that the motion decays more slowly. In this way, for example, a relatively fast motion continuation can be maintained at a speed the user can visually track for a longer period of time.
  • In the low-speed range, the decay rate can be set so that the motion decays quickly, so that the continued motion does not linger at a slow speed for too long before coming to a stop.
  • For example, the ranges and rates of decay can be set such that a high-speed continued motion comes to a stop within a predetermined distance or time, regardless of the exact liftoff velocity within the high-speed range.
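A minimal simulation of this range-based decay might look like the following. The decay rate is modeled as a per-step speed multiplier, and the numeric range boundaries and decay factors are invented placeholders, since the disclosure gives no values:

```python
# Hypothetical speed ranges (lower bounds, in px/s) and per-step decay
# multipliers, chosen only to show the shape of the behavior.
RANGES = [
    (1000.0, 0.80),  # high-speed range 901: decay quickly (first rate 907)
    (300.0, 0.97),   # medium-speed range 903: decay slowly (second rate 909)
    (0.0, 0.85),     # low-speed range 905: decay quickly (third rate 911)
]

def decay_factor(speed):
    """Select the decay multiplier for the range containing `speed` (803/807)."""
    for lower_bound, factor in RANGES:
        if speed >= lower_bound:
            return factor
    return RANGES[-1][1]

def continue_motion(liftoff_speed, max_steps=1000, stop_below=1.0):
    """Continue motion at the liftoff speed, re-selecting the decay rate
    each time the decaying speed crosses into a lower range (805/807)."""
    speed, trace = liftoff_speed, []
    for _ in range(max_steps):
        if speed < stop_below:
            break
        trace.append(speed)
        speed *= decay_factor(speed)
    return trace
```

With these placeholder values, a liftoff at 2000 px/s loses 20% of its speed per step until it drops below 1000 px/s, then only 3% per step while the user can visually track it, then 15% per step until it stops.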
  • FIG. 10 illustrates an example method of transitioning based on lifting and dropping a subset of fingers according to embodiments of the disclosure.
  • In this example, the thumb can be considered a finger.
  • In other embodiments, the touch system can determine that a contact is a thumb and can disregard input from the thumb, or can allow the user to access specialized gestures involving the thumb.
  • FIG. 10 illustrates that a user can switch to base gesture input while resting with four or five fingers, even when a current input is selected, that is, when the four or five fingers down come to a rest, such as with a rolling stop.
  • The process can also apply in a four- or five-finger unspecified resting state.
  • From the resting state, the user can lift and tap one finger ( 1005 ) to switch to a one-finger tap input ( 1007 ); lift and tap two fingers ( 1009 ) to switch to a two-finger tap input ( 1011 ); lift, drop, and move two fingers ( 1013 ) to switch to a locked scroll input ( 1015 ); lift, drop, and move three fingers ( 1017 ) to switch to a three-finger input ( 1019 ); or, when five fingers are resting, lift, drop, and move four fingers ( 1021 ) to switch to a four-finger input ( 1023 ). Otherwise, the currently selected input can be maintained ( 1025 ).
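The lift-and-return transitions of FIG. 10 can be summarized as a lookup table. The tuple keys, action strings, and input labels below are invented for this sketch:

```python
# Hypothetical mapping from (fingers in the subset, action) to the new
# input; keys and labels are illustrative, not from the disclosure.
RESTING_TRANSITIONS = {
    (1, "lift-and-tap"): "one-finger tap",        # 1005 -> 1007
    (2, "lift-and-tap"): "two-finger tap",        # 1009 -> 1011
    (2, "lift-drop-move"): "locked scroll",       # 1013 -> 1015
    (3, "lift-drop-move"): "three-finger input",  # 1017 -> 1019
    (4, "lift-drop-move"): "four-finger input",   # 1021 -> 1023
}

def resting_transition(fingers_resting, subset, action, current="current input"):
    """Return the next input for a subset action from a four- or
    five-finger rest; unmatched actions keep the current input (1025)."""
    # A four-finger lift-drop-move is only available from a five-finger rest.
    if (subset, action) == (4, "lift-drop-move") and fingers_resting < 5:
        return current
    return RESTING_TRANSITIONS.get((subset, action), current)
```

The fallback to the currently selected input models the "else" branch ( 1025 ), so unrecognized subset actions leave the selection unchanged.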


Priority Applications (7)

Application Number Priority Date Filing Date Title
US12/891,655 US20120019453A1 (en) 2010-07-26 2010-09-27 Motion continuation of touch input
KR1020137004715A KR20130024989A (ko) 2010-07-26 2011-07-22 터치 입력의 움직임 계속
CN2011800415150A CN103069379A (zh) 2010-07-26 2011-07-22 触摸输入的运动持续
AU2011282997A AU2011282997B2 (en) 2010-07-26 2011-07-22 Motion continuation of touch input
JP2013521860A JP2013532871A (ja) 2010-07-26 2011-07-22 タッチ入力の動き継続
EP11741040.7A EP2598977A1 (fr) 2010-07-26 2011-07-22 Continuation de mouvement d'entrée tactile
PCT/US2011/045109 WO2012015701A1 (fr) 2010-07-26 2011-07-22 Continuation de mouvement d'entrée tactile

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US36786010P 2010-07-26 2010-07-26
US12/891,655 US20120019453A1 (en) 2010-07-26 2010-09-27 Motion continuation of touch input

Publications (1)

Publication Number Publication Date
US20120019453A1 true US20120019453A1 (en) 2012-01-26

Family

ID=45493183

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/891,635 Active 2033-06-30 US8922499B2 (en) 2010-07-26 2010-09-27 Touch input transitions
US12/891,655 Abandoned US20120019453A1 (en) 2010-07-26 2010-09-27 Motion continuation of touch input
US13/251,073 Active 2032-12-21 US9310995B2 (en) 2010-07-26 2011-09-30 Touch input transitions

Country Status (7)

Country Link
US (3) US8922499B2 (fr)
EP (2) EP2476046B1 (fr)
JP (2) JP2013532871A (fr)
KR (2) KR20130024989A (fr)
CN (4) CN103069379A (fr)
AU (2) AU2011283001B2 (fr)
WO (2) WO2012015701A1 (fr)

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014507726A (ja) * 2011-02-08 2014-03-27 ハワース, インコーポレイテッド マルチモーダルタッチスクリーン対話装置、方法、及び、システム
US9465434B2 (en) 2011-05-23 2016-10-11 Haworth, Inc. Toolbar dynamics for digital whiteboard
EP2715490B1 (fr) 2011-05-23 2018-07-11 Haworth, Inc. Appareils, procédés et systèmes de collaboration de tableau blanc numérique
US20140055400A1 (en) 2011-05-23 2014-02-27 Haworth, Inc. Digital workspace ergonomics apparatuses, methods and systems
US9471192B2 (en) 2011-05-23 2016-10-18 Haworth, Inc. Region dynamics for digital whiteboard
DE112011105305T5 (de) * 2011-06-03 2014-03-13 Google, Inc. Gesten zur Textauswahl
US9395901B2 (en) * 2012-02-08 2016-07-19 Blackberry Limited Portable electronic device and method of controlling same
US20130257742A1 (en) * 2012-03-28 2013-10-03 Google Inc. Method and System for Controlling Imagery Panning Based on Displayed Content
JP5663519B2 (ja) * 2012-04-10 2015-02-04 京セラドキュメントソリューションズ株式会社 表示入力装置および画像形成装置
US9479548B2 (en) 2012-05-23 2016-10-25 Haworth, Inc. Collaboration system with whiteboard access to global collaboration data
US9479549B2 (en) 2012-05-23 2016-10-25 Haworth, Inc. Collaboration system with whiteboard with federated display
CN103529976B (zh) * 2012-07-02 2017-09-12 英特尔公司 手势识别系统中的干扰消除
JP6188288B2 (ja) * 2012-07-20 2017-08-30 キヤノン株式会社 情報処理装置及びその制御方法
US10304037B2 (en) 2013-02-04 2019-05-28 Haworth, Inc. Collaboration system including a spatial event map
US11861561B2 (en) 2013-02-04 2024-01-02 Haworth, Inc. Collaboration system including a spatial event map
JP6221265B2 (ja) * 2013-03-04 2017-11-01 株式会社デンソー タッチパネル操作装置及びタッチパネル操作装置における操作イベント判定方法
US9785240B2 (en) * 2013-03-18 2017-10-10 Fuji Xerox Co., Ltd. Systems and methods for content-aware selection
US9477331B2 (en) 2013-06-07 2016-10-25 Apple Inc. Touch detection at bezel edge
US9330545B2 (en) * 2013-07-17 2016-05-03 Google Inc. Determining input received via tactile input device
JP6098435B2 (ja) * 2013-08-22 2017-03-22 ソニー株式会社 情報処理装置、記憶媒体、および制御方法
US9207794B2 (en) * 2013-12-30 2015-12-08 Google Inc. Disambiguation of user intent on a touchscreen keyboard
KR20150083378A (ko) * 2014-01-09 2015-07-17 삼성전기주식회사 멀티 터치 정보를 이용한 제스쳐 인식 장치 및 제스쳐 인식 방법
KR20150102589A (ko) * 2014-02-28 2015-09-07 삼성메디슨 주식회사 의료 영상 처리 장치, 의료 영상 처리 방법, 및 컴퓨터 판독가능 기록매체
JP2016001682A (ja) * 2014-06-12 2016-01-07 ソニー株式会社 固体撮像装置およびその製造方法、並びに電子機器
JP2016038876A (ja) * 2014-08-11 2016-03-22 カシオ計算機株式会社 画像入力装置、画像出力装置及び画像入出力システム
US10747426B2 (en) * 2014-09-01 2020-08-18 Typyn, Inc. Software for keyboard-less typing based upon gestures
DE102014019040B4 (de) * 2014-12-18 2021-01-14 Audi Ag Verfahren zum Betreiben einer Bedienvorrichtung eines Kraftfahrzeugs bei einer Mehrfingerbedienung
EP3292524B1 (fr) 2015-05-06 2020-07-08 Haworth, Inc. Mode de suivi d'espaces de travail virtuels dans des systèmes de collaboration
CN104932695B (zh) * 2015-06-29 2018-06-01 联想(北京)有限公司 信息输入装置及信息输入方法
EP3130998A1 (fr) * 2015-08-11 2017-02-15 Advanced Digital Broadcast S.A. Procédé et système permettant de commander une interface utilisateur à écran tactile
US20170052631A1 (en) * 2015-08-20 2017-02-23 Futurewei Technologies, Inc. System and Method for Double Knuckle Touch Screen Control
US10255023B2 (en) 2016-02-12 2019-04-09 Haworth, Inc. Collaborative electronic whiteboard publication process
US10163245B2 (en) 2016-03-25 2018-12-25 Microsoft Technology Licensing, Llc Multi-mode animation system
CN106227454B (zh) * 2016-07-27 2019-10-25 努比亚技术有限公司 一种触控轨迹检测系统及方法
US12019850B2 (en) 2017-10-23 2024-06-25 Haworth, Inc. Collaboration system including markers identifying multiple canvases in multiple shared virtual workspaces
US11934637B2 (en) 2017-10-23 2024-03-19 Haworth, Inc. Collaboration system including markers identifying multiple canvases in multiple shared virtual workspaces
US11126325B2 (en) 2017-10-23 2021-09-21 Haworth, Inc. Virtual workspace including shared viewport markers in a collaboration system
JP7022846B2 (ja) * 2018-05-07 2022-02-18 アップル インコーポレイテッド ユーザインタフェース間でのナビゲーション、ドックの表示、及びシステムユーザインタフェース要素の表示のためのデバイス、方法、及びグラフィカルユーザインタフェース
WO2020176517A1 (fr) 2019-02-25 2020-09-03 Haworth, Inc. Flux de travail basés sur un geste dans un système de collaboration
CN111625174B (zh) * 2020-05-06 2022-01-04 Oppo(重庆)智能科技有限公司 触摸屏控制方法及装置、电子设备、存储介质
US11212127B2 (en) 2020-05-07 2021-12-28 Haworth, Inc. Digital workspace sharing over one or more display clients and authorization protocols for collaboration systems
US11750672B2 (en) 2020-05-07 2023-09-05 Haworth, Inc. Digital workspace sharing over one or more display clients in proximity of a main client

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7073637B2 (en) * 2001-08-23 2006-07-11 Nsk-Warner K.K. Double-wrap brake band apparatus
US7173637B1 (en) * 2001-02-26 2007-02-06 Microsoft Corporation Distance-based accelerated scrolling
US20110202834A1 (en) * 2010-02-12 2011-08-18 Microsoft Corporation Visual motion feedback for user interface
US8271898B1 (en) * 2009-06-04 2012-09-18 Mellmo Inc. Predictive scrolling

Family Cites Families (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7345675B1 (en) 1991-10-07 2008-03-18 Fujitsu Limited Apparatus for manipulating an object displayed on a display device by using a touch screen
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US5488204A (en) 1992-06-08 1996-01-30 Synaptics, Incorporated Paintbrush stylus for capacitive touch sensor pad
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5835079A (en) 1996-06-13 1998-11-10 International Business Machines Corporation Virtual pointing device for touchscreens
JPH1031551A (ja) * 1996-07-15 1998-02-03 Mitsubishi Electric Corp ヒューマンインターフェースシステムおよびこれを使用した高速移動物体位置検出装置
JP3593827B2 (ja) * 1996-11-26 2004-11-24 ソニー株式会社 画面のスクロール制御装置及びスクロール制御方法
US6310610B1 (en) 1997-12-04 2001-10-30 Nortel Networks Limited Intelligent touch display
US8479122B2 (en) * 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
EP2256605B1 (fr) * 1998-01-26 2017-12-06 Apple Inc. Procédé et appareil d'intégration d'entrée manuelle
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US6188391B1 (en) 1998-07-09 2001-02-13 Synaptics, Inc. Two-layer capacitive touchpad and method of making same
JP4542637B2 (ja) 1998-11-25 2010-09-15 セイコーエプソン株式会社 携帯情報機器及び情報記憶媒体
US7469381B2 (en) 2007-01-07 2008-12-23 Apple Inc. List scrolling and document translation, scaling, and rotation on a touch-screen display
US6730863B1 (en) * 1999-06-22 2004-05-04 Cirque Corporation Touchpad having increased noise rejection, decreased moisture sensitivity, and improved tracking
JP2001134382A (ja) 1999-11-04 2001-05-18 Sony Corp 図形処理装置
JP3800984B2 (ja) 2001-05-21 2006-07-26 ソニー株式会社 ユーザ入力装置
JP2003173237A (ja) 2001-09-28 2003-06-20 Ricoh Co Ltd 情報入出力システム、プログラム及び記憶媒体
US6690387B2 (en) 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
JP2003330614A (ja) 2002-05-13 2003-11-21 Ricoh Co Ltd タッチパネル付きディスプレイ装置、タッチパネル付きディスプレイ装置の制御方法およびその方法をコンピュータに実行させるためのプログラム
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US7167162B2 (en) 2003-12-12 2007-01-23 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Apparatus and method for controlling a screen pointer
WO2006020305A2 (fr) 2004-07-30 2006-02-23 Apple Computer, Inc. Gestes pour dispositifs d'entree sensibles au toucher
US7728823B2 (en) * 2004-09-24 2010-06-01 Apple Inc. System and method for processing raw data of track pad device
KR100984630B1 (ko) * 2004-09-24 2010-09-30 애플 인크. 트랙 패드 장치의 원시 데이터를 처리하기 위한 시스템 및방법
WO2007037806A1 (fr) * 2005-09-15 2007-04-05 Apple Inc. Systeme et procede de traitement de donnees brutes d'un dispositif de pave tactile
JP4826174B2 (ja) 2005-08-24 2011-11-30 ソニー株式会社 表示装置
WO2007086386A1 (fr) * 2006-01-27 2007-08-02 Matsushita Electric Industrial Co., Ltd. Dispositif avec capteur sensitif
US8077153B2 (en) * 2006-04-19 2011-12-13 Microsoft Corporation Precise selection techniques for multi-touch screens
US7813774B2 (en) * 2006-08-18 2010-10-12 Microsoft Corporation Contact, motion and position sensing circuitry providing data entry associated with keypad and touchpad
US20080084400A1 (en) 2006-10-10 2008-04-10 Outland Research, Llc Touch-gesture control of video media play on handheld media players
US7956847B2 (en) 2007-01-05 2011-06-07 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
DE202007014957U1 (de) * 2007-01-05 2007-12-27 Apple Inc., Cupertino Multimediakommunikationseinrichtung mit Berührungsbildschirm, der auf Gesten zur Steuerung, Manipulierung und Editierung von Mediendateien reagiert
US7877707B2 (en) * 2007-01-06 2011-01-25 Apple Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US7844915B2 (en) 2007-01-07 2010-11-30 Apple Inc. Application programming interfaces for scrolling operations
CN101308437A (zh) * 2007-05-15 2008-11-19 宏达国际电子股份有限公司 信息导览方法及其相关电子装置
US7916126B2 (en) * 2007-06-13 2011-03-29 Apple Inc. Bottom-up watershed dataflow method and region-specific segmentation based on historic data to identify patches on a touch sensor panel
US9740386B2 (en) 2007-06-13 2017-08-22 Apple Inc. Speed/positional mode translations
US20090164951A1 (en) * 2007-12-19 2009-06-25 Nvidia Corporation Input architecture for devices with small input areas and executing multiple applications
US20090174679A1 (en) 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
JP2009176114A (ja) 2008-01-25 2009-08-06 Mitsubishi Electric Corp タッチパネル装置及びユーザインタフェース装置
US20090213083A1 (en) * 2008-02-26 2009-08-27 Apple Inc. Simulation of multi-point gestures with a single pointing device
KR100942821B1 (ko) * 2008-05-08 2010-02-18 주식회사 한모아 터치 위치 이동과 방향 전환에 의한 명령 또는 데이터 입력 방법 및 장치
US8176438B2 (en) * 2008-09-26 2012-05-08 Microsoft Corporation Multi-modal interaction for a screen magnifier
JP2010086230A (ja) 2008-09-30 2010-04-15 Sony Corp 情報処理装置、情報処理方法およびプログラム
CN101424997A (zh) * 2008-12-04 2009-05-06 苏州达方电子有限公司 触控面板及其快速滚动条的启动方法
US20100162181A1 (en) 2008-12-22 2010-06-24 Palm, Inc. Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress
JP2010157047A (ja) * 2008-12-26 2010-07-15 Brother Ind Ltd 入力装置
US8922499B2 (en) * 2010-07-26 2014-12-30 Apple Inc. Touch input transitions

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120019469A1 (en) * 2010-07-26 2012-01-26 Wayne Carl Westerman Touch input transitions
US8922499B2 (en) 2010-07-26 2014-12-30 Apple Inc. Touch input transitions
US9310995B2 (en) * 2010-07-26 2016-04-12 Apple Inc. Touch input transitions
US20120210270A1 (en) * 2011-02-10 2012-08-16 Samsung Electronics Co., Ltd. Method and apparatus for processing multi-touch input at touch screen terminal
US9003322B2 (en) * 2011-02-10 2015-04-07 Samsung Electronics Co., Ltd Method and apparatus for processing multi-touch input at touch screen terminal
US10429969B2 (en) 2011-02-10 2019-10-01 Samsung Electronics Co., Ltd Method and apparatus for processing multi-touch input at touch screen terminal
US11460938B2 (en) 2011-02-10 2022-10-04 Samsung Electronics Co., Ltd Method and apparatus for processing multi-touch input at touch screen terminal
US9542096B2 (en) * 2012-07-18 2017-01-10 Sony Corporation Mobile client device, operation method, recording medium, and operation system
US10007424B2 (en) 2012-07-18 2018-06-26 Sony Mobile Communications Inc. Mobile client device, operation method, recording medium, and operation system
CN103472986A (zh) * 2013-08-09 2013-12-25 深圳Tcl新技术有限公司 触摸滑动操作自适应控制方法、装置及触摸板
US9791956B2 (en) * 2015-03-30 2017-10-17 Lenovo (Singapore) Pte. Ltd. Touch panel click action
US11219820B2 (en) * 2018-01-05 2022-01-11 Tencent Technology (Shenzhen) Company Limited Control method for virtual controlled object, apparatus, storage medium, and electronic apparatus

Also Published As

Publication number Publication date
CN103069379A (zh) 2013-04-24
EP2598977A1 (fr) 2013-06-05
CN102346592A (zh) 2012-02-08
WO2012015701A1 (fr) 2012-02-02
CN107122111B (zh) 2022-01-28
CN107122111A (zh) 2017-09-01
US20120019469A1 (en) 2012-01-26
US9310995B2 (en) 2016-04-12
WO2012015705A1 (fr) 2012-02-02
AU2011283001B2 (en) 2014-03-06
JP2013532872A (ja) 2013-08-19
KR20130050956A (ko) 2013-05-16
JP2013532871A (ja) 2013-08-19
AU2011283001A1 (en) 2013-02-14
AU2011282997A1 (en) 2013-02-14
EP2476046B1 (fr) 2019-07-10
EP2476046A1 (fr) 2012-07-18
CN114237486A (zh) 2022-03-25
US20120019452A1 (en) 2012-01-26
EP2476046A4 (fr) 2013-06-05
US8922499B2 (en) 2014-12-30
KR101451531B1 (ko) 2014-10-15
JP5604004B2 (ja) 2014-10-08
AU2011282997B2 (en) 2014-05-15
KR20130024989A (ko) 2013-03-08

Legal Events

Date Code Title Description
AS Assignment

Owner name: APPLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WESTERMAN, WAYNE CARL;REEL/FRAME:025856/0373

Effective date: 20100924

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION