CN101611373A - Gestures for controlling, manipulating, and editing media files using touch-sensitive devices - Google Patents

Gestures for controlling, manipulating, and editing media files using touch-sensitive devices

Info

Publication number
CN101611373A
CN101611373A (application CN200780051755.2A)
Authority
CN
China
Prior art keywords
touch
gesture
display device
touch input
contact point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200780051755.2A
Other languages
Chinese (zh)
Other versions
CN101611373B (en)
Inventor
G. Christie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 11/818,342 (US7956847B2)
Application filed by Apple Computer Inc
Priority to CN201310719094.3A (CN103631496B)
Publication of CN101611373A
Application granted
Publication of CN101611373B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention relate to a system, method, and software for implementing gestures on a touch-sensitive device (such as a touch-sensitive display) for managing and editing media files on a computing device or system. In particular, gestural inputs of a human hand over a touch- or proximity-sensitive device can be used to control, edit, and manipulate files, such as media files including, without limitation, graphics files, photo files, and video files.

Description

Gestures for controlling, manipulating, and editing media files using touch-sensitive devices
Cross-reference to related applications
This application claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Patent Application No. 60/878,754, filed January 5, 2007, the contents of which are incorporated herein by reference.
Technical field
The present invention relates to systems and methods for managing, manipulating, and editing media objects, such as graphical objects on a display, by using hand gestures on a touch-sensitive device.
Background
There exist today many types of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and making selections on a display screen. The operations may also include paging, scrolling, panning, zooming, and the like. By way of example, the input devices may include buttons, switches, keyboards, mice, trackballs, touch pads, joysticks, touch screens, and the like. Each of these devices has advantages and disadvantages that are taken into account when designing a computer system.
Buttons and switches are generally mechanical in nature and provide limited control with respect to moving the cursor and making selections. For example, they are generally dedicated to moving the cursor in a specific direction (e.g., arrow keys) or to making specific selections (e.g., enter, delete, number, etc.).
With a mouse, the movement of the input pointer on the display generally corresponds to the relative movements of the mouse as the user moves the mouse along a surface. With a trackball, the movement of the input pointer generally corresponds to the relative movements of the ball as the user rotates the ball within a housing. Mouse and trackball devices typically also include one or more buttons for making selections. A mouse may also include a scroll wheel that allows a user to scroll the displayed content by rolling the wheel forward or backward.
With a touch pad, such as that on a personal laptop computer, the movement of the input pointer on the display generally corresponds to the relative movements of the user's finger (or stylus) as the finger is moved along a surface of the touch pad. Touch screens, on the other hand, are a type of display screen that typically includes a touch-sensitive transparent panel (or "skin") overlaying the display screen. When using a touch screen, a user makes a selection on the display screen by pointing directly at objects (such as GUI objects) displayed on the screen (usually with a stylus or a finger).
To provide additional functionality, hand gestures have been implemented with some of these input devices. By way of example, on a touch pad, selections may be made when one or more taps are detected on the surface of the touch pad. In some cases, any portion of the touch pad may be tapped, while in other cases a dedicated portion of the touch pad may be tapped. In addition to selections, scrolling may be initiated by using finger motion at the edge of the touch pad.
U.S. Patent Nos. 5,612,719 and 5,590,219, assigned to Apple Computer, Inc., describe some other uses of gesturing. U.S. Patent No. 5,612,719 discloses an onscreen button that is responsive to at least two different button gestures made on the screen on or near the button. U.S. Patent No. 5,590,219 discloses a method for recognizing an ellipse-type gesture input on a display screen of a computer system.
More recently, more advanced gestures have been implemented. For example, scrolling may be initiated by placing four fingers on the touch pad so that the scrolling gesture is recognized, and thereafter moving those fingers on the touch pad to perform scrolling events. The methods for implementing these advanced gestures, however, can be limited and in many instances counterintuitive. In certain applications, especially applications involving managing or editing media files using a computer system, hand gestures on a touch screen can allow a user to effect the intended operations more efficiently and accurately.
Based on the above, there is a need for improvements in the way gestures can be performed on touch-sensitive devices, especially for managing and editing media files.
Summary of the invention
The present invention relates to a system, method, and software for implementing gestures on a touch-sensitive device (such as a touch-sensitive display) for managing and editing media files on a computer system. In particular, gestural inputs of a human hand over a touch- or proximity-sensitive device can be used to control, edit, and manipulate files, such as media files including, without limitation, photo files and video files.
According to one embodiment, gestural inputs over a touch-sensitive computer desktop display are used to effect conventional mouse/trackball actions, such as targeting, selecting, right-click actions, scrolling, and the like.
According to another embodiment, gestural inputs over a touch-sensitive display can be used to effect editing commands for editing an image file, such as a photo file. The gestural inputs can be recognized via a user interface ("UI") element, such as a slide bar. A gestural input via a UI element can be varied by changing the number of contact points over the UI element.
According to another embodiment, a gestural input invokes the activation of a UI element, after which further gestural interactions with the invoked UI element can effect additional functions.
Brief description of the drawings
Fig. 1 is a block diagram of a computer system in accordance with one exemplary embodiment of the present invention.
Fig. 2 illustrates another computer system in accordance with another exemplary embodiment of the present invention.
Fig. 3 illustrates a multipoint processing method in accordance with one exemplary embodiment of the present invention.
Figs. 4A and 4B illustrate a detected touch image in accordance with one embodiment of the present invention.
Fig. 5 illustrates a group of features in accordance with one embodiment of the present invention.
Fig. 6 illustrates a parameter calculation method in accordance with one embodiment of the present invention.
Figs. 7A-7E and 7I-7K illustrate various gestures for performing targeting and/or selecting tasks in accordance with one embodiment of the present invention.
Figs. 7F-7H illustrate a diagram of a method for recognizing and implementing the gestural inputs of Figs. 7A-7E.
Figs. 8A-8G illustrate a rotate gesture in accordance with one embodiment of the present invention.
Fig. 9 is a diagram of a touch-based method in accordance with one embodiment of the present invention.
Fig. 10 is a diagram of a touch-based method in accordance with one embodiment of the present invention.
Fig. 11 is a diagram of a touch-based method in accordance with one embodiment of the present invention.
Fig. 12 is a diagram of a zoom gesture method in accordance with one embodiment of the present invention.
Figs. 13A-13H illustrate a zooming sequence in accordance with one embodiment of the present invention.
Fig. 14 is a diagram of a pan method in accordance with one embodiment of the present invention.
Figs. 15A-15D illustrate a panning sequence in accordance with one embodiment of the present invention.
Fig. 16 is a diagram of a rotate method in accordance with one embodiment of the present invention.
Figs. 17A-17C illustrate a rotating sequence in accordance with one embodiment of the present invention.
Figs. 17D and 17E illustrate a method for rotating a selectable target in accordance with one embodiment of the present invention.
Figs. 18A and 18B illustrate gestural inputs for editing a photo document in accordance with one embodiment of the present invention.
Fig. 18C illustrates a diagram of a method for recognizing and implementing the gestural inputs of Figs. 18A and 18B.
Figs. 18D and 18E illustrate gestural inputs for zooming in and out of a photo file within a photo application in accordance with one embodiment of the present invention.
Figs. 19A-19D illustrate gestural inputs for scrolling through the playback of sequential files in accordance with one embodiment of the present invention.
Figs. 19E and 19F illustrate gestural inputs for scrolling through the playback of photo files on a digital camera display in accordance with one embodiment of the present invention.
Fig. 19G illustrates a gestural input for marking or deleting a photo file during playback in accordance with one embodiment of the present invention.
Fig. 19H illustrates an alternative gestural input for marking or deleting a photo file during playback in accordance with another embodiment of the present invention.
Fig. 20 is an overview diagram illustrating a method for implementing the methods of Figs. 18A-19F in accordance with one embodiment of the present application.
Figs. 21A-21D illustrate gestural inputs for controlling and/or editing video using a video application in accordance with one embodiment of the present invention.
Figs. 22A and 22B are diagrams of a method for implementing the gestural inputs of Figs. 21A-21D.
Fig. 23 illustrates gestural inputs for controlling and/or editing audio using an audio application in accordance with one embodiment of the present invention.
Detailed description of preferred embodiments
In the following description of preferred embodiments, reference is made to the accompanying drawings, which form a part hereof and which show, by way of illustration, specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized, and structural changes may be made, without departing from the scope of the preferred embodiments of the present invention.
Fig. 1 is a block diagram of an exemplary computer system 50 in accordance with one embodiment of the present invention. The computer system 50 may correspond to a personal computer system, such as a desktop, laptop, tablet, or handheld computer. The computer system may also correspond to a computing device, such as a cell phone, PDA, dedicated media player, consumer electronic device, and the like.
The exemplary computer system 50 shown in Fig. 1 includes a processor 56 configured to execute instructions and to carry out operations associated with the computer system 50. For example, using instructions retrieved from memory, the processor 56 may control the reception and manipulation of input and output data between components of the computing system 50. The processor 56 can be implemented on a single chip, multiple chips, or multiple electrical components. For example, various architectures can be used for the processor 56, including a dedicated or embedded processor, a single-purpose processor, a controller, an ASIC, and so forth.
In most cases, the processor 56 together with an operating system operates to execute computer code and to produce and use data. Operating systems are generally well known and will not be described in greater detail here. By way of example, the operating system may correspond to OS/2, DOS, Unix, Linux, Palm OS, and the like. The operating system can also be a special-purpose operating system, such as may be used for limited-purpose, appliance-type computing devices. The operating system, other computer code, and data may reside within a memory block 58 that is operatively coupled to the processor 56. The memory block 58 generally provides a place to store computer code and data that are used by the computer system 50. By way of example, the memory block 58 may include read-only memory (ROM), random-access memory (RAM), a hard disk drive, and the like. The information could also reside on a removable storage medium and be loaded or installed onto the computer system 50 when needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and network components.
The computer system 50 may also include a display device 68 that is operatively coupled to the processor 56. The display device 68 may be a liquid crystal display (LCD) (e.g., active matrix, passive matrix, and the like). Alternatively, the display device 68 may be a monitor such as a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable graphics array (VGA) display, super VGA display, cathode ray tube (CRT), and the like. The display device may also correspond to a plasma display or a display implemented with electronic ink.
The display device 68 may generally be configured to display a graphical user interface (GUI) 69 that provides an easy-to-use interface between a user of the computer system and the operating system or application running thereon. Generally speaking, the GUI 69 represents programs, files, and operational options with graphical images, objects, or vector representations. The graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, and the like. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user. During operation, the user can select and/or activate various graphical images to initiate the functions and tasks associated therewith. By way of example, a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program. The GUI 69 can additionally or alternatively display information, such as non-interactive text and graphics, for the user on the display device 68.
The computer system 50 may also include an input device 70 that is operatively coupled to the processor 56. The input device 70 may be configured to transfer data from the outside world into the computer system 50. The input device 70 may, for example, be used to perform tracking and to make selections with respect to the GUI 69 on the display 68. The input device 70 may also be used to issue commands in the computer system 50. The input device 70 may include a touch-sensing device configured to receive input from a user's touch and to send this information to the processor 56. By way of example, the touch-sensing device may correspond to a touch pad or a touch screen. In many cases, the touch-sensing device recognizes touches, as well as the position and magnitude of the touches on a touch-sensitive surface. The touch-sensing device detects and reports the touches to the processor 56, and the processor 56 interprets the touches in accordance with its programming. For example, the processor 56 may initiate a task in accordance with a particular touch. A dedicated processor can be used to process touches locally and reduce demand on the main processor of the computer system.
The touch-sensing device may be based on sensing technologies including, but not limited to, capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, and the like. Furthermore, the touch-sensing means may be based on single-point sensing or multipoint sensing. Single-point sensing is capable of distinguishing only a single touch, while multipoint sensing may be capable of distinguishing multiple touches that occur at the same time.
As discussed above, the input device 70 may be a touch screen that is positioned over or in front of the display 68 and integrated with the display device 68, or it may be a separate component, such as a touch pad.
The computer system 50 also preferably includes capabilities for coupling to one or more I/O devices 80. By way of example, the I/O devices 80 may correspond to keyboards, printers, scanners, cameras, microphones, speakers, and the like. The I/O devices 80 may be integrated with the computer system 50, or they may be separate components (e.g., peripheral devices). In some cases, the I/O devices 80 may be connected to the computer system 50 through wired connections (e.g., cables/ports). In other cases, the I/O devices 80 may be connected to the computer system 50 through wireless connections. By way of example, the data link may correspond to PS/2, USB, IR, Firewire, RF, Bluetooth, and the like.
In accordance with one embodiment of the present invention, the computer system 50 is designed to recognize gestures 85 applied to the input device 70 and to control aspects of the computer system 50 based on the gestures 85. In some cases, a gesture may be defined as a stylized interaction with an input device that can be mapped to one or more specific computing operations. The gestures 85 may be made through various hand and, more particularly, finger motions. Alternatively or additionally, the gestures may be made with a stylus. In all of these cases, the input device 70 receives the gestures 85, and the processor 56 executes instructions to carry out operations associated with the gestures 85. In addition, the memory block 58 may include a gesture operational program 88, which may be part of the operating system or a separate application. The gesture operational program 88 may generally include a set of instructions that recognizes the occurrence of gestures 85 and informs one or more software agents of the gestures 85 and/or of what action(s) to take in response to the gestures 85. Additional details regarding the various gestures that can be used as input commands are discussed further below.
In accordance with a preferred embodiment, upon a user performing one or more gestures, the input device 70 relays the gesture information to the processor 56. Using instructions from memory 58 and, more particularly, the gesture operational program 88, the processor 56 interprets the gestures 85 and controls different components of the computer system 50, such as memory 58, the display 68, and the I/O devices 80, based on the gestures 85. The gestures 85 may be identified as commands for performing actions in applications stored in the memory 58, modifying image objects shown on the display 68, modifying data stored in the memory 58, and/or performing actions in the I/O devices 80.
Additionally, although Fig. 1 illustrates the input device 70 and the display 68 as two separate boxes for purposes of illustration, the two boxes may be realized on one device.
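One way to picture a gesture operational program like block 88 is as a table that maps recognized gestures to action callbacks for the rest of the system. The sketch below is a hypothetical illustration only, not the patent's implementation; the class name, gesture strings, and handlers are all invented for the example.

```python
# Hypothetical sketch of a gesture operational program (cf. block 88):
# a table mapping recognized gestures to actions on system components.
# All names here are illustrative, not from the patent.

class GestureProgram:
    def __init__(self):
        self._handlers = {}

    def register(self, gesture_name, handler):
        """Associate a recognized gesture with an action callback."""
        self._handlers[gesture_name] = handler

    def dispatch(self, gesture_name, *args):
        """Invoke the action for a recognized gesture; ignore unknown ones."""
        handler = self._handlers.get(gesture_name)
        return handler(*args) if handler else None


program = GestureProgram()
program.register("pinch_zoom", lambda factor: "zoom x%.1f" % factor)
program.register("two_finger_scroll", lambda dy: "scroll %dpx" % dy)

print(program.dispatch("pinch_zoom", 2.0))     # -> zoom x2.0
print(program.dispatch("three_finger_swipe"))  # -> None (unrecognized)
```

Keeping the mapping in a table rather than hard-coding it mirrors the text's point that the program can be part of the operating system or a separate application: either layer can register its own handlers.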
Fig. 2 illustrates an exemplary computing system 10 that uses a multi-touch panel 24 as an input device for gestures; the multi-touch panel 24 can at the same time be a display panel. The computing system 10 may include one or more multi-touch panel processors 12 dedicated to a multi-touch subsystem 27. Alternatively, the multi-touch panel processor functionality can be implemented by dedicated logic, such as a state machine. Peripherals 11 may include, but are not limited to, random-access memory (RAM) or other types of memory or storage, watchdog timers, and the like. The multi-touch subsystem 27 may include, but is not limited to, one or more analog channels 17, channel scan logic 18, and driver logic 19. The channel scan logic 18 can access RAM 16, autonomously read data from the analog channels, and provide control for the analog channels. This control can include multiplexing columns of the multi-touch panel 24 to the analog channels 17. In addition, the channel scan logic 18 can control the driver logic and the stimulation signals being selectively applied to rows of the multi-touch panel 24. In some embodiments, the multi-touch subsystem 27, the multi-touch panel processor 12, and the peripherals 11 can be integrated into a single application-specific integrated circuit (ASIC).
The driver logic 19 can provide multiple multi-touch subsystem outputs 20 and can present a proprietary interface that drives a high-voltage driver, which preferably includes a decoder 21 and a subsequent level shifter and driver stage 22, although the level-shifting functions could be performed before the decoder functions. The level shifter and driver 22 can provide level shifting from a low-voltage level (e.g., CMOS levels) to a higher voltage level, providing a better signal-to-noise (S/N) ratio for noise-reduction purposes. The decoder 21 can decode the drive interface signals to one out of N outputs, where N may be the maximum number of rows in the panel. The decoder 21 can be used to reduce the number of drive lines needed between the high-voltage driver and the multi-touch panel 24. Each multi-touch panel row input 23 can drive one or more rows in the multi-touch panel 24. It should be noted that the driver 22 and the decoder 21 can also be integrated into a single ASIC, be integrated into the driver logic 19, or in some instances be unnecessary.
The multi-touch panel 24 can include a capacitive sensing medium having a plurality of row traces or driving lines and a plurality of column traces or sensing lines, although other sensing media may also be used. The row and column traces may be formed from a transparent conductive medium, such as indium tin oxide (ITO) or antimony tin oxide (ATO), although other transparent and non-transparent materials, such as copper, can also be used. In some embodiments, the row and column traces can be formed on opposite sides of a dielectric material and can be perpendicular to each other, although in other embodiments other non-Cartesian orientations are possible. For example, in a polar coordinate system, the sensing lines can be concentric circles and the driving lines can be radially extending lines (or vice versa). It should therefore be understood that the terms "row" and "column", "first dimension" and "second dimension", or "first" and "second" as used herein are intended to encompass not only orthogonal grids, but also the intersecting traces of other geometric configurations having first and second dimensions (e.g., the concentric and radial lines of a polar-coordinate arrangement). The rows and columns can be formed on a single side of a substrate, or can be formed on two separate substrates separated by a dielectric material. In some instances, an additional dielectric cover layer may be placed over the row or column traces to strengthen the structure and protect the entire assembly from damage.
At the "intersections" of the traces of the multi-touch panel 24, where the traces pass above and below (cross) each other without making direct electrical contact, the traces essentially form two electrodes (although more than two traces could intersect as well). Each intersection of row and column traces can represent a capacitive sensing node and can be viewed as a picture element (pixel) 26, which can be particularly useful when the multi-touch panel 24 is viewed as capturing an "image" of touch. (In other words, after the multi-touch subsystem 27 has determined whether a touch event has been detected at each touch sensor in the multi-touch panel, the pattern of touch sensors at which a touch event occurred can be viewed as an "image" of touch, e.g., a pattern of fingers touching the panel.) The capacitance between the row and column electrodes appears as a stray capacitance on all the columns when the given row is held at DC, and as a mutual capacitance Csig when the given row is stimulated with an AC signal. The presence of a finger or other object near or on the multi-touch panel can be detected by measuring changes to Csig. The columns of the multi-touch panel 24 can drive one or more analog channels 17 (also referred to herein as event detection and demodulation circuits) in the multi-touch subsystem 27. In some implementations, each column can be coupled to one dedicated analog channel 17. In other implementations, however, the columns can be coupled via an analog switch to a smaller number of analog channels 17.
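The row-stimulation/column-sensing scan described above can be sketched in software. The following is a pure simulation under invented names (`scan_touch_image`, `read_csig`), not the actual driver code: each row is stimulated in turn, each column channel measures Csig, and the per-node drop from a no-touch baseline forms the touch "image".

```python
# Simulated scan of a mutual-capacitance panel: stimulate each row, read
# each column's Csig, and record the drop from the no-touch baseline.
# A finger near a node reduces Csig, so delta = baseline - measured.
# All names and numbers are illustrative assumptions.

def scan_touch_image(read_csig, baseline, threshold=5):
    rows, cols = len(baseline), len(baseline[0])
    image = []
    for r in range(rows):          # stimulate row r with an AC signal
        image_row = []
        for c in range(cols):      # demodulate each column channel
            delta = baseline[r][c] - read_csig(r, c)
            image_row.append(delta if delta > threshold else 0)
        image.append(image_row)
    return image

# Fake measurement: uniform baseline of 100, with a finger near node (1, 2).
baseline = [[100] * 4 for _ in range(3)]
touched = {(1, 2): 60, (1, 1): 92}   # measured Csig under the finger

def read_csig(r, c):
    return touched.get((r, c), 100)

image = scan_touch_image(read_csig, baseline)
# image[1][2] == 40 (strong touch), image[1][1] == 8, elsewhere 0
```

The 2D array this produces is exactly the kind of touch "image" that the multipoint processing method of Fig. 3 takes as input.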
The computing system 10 can also include a host processor 14 for receiving outputs from the multi-touch panel processor 12 and performing actions based on the outputs, which can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, and the like. The host processor 14, which can be a personal computer CPU, can also perform additional functions that may not be related to multi-touch panel processing, and can be coupled to program storage 15 and to a display device 13, such as an LCD display, for providing a user interface (UI) to a user of the device.
It should be noted that, while Fig. 2 illustrates a dedicated MT panel processor 12, the multi-touch subsystem may be controlled directly by the host processor 14. Additionally, it should also be noted that the multi-touch panel 24 and the display device 13 can be integrated into one single touch-screen display device. Further details of multi-touch sensor detection, including proximity detection by a touch panel, are described in commonly assigned co-pending applications, including application No. 10/840,862, published as U.S. Patent Publication No. US2006/0097991; application No. 11/428,522, published as U.S. Patent Publication No. US2006/0238522; and the application entitled "Proximity and Multi-Touch Sensor Detection and Demodulation", filed on January 3, 2007, the entirety of each of which is hereby incorporated by reference.
Fig. 3 illustrates a multipoint processing method 100 in accordance with one embodiment of the present invention. The multipoint processing method 100 may, for example, be performed in the system shown in Fig. 1 or Fig. 2. The multipoint processing method 100 generally begins at block 102, where images can be read from a multipoint input device and, more particularly, from a multipoint touch screen. Although the term "image" is used, it should be noted that the data may come in other forms. In most cases, the image read from the touch screen provides magnitude (Z) as a function of position (x and y) for each sensing point or pixel of the touch screen. The magnitude may, for example, reflect the capacitance measured at each point.
Following block 102, the multipoint processing method 100 proceeds to block 104, where the image can be converted into a collection or list of features. Each feature represents a distinct input, such as a touch. In most cases, each feature can include its own unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle Θ, area A, and the like. By way of example, Figs. 4A and 4B illustrate a particular image 120 in time. In image 120, there are two features 122 based on two distinct touches. The touches may, for example, be formed by a pair of fingers touching the touch screen. As shown, each feature 122 can include a unique identifier (ID), x coordinate, y coordinate, Z magnitude, angle Θ, and area A. More particularly, the first feature 122A may be represented by ID1, X1, Y1, Z1, Θ1, A1, and the second feature 122B may be represented by ID2, X2, Y2, Z2, Θ2, A2. This data may, for example, be output using a multi-touch protocol.
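A feature record of the kind just described (ID, x, y, Z, Θ, A) might be represented as a simple structure. This is a sketch with assumed field types, not the patent's actual multi-touch protocol.

```python
# Sketch of one touch feature as described in the text: unique ID,
# position (x, y), magnitude Z, angle theta, and area A.
# Field names and types are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class TouchFeature:
    feature_id: int   # unique identifier, stable while the touch persists
    x: float          # centroid x coordinate
    y: float          # centroid y coordinate
    z: float          # magnitude (e.g., summed capacitance change)
    theta: float      # orientation angle of the contact patch
    area: float       # contact area A

# Two features, as in the two-finger image of Figs. 4A/4B:
f1 = TouchFeature(1, 10.0, 20.0, 150.0, 0.2, 4.0)
f2 = TouchFeature(2, 40.0, 22.0, 140.0, -0.1, 3.5)
```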
The conversion from data or images to features may be accomplished using methods described in co-pending, commonly assigned U.S. Patent Application No. 10/840,862, published as U.S. Patent Publication No. US2006/0097991, which is again incorporated herein by reference. As disclosed therein, the raw data is typically received in digitized form, and may include values for each node of the touch screen. The values may be between 0 and 256, where 0 equates to no touch pressure and 256 equates to full touch pressure. Thereafter, the raw data can be filtered to reduce noise. Once filtered, gradient data, which indicates the topology of each group of connected points, can be generated. Thereafter, the boundaries of the touch regions can be calculated based on the gradient data (i.e., a determination can be made as to which points are grouped together to form each touch region). By way of example, a watershed algorithm may be used. Once the boundaries are determined, the data for each of the touch regions can be calculated (e.g., X, Y, Z, Θ, A).
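As a rough illustration of this conversion step, the sketch below thresholds raw node values to suppress noise, groups connected nodes into touch regions, and computes per-region features. It is a simplified stand-in, not the patented method: plain 4-connected flood fill replaces the gradient/watershed segmentation, and the field names are illustrative.

```python
# Simplified sketch of the image-to-features conversion: raw capacitance
# values (0..256 per node) are thresholded, connected nodes are grouped
# into touch regions, and per-region features (x, y, Z, A) are computed.
# Flood fill stands in for the gradient/watershed step described above.

def extract_features(raw, threshold=32):
    """raw: 2-D list of node values; returns one feature dict per region."""
    rows, cols = len(raw), len(raw[0])
    seen = [[False] * cols for _ in range(rows)]
    features = []
    for r in range(rows):
        for c in range(cols):
            if raw[r][c] > threshold and not seen[r][c]:
                # Flood-fill one connected touch region.
                stack, region = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    region.append((y, x, raw[y][x]))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and raw[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                total = sum(v for _, _, v in region)
                features.append({
                    "id": len(features),                        # unique ID
                    "x": sum(x * v for _, x, v in region) / total,
                    "y": sum(y * v for y, _, v in region) / total,
                    "z": max(v for _, _, v in region),          # peak magnitude
                    "area": len(region),                        # nodes in region
                })
    return features
```

The x/y values here are magnitude-weighted centroids of each region, which is one plausible way to reduce a touch region to a single feature position.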
Following block 104, the multipoint processing method 100 proceeds to block 106, where feature classification and grouping can be performed. During classification, the identity of each feature can be determined. For example, the features may be classified as a particular finger, thumb, palm, or other object. Once classified, the features can be grouped. The manner in which the groups are formed can vary widely. In most cases, the features may be grouped based on some criteria (e.g., they carry a similar attribute). For example, the two features shown in Figs. 4A and 4B may be grouped together because each of these features is located in proximity to the other, or because they are from the same hand. The grouping may include some level of filtering to filter out features that are not part of the touch event. In filtering, one or more features may be rejected because they either meet some predefined criteria or because they do not meet some criteria. By way of example, one of the features may be classified as a thumb located at the edge of a tablet PC. Because the thumb is being used to hold the device rather than being used to perform a task, the feature generated therefrom is rejected, i.e., it is not considered part of the touch event being processed.
Following block 106, the multipoint processing method 100 proceeds to block 108, where key parameters for the feature groups can be calculated. The key parameters may include distance between features, x/y centroid of all features, feature rotation, total pressure of the group (e.g., pressure at the centroid), and the like. As shown in Fig. 5, the calculation can include finding the centroid C, drawing a virtual line 130 to each feature from the centroid C, defining the distance D for each virtual line (D1 and D2), and then averaging the distances D1 and D2. Once the parameters are calculated, the parameter values can be reported. The parameter values are typically reported with a group identifier (GID) and the number of features within each group (in this case three). In most cases, both initial and current parameter values are reported. The initial parameter values may be based on set down, i.e., when the user sets their fingers on the touch screen, and the current values may be based on any point within a stroke occurring after set down.
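The centroid-and-average-distance calculation of Fig. 5 can be sketched in a few lines. Feature positions are reduced to (x, y) pairs for brevity; the dictionary keys are illustrative names, not terms from the patent.

```python
# Sketch of the block-108 key-parameter calculation shown in Fig. 5:
# find the centroid C of a feature group, measure the virtual-line
# distance from C to each feature, and average those distances.
from math import hypot

def group_parameters(points):
    """points: list of (x, y) feature positions in one group."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    dists = [hypot(x - cx, y - cy) for x, y in points]  # virtual lines 130
    return {"centroid": (cx, cy),
            "avg_distance": sum(dists) / len(dists),
            "n_features": len(points)}
```

In the full method these values would be reported alongside a group identifier (GID), once at set down (initial) and again during the stroke (current).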
As should be appreciated, blocks 102-108 may be repetitively performed during a user stroke, thereby generating a plurality of sequentially configured signals. The initial and current parameters can be compared in later steps to perform actions in the system.
Following block 108, the process flow proceeds to block 110, where the group is or can be associated with a user interface (UI) element. UI elements may be button boxes, lists, sliders, wheels, knobs, and the like. Each UI element represents a component or control of the user interface. The application behind the UI elements may have access to the parameter data calculated in block 108. In one implementation, the application ranks the relevance of the touch data to the UI elements corresponding thereto. The ranking may be based on some predetermined criteria. The ranking may include producing a figure of merit and, whichever UI element has the highest figure of merit, giving it sole access to the group. There may even be some degree of hysteresis as well (once one of the UI elements claims control of the group, the group sticks with that UI element until another UI element has a much higher ranking). By way of example, the ranking may include determining the proximity of the centroid (or features) to the image object associated with the UI element.
Following block 110, the multipoint processing method 100 proceeds to blocks 112 and 114. Blocks 112 and 114 can be performed approximately at the same time. In one embodiment, from the user's point of view, blocks 112 and 114 appear to be performed concurrently. At block 112, one or more actions can be performed based on differences between the initial and current parameter values, and these one or more actions may also be based on the UI element to which they are associated, if any. At block 114, user feedback pertaining to the one or more actions being performed can be provided. By way of example, user feedback may include display, audio, and/or tactile feedback, among others.
Fig. 6 illustrates a parameter calculation method 150 in accordance with one embodiment of the invention. The parameter calculation method 150 may, for example, correspond to block 108 shown in Fig. 3. The parameter calculation method 150 generally begins at block 152, where a group of features can be received. Following block 152, the parameter calculation method 150 proceeds to block 154, where a determination can be made as to whether or not the number of features in the group has changed. For example, the number of features may change because the user picks up or places an additional finger. Different fingers may be needed to perform different controls (e.g., tracking, gesturing). If the number of features has changed, the parameter calculation method 150 proceeds to block 156, where the initial parameter values can be calculated. If the number stays the same, the parameter calculation method 150 proceeds to block 158, where the current parameter values can be calculated. Thereafter, the parameter calculation method 150 proceeds to block 160, where the initial and current parameter values can be reported. By way of example, the initial parameter values may contain the average initial distance between points (or Distance (AVG) initial), and the current parameter values may contain the average current distance between points (or Distance (AVG) current). These can be compared in subsequent steps in order to control various aspects of a computer system.
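The Fig. 6 flow amounts to a small state machine: a change in finger count re-latches the initial parameters, and every update refreshes the current ones. A minimal sketch, with the average-distance parameter standing in for the full parameter set:

```python
# Sketch of the Fig. 6 parameter calculation flow: when the feature
# count changes, the incoming values become the initial parameters
# (block 156); otherwise they update the current parameters (block 158);
# both are reported for later comparison (block 160).

class ParameterTracker:
    def __init__(self):
        self.n = None
        self.initial = None
        self.current = None

    def update(self, points):
        avg = self._avg_distance(points)
        if len(points) != self.n:          # block 154: feature count changed
            self.n = len(points)
            self.initial = avg             # block 156: latch initial values
        self.current = avg                 # block 158: refresh current values
        return self.initial, self.current  # block 160: report both

    @staticmethod
    def _avg_distance(points):
        cx = sum(x for x, _ in points) / len(points)
        cy = sum(y for _, y in points) / len(points)
        return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
                   for x, y in points) / len(points)
```

Comparing the two reported values (e.g., current minus initial average distance) is what drives gestures such as the zoom gesture described later.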
The above methods and techniques can be used to implement any number of GUI interface objects and actions. For example, gestures can be created to detect and effect a user command to resize a window, scroll a display, rotate an object, zoom in or out of a displayed view, delete or insert text or other objects, and the like.
A basic category of gestures should allow a user to input the normal commands that can be inputted through the use of a conventional mouse or trackball instrument. Fig. 7F shows a flow chart for processing the detection of mouse-click actions. Beginning at block 710, a touch of one or two fingers can be detected. If the detected touch can be determined 711 to be one finger, then a determination 712 can be made as to whether the touch is within a predetermined proximity of a displayed image object associated with a selectable file object and, if so, a selection action is made 714. If a double tap is detected 716 in association with the selectable object, then a double-click action can be invoked 718. A double tap can be determined by detecting a finger leaving the touch screen and immediately retouching the touch screen twice. In accordance with an alternative embodiment, a double-click action can also be invoked if a touch of a finger on the selected object is detected to remain for more than a predetermined period of time.
As shown in Fig. 7G, if the detected one-finger touch is not associated with a selectable file object, but is determined 720 to be associated with a network address hyperlink, then a single-click action can be invoked whereby the hyperlink can be activated. If the hyperlink is touched within a non-browser environment, a browser application would also be launched.
If a two-finger touch is detected 711, then, if at least one of the contact points is associated 713 with a selectable file object, the object is selected 715. If one or more taps by one of the fingers is detected 717 on the touch-sensitive display while the contact point is maintained, then a right-click mouse action can be invoked.
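The Fig. 7F/7G branching can be summarized as a small classifier over finger count and what the contact overlaps. This is an interpretive sketch of the flow above; the parameter names and the flat boolean interface are assumptions made for illustration.

```python
# Hedged sketch of the mouse-emulation branches: one finger selects,
# double-taps double-click, a touch on a hyperlink activates it, and a
# second-finger tap while one contact is held invokes a right-click.

def classify_click(n_fingers, on_selectable, on_hyperlink=False,
                   double_tap=False, second_finger_tap=False):
    if n_fingers == 1:
        if on_selectable:
            return "double_click" if double_tap else "select"
        if on_hyperlink:
            return "activate_hyperlink"   # may also launch a browser
        return None
    if n_fingers == 2 and on_selectable:
        # One finger maintains the contact point; a tap by the other
        # finger indicates a right-click action.
        return "right_click" if second_finger_tap else "select"
    return None
```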
In accordance with a preferred embodiment, if the detected touch or touches are not associated with any selectable file object or hyperlink, then, as shown in Fig. 7H, a determination 722 can be made as to whether the contact point (or points) is/may be associated with a scrollable area, such as a text editing application window, a file listing window, or an Internet webpage.
Scrolling generally pertains to moving displayed data or images across a viewing area on a display screen so that a new set of data can be brought into view within the viewing area. In most cases, once the viewing area is full, each new set of data appears at the edge of the viewing area and all other sets of data move over one position. That is, a new set of data appears for each set of data that moves out of the viewing area. In essence, these functions allow a user to view consecutive sets of data currently outside of the viewing area. In most cases, the user is able to accelerate the traversal through the data sets by moving their finger at greater speeds. Examples of scrolling through lists can be found in U.S. Patent Publication Nos. 2003/0076303A1, 2003/0076301A1, and 2003/0095096A1, which are herein incorporated by reference.
If the contact point (or points) is located within a scrollable area, then a scroll action can be invoked 723, similar to pressing down a scroll wheel on a conventional mouse instrument. If the scrollable area is scrollable in only one direction (e.g., up and down), then the scroll action invoked will be unidirectional scrolling. If the scrollable area is scrollable two-dimensionally, then the scroll action invoked will be omnidirectional.
Where the scroll action may be limited to the vertical direction (i.e., the Y axis), only the vertical vector component of the tracked touch movement will be used as input for effecting vertical scrolling. Similarly, where the scroll action may be limited to the horizontal direction (i.e., the X axis), only the horizontal vector component of the tracked touch movement will be used as input for effecting horizontal scrolling. If the scroll action is omnidirectional, then the scroll action effected will track the movement of the tracked touch.
In accordance with a preferred embodiment, if the detected touch is a one-finger touch, then the scroll action can be readied 724 to be performed at a normal, or 1X, speed. If and once the touching finger begins to move on the touch screen, the scroll action can be performed by tracking the movement of the contact point on the touch screen. If the detected touch is a two-finger touch, then the scroll action can be performed 725 at double, or 2X, speed. Additional fingers can be added to perform an even faster scroll action, whereby a detection of a four-finger touch within a multi-page document window can be interpreted as a "pg up" (page up) or "pg dn" (page down) command.
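The speed rule above maps finger count to a scroll multiplier. A small sketch under stated assumptions: the 3-finger 3X step and the sign convention for page up/down are illustrative extrapolations, not specified in the text.

```python
# Sketch of the 1X/2X scroll-speed rule: one finger scrolls at normal
# speed, two fingers double it, and four fingers in a multi-page
# document window are read as page-up / page-down commands.

def scroll_command(n_fingers, dy, multi_page=False):
    """dy: tracked vertical movement; returns the resulting command."""
    if n_fingers == 4 and multi_page:
        return "pg_up" if dy > 0 else "pg_dn"
    # 3X for three fingers is an assumed extension of the pattern.
    speed = {1: 1.0, 2: 2.0, 3: 3.0}.get(n_fingers, 1.0)
    return ("scroll", dy * speed)
```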
In accordance with another embodiment, the displayed data continues to move even when the finger is removed from the touch screen. The continuous motion can be based at least in part on the previous motion. For example, the scrolling can continue in the same direction and at the same speed. In some cases, the scrolling slows down over time, i.e., the speed of the traversal through the media items gets slower and slower until the scrolling eventually stops, leaving a static list. By way of example, each new media item brought into the viewing area may incrementally decrease the speed. Alternatively or additionally, the displayed data stops moving when the finger is placed back on the touch screen. That is, placing the finger back on the touch screen can implement braking, which stops or slows down the continuous motion.
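The inertial behavior described here can be sketched as a velocity that decays each frame after lift-off and is zeroed by a new touch. The decay factor and stop threshold are illustrative choices, not values from the text.

```python
# Sketch of inertial scrolling: after the finger lifts, scrolling
# coasts at the last velocity and decays each step until it stops;
# putting a finger back down brakes the motion immediately.

class MomentumScroller:
    def __init__(self, decay=0.8, stop_below=0.5):
        self.velocity = 0.0
        self.decay = decay          # per-frame slowdown (assumed value)
        self.stop_below = stop_below

    def finger_moved(self, delta):
        self.velocity = delta       # track the finger while touching

    def finger_down(self):
        self.velocity = 0.0         # braking: a new touch stops the coast

    def step(self):
        """One frame after lift-off; returns the scroll offset applied."""
        offset = self.velocity
        self.velocity *= self.decay             # coasting slows over time
        if abs(self.velocity) < self.stop_below:
            self.velocity = 0.0                 # finally comes to rest
        return offset
```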
To illustrate the gesture actions discussed above, as shown in Fig. 7A, using a touch screen (such as the multi-touch screen 24 shown in Fig. 2), a single-finger tap with a finger 501 on an image object (e.g., a file listing 500) can be translated as being equivalent to a single click of a mouse, which in this instance can indicate a selection, typically indicated by highlighting the selected file or image object. A detected double tap on the image object can be translated as being equivalent to a double click of a mouse, which can invoke the launching of an application associated with the tapped image object. For instance, a double tap on a file listing on the screen, such as a photo file, can cause a photo viewer application to launch and open that photo file.
A drag-and-drop function can be invoked by touching, with at least one finger, the image associated with the object to be dropped and graphically dragging the object to the desired drop location by maintaining the touch, as shown in Fig. 7B, which illustrates a file listing 500 being dragged and dropped from folder window 502 into folder window 503.
Certain mouse functionalities may require two touches to complete. For instance, as shown in Fig. 7C, a "right click" gesture can be invoked with two fingers, one finger being the touchdown finger 506 and a second finger 507 tapping the screen at least once to indicate a right-click action. Fig. 7D illustrates that, after a right-click action may have been completed, an action window 504 can be invoked, after which the first finger can move over to the invoked window 504 to select and tap an action item 505 with a single finger 506. In accordance with one embodiment of the invention, a right-click action may be invoked only when the detected tap is in close proximity to the detected touchdown, and only when the detected tap is located to the left of the touchdown finger (appearing, from the user's perspective, to the right of the touchdown finger).
Other file selection functions that would normally require a combination of mouse and keyboard actions can be performed using only touch actions. For instance, in the Microsoft Windows environment, in order to select multiple files within file window 502, a user typically needs to drag the mouse icon over the sequential files to be selected while holding down the shift button. Without the shift button held down, the dragging of the mouse icon may be interpreted as a drag-and-drop action. As shown in Fig. 7E, in accordance with one embodiment of the invention, a detected drag of two closely associated touches over file listings can be treated as a multi-select action for selecting a group of files 508. In order to avoid misinterpreting the two-touch action as another command, such as a rotation action, the two-touch multi-select function is preferably invoked only when the two detected touches are in relatively close proximity to each other.
With reference to the scroll actions described in Fig. 7H, and as shown in Figs. 7I and 7J, a one- or two-finger contact within a scrollable window can cause the displayed content of the window to scroll at different speeds. Specifically, once the scroll action is invoked 723, the scrolling takes place at 1X speed 724 if it is determined that only one finger (or one contact point) is detected on the touch-sensitive display, and at 2X speed if two fingers (or two contact points) are detected. In accordance with a preferred embodiment, during the scroll action, scroll bars 727 and 728 move in accordance with the direction of the scrolling.
Finally, using a multi-touch display capable of proximity detection, such as the panels described in the aforementioned commonly assigned and co-pending application No. 10/840,862 (published as U.S. Patent Publication No. US2006/0097991) and the application entitled "Proximity and Multi-Touch Sensor Detection and Demodulation" filed on January 3, 2007, both incorporated herein by reference, a finger gesture can also be used to invoke a hover action over an image object that may be equivalent to hovering a mouse icon over the object.
By way of example, with reference to Fig. 7K, within a desktop 729, a proximity detection of a user's finger 501 over an application icon 731 may be interpreted as a hover action, which invokes the rolling popup of the hovered-over application icon 730. If the user touches the popped-up icon, a double-click action can be invoked whereby the application can be launched. Similar concepts can be applied to application-specific situations, such as when photo files are displayed in a thumbnail format within photo management software, where a proximity detection of a finger over a thumbnail invokes a hover action whereby the size of the hovered-over photo thumbnail can be increased (without being selected).
Gestures can also be used to invoke and manipulate virtual control interfaces, such as volume knobs, switches, sliders, keyboards, and other virtual interfaces that may be created to facilitate human interaction with a computing system or a consumer electronic item. By way of example of using a gesture to invoke a virtual control interface, and referring to Figs. 8A-8H, a rotation gesture for controlling a virtual volume knob 170 on a GUI interface 172 of a display 174 of a tablet PC 175 will be described. In order to actuate the knob 170, the user places their fingers 176 on a multipoint touch screen 178. The virtual control knob may already be displayed, or the particular number, orientation, or profile of the fingers at set down, or the movement of the fingers immediately thereafter, or some combination of these and other characteristics of the user's interaction may invoke the virtual control knob to be displayed. In either case, the computing system associates a finger group with the virtual control knob and makes a determination that the user intends to use the virtual volume knob.
This association can also be based in part on the mode or current state of the computing device at the time of the input. For example, the same gesture may be interpreted as a volume knob gesture if a song is currently playing on the computing device, or as a rotate command if an object editing application is being executed. Other user feedback may be provided, including, for example, audible or tactile feedback.
Once the knob 170 is displayed, as shown in Fig. 8A, the user's fingers 176 can be positioned around the knob 170 as if it were an actual knob or dial, and thereafter can be rotated around the knob 170 in order to simulate turning the knob 170. Again, audible feedback, for example in the form of a clicking sound, or tactile feedback, for example in the form of vibration, may be provided as the knob 170 is "rotated." The user may also use their other hand to hold the tablet PC 175.
As shown in Fig. 8B, the multipoint touch screen 178 detects at least a pair of images. In particular, a first image 180 is created at set down, and at least one other image 182 can be created when the fingers 176 are rotated. Although only two images are shown, in most cases there would be many more images occurring incrementally between these two. Each image represents a profile of the fingers in contact with the touch screen at a particular instant in time. These images can also be referred to as touch images. It will be understood that the term "image" does not mean that the profile is displayed on the screen 178 (it is, rather, imaged by the touch sensing device). It should also be noted that, although the term "image" is used, the data may be in other forms representative of the touch plane at various times.
As shown in Fig. 8C, each of the images 180 and 182 can be converted to a collection of features 184. Each feature 184 may be associated with a particular touch, for example from the tips of each of the fingers 176 surrounding the knob 170 as well as the thumb of the other hand 177 used to hold the tablet PC 175.
As shown in Fig. 8D, the features 184 are classified, i.e., each finger/thumb is identified, and grouped for each of the images 180 and 182. In this particular case, the features 184A associated with the knob 170 can be grouped together to form group 188, and the feature 184B associated with the thumb can be filtered out. In alternative arrangements, the thumb feature 184B may be treated as a separate feature by itself (or placed in another group), for example to alter the input or operational mode of the system or to implement another gesture, such as a slider gesture associated with an equalizer slider displayed on the screen in the area of the thumb (or other finger).
As shown in Fig. 8E, the key parameters of the feature group 188 can be calculated for each image 180 and 182. The key parameters associated with the first image 180 represent the initial state, and the key parameters of the second image 182 represent the current state.
Also as shown in Fig. 8E, because the feature group 188 is in proximity to the knob 170, the knob 170 is the UI element associated with the feature group 188. Thereafter, as shown in Fig. 8F, the key parameter values of the feature group 188 from each image 180 and 182 can be compared to determine the rotation vector, i.e., the feature group rotated five (5) degrees clockwise from the initial state to the current state. In Fig. 8F, the initial feature group (image 180) is shown in dashed lines while the current feature group (image 182) is shown in solid lines.
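One way to compute such a rotation vector is to compare, for each feature matched by its unique ID, its angle about the group centroid in the initial and current images, and average the angular deltas. A sketch under that assumption (here features are matched by list index rather than ID):

```python
# Sketch of the Fig. 8F comparison: the rotation vector is the change
# in angle of the feature group about its centroid between the initial
# image (180) and the current image (182).
from math import atan2, degrees

def rotation_degrees(initial_pts, current_pts):
    """Positive result = counterclockwise rotation; features matched by index."""
    def centroid(pts):
        return (sum(x for x, _ in pts) / len(pts),
                sum(y for _, y in pts) / len(pts))
    (ix, iy), (cx, cy) = centroid(initial_pts), centroid(current_pts)
    deltas = []
    for (x0, y0), (x1, y1) in zip(initial_pts, current_pts):
        a0 = degrees(atan2(y0 - iy, x0 - ix))     # angle about initial centroid
        a1 = degrees(atan2(y1 - cy, x1 - cx))     # angle about current centroid
        deltas.append((a1 - a0 + 180) % 360 - 180)  # wrap to (-180, 180]
    return sum(deltas) / len(deltas)
```

In the knob example, a 5-degree clockwise result would then be mapped to a 5% volume change and a matching 5-degree rotation of the drawn knob.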
As shown in Fig. 8G, based on the rotation vector, the speaker 192 of the tablet PC 175 increases (or decreases) its output in accordance with the amount of rotation of the fingers 176, i.e., increases the volume by 5% based on a rotation of 5 degrees. The display 174 of the tablet PC can also adjust the rotation of the knob 170 in accordance with the amount of rotation of the fingers 176, i.e., the position of the knob 170 rotates five (5) degrees. In most cases, the rotation of the knob occurs simultaneously with the rotation of the fingers, i.e., for every degree of finger rotation the knob rotates a degree. In essence, the virtual control knob follows the gesture occurring on the screen. Still further, an audio unit 194 of the tablet PC may provide a clicking sound for each unit of rotation, for example, provide five clicks based on a rotation of five degrees. Still yet further, a haptics unit 196 of the tablet PC 175 may provide a certain amount of vibration or other tactile feedback for each click, thereby simulating an actual knob.
It should be noted that additional gestures can be performed simultaneously with the virtual control knob gesture. For example, more than one virtual control knob can be controlled at the same time using both hands, i.e., one hand for each virtual control knob. Alternatively or additionally, one or more slider bars can be controlled at the same time as the virtual control knob, i.e., one hand operates the virtual control knob while at least one finger (and alternatively more than one finger) of the opposite hand operates at least one slider bar (and alternatively more than one slider bar), e.g., one slider bar for each finger.
It should also be noted that, although this embodiment is described using a virtual control knob, in another embodiment the UI element can be a virtual scroll wheel. As an example, the virtual scroll wheel can mimic an actual scroll wheel, such as those described in U.S. Patent Publication Nos. US2003/0076303A1, US2003/0076301A1, and US2003/0095096A1, all of which are herein incorporated by reference.
Fig. 9 is a diagram of a touch-based method 200 in accordance with one embodiment of the invention. The method generally begins at block 202, where a user input occurring over a multipoint sensing device can be detected. The user input may include one or more touch inputs, each touch input having a unique identifier. Following block 202, the touch-based method 200 proceeds to block 204, where the user input can be classified as a tracking or selecting input when the user input includes a single unique identifier (one touch input), or classified as a gesture input when the user input includes at least two unique identifiers (more than one touch input). If the user input can be classified as a tracking input, the touch-based method 200 proceeds to block 206, where tracking corresponding to the user input can be performed.
If the user input is classified as a gesture input, the touch-based method 200 proceeds to block 208, where one or more gesture control actions corresponding to the user input can be performed. The gesture control actions may be based at least in part on changes that occur in or between the at least two unique identifiers.
Fig. 10 is a diagram of a touch-based method 250 in accordance with one embodiment of the invention. The touch-based method 250 generally begins at block 252, where an initial image can be captured during an input stroke on a touch sensitive surface. Following block 252, the touch-based method 250 proceeds to block 254, where the touch mode can be determined based on the initial image. For example, if the initial image includes a single unique identifier, then the touch mode may correspond to a tracking or selection mode.
On the other hand, if the initial image includes more than one unique identifier, then the touch mode may correspond to a gesture mode.
Following block 254, the touch-based method 250 proceeds to block 256, where a next image can be captured during the input stroke on the touch sensitive surface. Images are typically captured sequentially during the stroke, and as such there may be a plurality of images associated with the stroke.
Following block 256, the touch-based method 250 proceeds to block 258, where a determination can be made as to whether the touch mode changed between the capture of the initial image and the capture of the next image. If the touch mode changed, the touch-based method 250 proceeds to block 260, where the next image can be set as the initial image, and thereafter the touch mode is again determined at block 254 based on the new initial image. If the touch mode stayed the same, the touch-based method 250 proceeds to block 262, where the initial image and the next image can be compared, and one or more control signals can be generated based on the comparison.
Fig. 11 is a diagram of a touch-based method 300 in accordance with one embodiment of the invention. The touch-based method 300 begins at block 302, where an image object, which may be a GUI object, can be output. For example, a processor can instruct a display to display a particular image object. Following block 302, the touch-based method 300 proceeds to block 304, where a gesture input is received over the image object. For instance, a user can place or move their fingers in a gestural way on the surface of the touch screen while over the displayed image object. The gestural input may include one or more single gestures that occur successively or multiple gestures that occur simultaneously. Each of the gestures generally has a particular sequence, motion, or orientation associated therewith. For example, a gesture may include spreading fingers apart or closing fingers together, rotating a finger, translating a finger, and/or the like.
Following block 304, the touch-based method 300 proceeds to block 306, where the image object can be modified based on and in unison with the gesture input. By modified, it is meant that the image object changes according to the particular gesture or gestures being performed. By in unison, it is meant that the changes occur approximately while the gesture or gestures are being performed. In most cases, there is a one-to-one relationship between the gesture(s) and the changes occurring at the image object, and they occur substantially simultaneously. In essence, the image object follows the motion of the fingers. For example, a spreading of the fingers may zoom in on the object, a closing of the fingers may zoom out of the image object, rotating the fingers may rotate the object, and translating the fingers may allow the image object to be panned or scrolled.
In one embodiment, block 306 can include determining which image object is associated with the gesture being performed, and thereafter locking the displayed object to the fingers disposed over it so that the image object changes in accordance with the gestural input. By locking or associating the fingers with the image object, the image object can continuously adjust itself in accordance with what the fingers are doing on the touch screen. Often the determination and locking occur at set down, i.e., when the fingers are positioned on the touch screen.
Figure 12 is the figure of convergent-divergent attitude method 350 according to an embodiment of the invention.The convergent-divergent attitude can be carried out on the multi-point touch panel of all multi-touch panels 24 as shown in Figure 2 and so on.Convergent-divergent attitude method 350 is usually from piece 352, at piece 352, detect at least the first finger and second finger on touch sensitive surface in existence.The existence of at least two fingers can be used for showing that this touch is that attitude touches rather than touches based on the tracking of a finger.In some cases, only the existence of two fingers shows that this touch is that attitude touches.In other cases, the finger more than two arbitrary number shows that this touch is that attitude touches.In fact, no matter be that two, three, four or more fingers are touching, even and during described attitude number change, described attitude touches and all can work, and, whenever only needs minimum two fingers during described attitude that is.
After piece 352, convergent-divergent attitude method 350 proceeds to piece 354, at piece 354, and the distance between more at least two fingers.This distance can be from pointing finger, perhaps pointing certain other reference point, for example centre of form from each.If the distance between two fingers increases (opening), can generate amplifying signal, shown in piece 356.If the distance between two fingers reduces (closing up), can generate and dwindle signal, shown in piece 358.In most cases, putting down of finger will or lock onto a specific image object that is showing the finger association.For example, touch sensitive surface can be a touch-screen, and image object may be displayed on this touch-screen.This takes place when at least one described finger is positioned on this image object usually.Thereby when finger was mobile separately, amplifying signal can be used for increasing the size of the embedding feature in the image object, and was retracted to a time-out when pointing, and dwindled the size that signal can be used for reducing the embedding feature in this object.Described convergent-divergent usually occurs in the predefined border, such as the circumference of display, the circumference of window and/or the edge of this image object, or the like.Embed feature and can be formed on a plurality of layers, each layer represented the convergent-divergent of different stage.
In most cases, the amount of zooming varies according to the distance between the two objects. Furthermore, the zooming can typically occur substantially simultaneously with the motion of the objects. For instance, as the fingers spread apart or close together, the object zooms in or out at the same time. Although this methodology is directed at zooming, it should be noted that it may also be used for enlarging or reducing. The zoom gesture method 350 is particularly useful in graphical programs such as publishing, photo, and drawing programs. Moreover, zooming may be used to control a peripheral device such as a camera, i.e., when the fingers spread apart, the camera zooms out, and when the fingers close together, the camera zooms in.
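The distance comparison of blocks 354-358 can be illustrated with a minimal Python sketch. This is not the patent's implementation; the function name, coordinate tuples, and returned signal format are assumptions for illustration only.

```python
import math

def zoom_signal(p1_start, p2_start, p1_end, p2_end):
    # Compare the inter-finger distance before and after the movement
    # (blocks 354-358 of the described method). The ratio d1/d0 is an
    # assumed way of deriving the amount of zoom from the distance change.
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_end, p2_end)
    if d1 > d0:
        return ("zoom_in", d1 / d0)   # fingers spread apart
    if d1 < d0:
        return ("zoom_out", d1 / d0)  # fingers pinched together
    return ("none", 1.0)
```

For example, two fingers moving from one unit apart to two units apart would yield a zoom-in signal with a 2x scale factor.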
Figures 13A-13H illustrate a zooming sequence using the method described above. Figure 13A illustrates a display presenting an image object 364 in the form of a map of North America with embedded levels that can be zoomed. In some cases, as shown, the image object may be positioned inside a window that forms a boundary of the image object 364. Figure 13B illustrates a user positioning their fingers 366 over a region of North America 368, particularly over the United States 370 and more particularly over California 372. In order to zoom in on California 372, the user starts to spread their fingers 366 apart, as shown in Figure 13C. As shown in Figures 13D-13H, as the fingers 366 spread further apart (the detected distance increases), the map zooms in further on Northern California 374, then to a particular region of Northern California 374, then to the Bay Area 376, then to the Peninsula 378 (e.g., the area between San Francisco and the San Jose area), and then to the city of San Carlos 380 located between San Francisco and San Jose. In order to zoom out of San Carlos 380 and back to North America 368, the fingers 366 are closed back together following the sequence described above, but in reverse.
Figure 14 is a diagram of a pan method 400 in accordance with one embodiment of the invention. The pan gesture may be performed on a multipoint touch screen. The pan method 400 generally begins at block 402, where the presence of at least a first object and a second object may be detected on a touch-sensitive surface. The presence of at least two fingers may be used to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates a gestural touch. In other cases, any number of fingers greater than two indicates a gestural touch. In fact, the gestural touch may be configured to operate whether two, three, four or more fingers are touching, and even if the number changes during the gesture; that is, only a minimum of two fingers is needed.
Following block 402, the pan method 400 proceeds to block 404, where the positions of the two objects are monitored as the two objects move together across the touch screen. Following block 404, the pan method 400 proceeds to block 406, where a pan signal may be generated when the positions of the two objects change relative to their initial positions. In most cases, the set-down of the fingers will associate or lock the fingers to a particular image object displayed on the touch screen. Typically, this occurs when at least one of the fingers is positioned over a location on the image object. As a result, when the fingers move together across the touch screen, the pan signal may be used to translate the image in the direction of the fingers. In most cases, the amount of panning varies according to the distance the two objects move. Furthermore, the panning can typically occur substantially simultaneously with the motion of the objects. For instance, as the fingers move, the object moves with the fingers at the same time.
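The pan signal of block 406 can be sketched as the mean displacement of the tracked contact points from their initial positions. This is an illustrative sketch under assumed conventions (lists of (x, y) tuples), not the patent's implementation.

```python
def pan_signal(initial, current):
    # Pan amount = mean displacement of the tracked contact points
    # relative to their initial positions (block 406). The locked image
    # object would then be translated by this (dx, dy) vector.
    n = len(initial)
    dx = sum(c[0] - i[0] for i, c in zip(initial, current)) / n
    dy = sum(c[1] - i[1] for i, c in zip(initial, current)) / n
    return (dx, dy)
```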
Figures 15A-15D illustrate a panning sequence based on the pan method 400 described above. Using the map of Figure 13A, Figure 15A illustrates a user positioning their fingers 366 over the map. Once set down, the fingers 366 are locked to the map. As shown in Figure 15B, when the fingers 366 are moved vertically upwards, the entire map 364 may be moved upwards, thereby causing previously seen portions of the map 364 to be placed outside the viewing area and unseen portions of the map 364 to be placed inside the viewing area. As shown in Figure 15C, when the fingers 366 are moved horizontally sideways, the entire map 364 may be moved sideways, thereby causing previously seen portions of the map 364 to be placed outside the viewing area and unseen portions of the map 364 to be placed inside the viewing area. As shown in Figure 15D, when the fingers 366 are moved diagonally, the entire map 364 may be moved diagonally, thereby causing previously seen portions of the map 364 to be placed outside the viewing area and unseen portions of the map 364 to be placed inside the viewing area. It should be appreciated that the motion of the map 364 follows the motion of the fingers 366. This process is similar to sliding a piece of paper along a table. The pressure the fingers exert on the paper locks the paper to the fingers, and when the fingers are slid across the table, the piece of paper moves with them.
Figure 16 is a diagram of a rotate method 450 in accordance with one embodiment of the invention. The rotate gesture may be performed on a multipoint touch screen. The rotate method 450 generally begins at block 452, where the presence of a first object and a second object may be detected. The presence of at least two fingers may be used to indicate that the touch is a gestural touch rather than a tracking touch based on one finger. In some cases, the presence of only two fingers indicates a gestural touch. In other cases, any number of fingers greater than two indicates a gestural touch. In some other cases, the gestural touch may be configured to operate whether two, three, four or more fingers are touching, and even if the number changes during the gesture; that is, only a minimum of two fingers is needed.
Following block 452, the rotate method 450 proceeds to block 454, where the angle of each finger is set. The angle is typically determined relative to a reference point. Following block 454, the rotate method 450 proceeds to block 456, where a rotate signal may be generated when the angle of at least one of the objects changes relative to the reference point. In most cases, the set-down of the fingers will associate or lock the fingers to a particular image object displayed on the touch screen. Typically, when at least one of the fingers is positioned over a location on the image object, the image object will be associated with or locked to the fingers. As a result, when the fingers are rotated, the rotate signal can be used to rotate the object in the direction of finger rotation (e.g., clockwise, counterclockwise). In most cases, the amount of object rotation varies according to the amount of finger rotation, i.e., if the fingers move 5 degrees, so will the object. Furthermore, the rotation can typically occur substantially simultaneously with the motion of the fingers. For instance, as the fingers rotate, the object rotates with the fingers at the same time.
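The angle change of blocks 454-456 can be sketched by measuring the angle of the line joining the two contact points before and after the movement. This is an assumed, illustrative formulation; the patent does not prescribe this particular reference (any fixed reference point would do).

```python
import math

def rotate_signal(p1_start, p2_start, p1_end, p2_end):
    # Angle of the line joining the two fingers, measured against a
    # fixed axis; its change over the stroke is the rotate signal
    # (block 456). Positive values mean counterclockwise rotation
    # in standard mathematical coordinates.
    a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
    a1 = math.atan2(p2_end[1] - p1_end[1], p2_end[0] - p1_end[0])
    return math.degrees(a1 - a0)
```

The object would then be rotated by the same number of degrees as the fingers, per the one-to-one mapping described above.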
Figures 17A-17C illustrate a rotating sequence based on the method described above. Using the map of Figure 13A, Figure 17A illustrates a user positioning their fingers 366 over the map 364. Once set down, the fingers 366 are locked to the map 364. As shown in Figure 17B, when the fingers 366 are rotated in a clockwise direction, the entire map 364 may be rotated in a clockwise direction in accordance with the rotating fingers 366. As shown in Figure 17C, when the fingers 366 are rotated in a counterclockwise direction, the entire map 364 may be rotated in a counterclockwise direction in accordance with the rotating fingers 366.
It should be noted that, although Figures 17A-17C illustrate the use of a thumb and an index finger to invoke the rotate gesture, the use of two fingers other than the thumb, such as the index finger and the middle finger, may also be used to invoke the rotate gesture.
Moreover, in certain specific applications, two fingers may not be required to invoke a rotate gesture. For instance, in accordance with a preferred embodiment and as shown in Figures 17D and 17E, a photo thumbnail can be rotated to a desired orientation (e.g., from a landscape orientation to a portrait orientation) using a single-finger gesture. Specifically, once a touch associated with a selectable photo thumbnail icon 741 is detected, and where the touch input is a gesture in which the detected touch forms a rotational or radial arc about the center portion of the thumbnail, the input is interpreted as an instruction to rotate the thumbnail in accordance with the direction of the rotational or radial arc. In accordance with a preferred embodiment, the rotation of the thumbnail icon will also cause the corresponding file object to change orientation configuration. In accordance with another embodiment, within a photo management application, detection of the rotate gesture will also invoke a snap command, automatically rotating the photo thumbnail 90 degrees in the direction of the rotation.
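The single-finger snap rotation described for the photo management embodiment can be sketched as follows. The arc-detection criterion here (comparing the start and end angles of the traced path about the thumbnail center) is an assumption; the function name and path representation are likewise illustrative.

```python
import math

def thumbnail_snap_rotation(center, arc_path):
    # Interpret a single-finger trace that forms an arc about the
    # thumbnail's center as a rotate command, snapped to 90 degrees
    # in the arc's direction (the photo-management "snap" behavior).
    a0 = math.atan2(arc_path[0][1] - center[1], arc_path[0][0] - center[0])
    a1 = math.atan2(arc_path[-1][1] - center[1], arc_path[-1][0] - center[0])
    sweep = a1 - a0
    if abs(sweep) < 1e-9:
        return 0                      # no arc detected: no rotation
    return 90 if sweep > 0 else -90   # snap toward the arc direction
```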
Figures 18A and 18B illustrate another example of using gesture input through a UI element to edit a media file, such as a photo, in accordance with the exemplary embodiment of the invention previously described in Figure 10. Specifically, as shown in Figure 18A, within a photo editor environment 750 in which a photo image file (e.g., a JPEG file) 752 can be opened for editing, a UI element 751 can be provided for editing aspects of the photo. The UI element 751 can be a level slider bar for adjusting the level of a certain aspect of the photo. In the example illustrated in Figure 18A, the UI element 751 can be an interface for receiving a touch gesture for adjusting the brightness level of the photo. Specifically, as the tracked finger touch moves towards the left of the bar, the brightness level is decreased, and if the tracked touch moves towards the right of the UI element, the brightness level is increased. In accordance with one embodiment, the UI element is preferably translucent so that the user can still see the photo image behind the UI element. In another embodiment, the size of the photo displayed on the screen can be reduced to make room for a separately displayed UI element, which can be placed immediately below the displayed photo.
Figure 18B illustrates the ability to switch the mode of the gesture input via the UI element 751 by selectively using a single contact point or multiple contact points. Specifically, as shown in Figure 18B, detection of a second contact point on the UI element 751 will cause the mode of operation to switch from brightness adjustment to contrast adjustment. In this instance, movement of the two contact points to the left or to the right will cause the contrast level of the photo to decrease or increase, respectively. Detection of additional contact points (e.g., three or four fingers) may also be interpreted as instructions for switching to other modes of operation (such as zoom, hue adjustment, gamma level, etc.). It is noted that, although Figures 18A and 18B illustrate adjusting brightness and contrast levels via the UI element 751, a user can program or customize the UI element 751 to interpret the number of contact points as meaning other forms of operating modes. It should also be noted that the slider bar UI element 751 can take other forms, such as a virtual scroll wheel.
Figure 18C is a flow chart illustrating an algorithm associated with the specific example discussed above with respect to Figures 18A and 18B. Specifically, as shown in Figure 18C, the UI element 751 is output 760 on the screen. If a gesture input touch is detected 761, further determinations 762-765 can be made as to how many contact points are associated with the touch. Depending on the number of detected contact points, corresponding modes of operation can be activated at 767-769. Once the appropriate mode of operation is activated, tracking of the contact point(s) is detected 770 to effect 771 the corresponding adjustment in accordance with the mode of operation. It should be noted that the mode of operation can be switched at any point during the editing process, in that, if a change in the number of contact point(s) is detected 772, the process loops back to determinations 762-764 in order to activate a new mode of operation.
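The Figure 18C flow can be sketched as a small state machine: the contact count selects the mode, and horizontal tracking adjusts that mode's level. The specific count-to-mode mapping, the 0-100 level range, and the class and method names are assumptions for illustration; the patent leaves the mapping programmable by the user.

```python
class GestureSlider:
    # Assumed contact-count -> mode mapping (steps 762-769); the
    # application (or the user, via customization) assigns these.
    MODES = {1: "brightness", 2: "contrast", 3: "zoom"}

    def __init__(self):
        self.levels = {m: 50 for m in self.MODES.values()}
        self.mode = None

    def touch(self, num_contacts):
        # A change in the contact count at any time switches the
        # mode of operation (step 772 looping back to 762-764).
        self.mode = self.MODES.get(num_contacts)
        return self.mode

    def track(self, dx):
        # Steps 770-771: rightward tracking (dx > 0) raises the
        # active level, leftward tracking lowers it.
        if self.mode is None:
            return None
        self.levels[self.mode] += dx
        return self.levels[self.mode]
```

A one-finger drag right then a two-finger drag left would thus raise brightness and then lower contrast, without ever lifting into a separate mode menu.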
Figures 18D and 18E illustrate using the same UI element 751 discussed above to invoke additional actions by inputting other gesture commands. Specifically, while adjusting the brightness level of the displayed photo, a second finger can be used to effect a zoom-in or zoom-out action. The zoom-in and zoom-out actions can be invoked by detecting a second contact point and a change in proximity between the two contact points. The change in distance between the two contact points can be interpreted as a zoom-in or zoom-out action in accordance with the method shown in Figure 12 and discussed above. It should be noted that, in accordance with one embodiment, a zoom action would not be invoked if the detected second contact point remains at a constant distance from the first contact point; in that case, the gesture would be interpreted as an input for activating a second mode of operation (e.g., changing from brightness adjustment to contrast adjustment, as shown in Figures 18A and 18B).
Figures 19A and 19B illustrate an example of using gesture input to scroll through media files, such as photo files displayed in a photo editor. Specifically, as shown in Figures 19A and 19B, a touch detection zone 754 can be dedicated to a scrolling action, whereby a gesture of up-and-down finger movement over the photo 752 displayed on the touch screen 750 can be interpreted as a gesture input for scrolling to the next photo 753. In accordance with a preferred embodiment, it is not necessary to display a UI element to invoke the scrolling mode of operation; rather, detection of a downward sliding action of a finger within the touch detection zone 754 (e.g., detection of a downward tracking movement of a contact point) can be sufficient to automatically invoke the scrolling action. In accordance with an alternative embodiment, a UI element in the form of a virtual vertical slide bar can be displayed on the screen to indicate to the user that a scrolling action has been activated, as well as the area of the touch detection zone 754 for continuing the scrolling action.
In accordance with a preferred embodiment, if the detected downward tracking movement has more than one contact point (e.g., a two-finger sliding gesture), the scrolling can be performed at 2X speed, in a manner similar to that described above with respect to invoking a scrolling action within a scrollable area.
Figures 19C and 19D illustrate another form of UI element, a virtual scroll wheel 755, displayed for receiving gesture input to scroll through the photos. In this embodiment, the virtual scroll wheel can be invoked by performing the simple gesture of a circular touch on the photo with one finger, or a touch-down of three fingers. Once the virtual scroll wheel UI element 755 is presented, the user can "spin" the virtual scroll wheel to scroll through the photos. In this particular example, the speed of scrolling is controlled not by how many contact points are detected on the scroll wheel 755, but by the speed at which a contact point rotates about the center of the virtual scroll wheel 755.
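The scroll-wheel behavior can be sketched by converting the contact point's angular travel about the wheel center into a number of photos to advance. The `photos_per_turn` constant and the function name are assumptions; faster rotation produces a larger angular delta per update, which is how rotation speed, not finger count, sets the scroll speed.

```python
import math

def wheel_scroll_step(center, prev_point, cur_point, photos_per_turn=12):
    # Scrolling is driven by rotation of the contact point about the
    # wheel's center, not by the number of contact points.
    a0 = math.atan2(prev_point[1] - center[1], prev_point[0] - center[0])
    a1 = math.atan2(cur_point[1] - center[1], cur_point[0] - center[0])
    return (a1 - a0) / (2 * math.pi) * photos_per_turn
```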
Figures 19E and 19F illustrate the concepts of Figures 19A and 19B on the display screen 781 of a digital camera 780. In accordance with a preferred embodiment, the display screen 781 of the digital camera 780 can be made of a multi-touch sensitive panel, such as the multi-touch sensitive panel 2 described above in Figure 2.
Figure 19E illustrates an embodiment in which, in a playback mode of the digital camera 780, a gesture input of a vertical downward swipe of at least one finger detected in the touch detection zone 782 invokes a playback scrolling action, whereby the next photo can be displayed. In accordance with another embodiment, a downward gesture input on any portion of the display screen 781 can automatically invoke the scrolling action.
Figure 19F illustrates an alternative embodiment of Figure 19E, wherein detection of two touches is required to invoke the playback scrolling. Specifically, the combination of a contact point in contact zone 783 together with a downward slide input at or near contact zone 782 can invoke the scrolling action to display the next photo. It should be noted that the methods described in Figures 19A through 19E do not require a specific form factor, since the methods can be implemented on a PC monitor, a laptop monitor, a digital camera, or any kind of device having a touch screen.
Figure 19G illustrates additional gestures that can be input during the playback of media files, such as photo files, in accordance with another embodiment. Specifically, similar to the embodiments shown in Figures 18A and 18B, the same movement can be interpreted differently by distinguishing the number of contact points (i.e., the number of fingers) on the touch-sensitive display. In this instance, a vertical downward swipe gesture of two fingers can be interpreted as a gesture for deleting the photo file, marking the photo file (for purposes such as compiling a photo album), or any other useful command.
Figure 19H illustrates detecting still other gestures using other designated UI zones of the touch-sensitive display. In this example, detection of a contact point in another designated zone 756 can be interpreted as meaning a delete, mark, or other useful command. In accordance with one embodiment, the multiple contact zones can be displayed as translucent overlays on the photo file.
It should be noted that, although Figure 19 illustrates swipe gestures in a vertical downward direction, it is also contemplated that a swipe gesture input in a vertical upward direction or in a horizontal direction can also be designated as the same commands.
Figure 20 illustrates one possible algorithm for implementing the methods shown in Figures 19A-19F. Specifically, in the first step, one of a plurality of photos is displayed 790 on the touch-sensitive display. If a touch on the display screen is detected 791, then a determination 792 can be made as to whether the touch is a gesture input, and the type of gesture input received 793 (e.g., a downward tracked sliding action, a circular tracked rotational action, etc.). In accordance with the detected gesture input, a UI element (e.g., a slide bar or a virtual scroll wheel) can be output 794 as needed, whereupon the action corresponding to the use of the UI element or the gesture input can be invoked 795.
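The dispatch at steps 791-795 can be sketched as a classification function. The gesture-type strings and action names below are assumptions made for illustration; they combine the one-finger scroll, the two-finger delete/mark variant of Figure 19G, and the circular trace that summons the virtual scroll wheel.

```python
def handle_playback_gesture(gesture_type, num_contacts):
    # Steps 791-795: classify the detected gesture, then invoke the
    # matching action (or output the matching UI element).
    if gesture_type == "swipe_down":
        # One finger scrolls to the next photo; two fingers are
        # interpreted as delete/mark (per Figure 19G).
        return "scroll_next_photo" if num_contacts == 1 else "delete_or_mark"
    if gesture_type == "circular":
        return "show_virtual_scroll_wheel"
    return "ignore"   # not a recognized gesture input
```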
Should be noted that the method for describing among Figure 18-20 also can realize in video environment.Especially, during video file playback, also can enable and show the UI element the lateral slide-bar shown in Figure 18 A, thereby, according to detected contact point number, can activate the operator scheme of some the scalable aspect such as brightness, contrast that is used to change video.Simultaneously, can also realize rolling shown in Figure 19 A-19F and Zoom method in a similar manner, but what will carry out is to refund and F.F. action, but not rolls.
Additional editing/playback functions of video files can be effected using gesture inputs over certain pre-existing control elements. In accordance with a preferred embodiment, non-linear time playback of a video file can be effected by selectively contracting or expanding a playback timeline bar. Specifically, Figure 21A illustrates a video application 790 (such as a video playback application) displaying a video playback 791 together with a progress bar 792, on which a playback queue 793 indicates the time progress of the video playback.
In accordance with a preferred embodiment, the playback queue 793 can be moved forwards or backwards on the progress bar 792 to effect fast-forward and rewind of the video. The queue can also be held at the same position, or otherwise modulated at a non-linear speed, to effect variable-speed playback or pausing of the video. In accordance with a preferred embodiment, the video application 790 can be displayed on a touch-sensitive display, and the position of the playback queue 793 can be manipulated by a touch of a finger of hand 501 on the position on the screen where the queue is displayed. That is, the playback queue 793 can serve both as a progress indicator and as a UI element for controlling the speed and temporal location of the video playback.
In accordance with a preferred embodiment, the entire progress bar 792 can serve as a UI element whereby a user can effect non-linear playback of the video by expanding or contracting one or more sections of the progress bar. Specifically, as shown in Figure 21B, the UI element progress bar 792 can be manipulated by a two-finger zoom-in or zoom-out gesture (as discussed above with respect to Figure 12). In the example shown in Figure 21B, a zoom-in gesture invokes an expansion of the playback time between the 60 minute mark and the 80 minute mark. In the example shown in Figure 21B, the playback speed of the video becomes non-linear, in that the playback speed of the video can be slowed down during the time period between the 60 and 80 minute marks. Alternatively, the playback speed of the video can be accelerated between the 0 and 60 minute marks and after the 80 minute mark, while playback proceeds at normal speed between the 60 and 80 minute marks.
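The alternative behavior above (normal speed inside the expanded span, accelerated outside it) can be sketched as a simple time-dependent rate function. The specific rate values below are illustrative assumptions, not taken from the patent.

```python
def playback_rate(t, expanded_span=(60, 80), inside=1.0, outside=1.5):
    # Non-linear playback from an expanded progress-bar section: the
    # pinch-expanded span plays at its own rate, while the rest of the
    # timeline plays at a different (here, faster) rate. The 1.0x and
    # 1.5x values are illustrative only.
    lo, hi = expanded_span
    return inside if lo <= t < hi else outside
```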
Figure 21C illustrates an additional UI element 794 displayed within the video application 790. In this embodiment, the UI element 794 can be a virtual scroll wheel with which the user can further control the playback speed of the video. In combination with the manipulation of the progress bar 792, the user can first designate the section of the video where the playback speed is to be slowed down, whereupon the user can use the scroll wheel 794 to further modulate the playback queue 793 to control the playback direction and/or speed of the video.
Figure 21D illustrates other additional touch-sensitive UI elements that can be added to the video application 790 for editing purposes. For instance, as shown in Figure 21D, a slide bar UI element 796 can be added to detect gesture inputs for invoking level adjustments, such as pan adjustment, or adjustments of brightness, contrast, hue, gamma, and other such types of levels. Similar to the UI element 751 discussed with reference to Figures 18A-18E, the slide bar UI element 796 can be used to invoke different modes of operation by varying the number of contact points on the slide bar UI element 796.
A UI element 795 can also be displayed within the video application 790 to effect sound editing of the video. Specifically, the UI element 795 can include a plurality of level adjustments for the recording or playback of different channels, or of sounds or music, to be mixed with the video.
In accordance with a preferred embodiment, a user of the video application 790 can customize which UI elements are displayed, and can additionally program the UI elements to perform a desired function.
Figures 22A and 22B illustrate an exemplary algorithm 800 for effecting the methods described with respect to Figures 21A-21D. Specifically, as shown in Figure 22A, the video application 790 can be launched to provide video playback and/or editing 802. A progress bar 792 can be displayed 803. If a touch is detected 804 over the progress bar 792, then a determination 805 can be made as to whether the touch is a zoom-in or zoom-out command. If the touch is not detected as a zoom-in or zoom-out command, then the playback queue can be manipulated in accordance with the tracked touch input. If the touch is detected as a zoom gesture, then the portion of the progress bar at which the touch is detected can be manipulated to expand or contract in accordance with the gesture input.
In Figure 22B, steps 808-810 can optionally be performed to display additional UI elements, such as the scroll wheel, the sound mixer, and the slide bar level adjustments, respectively. Touch(es) can then be detected at steps 811-813, whereupon the appropriate functions 814-818 can be invoked.
Figure 23 illustrates another embodiment of the invention for manipulating the playback and recording of audio or music files. As shown in Figure 23, a music application 830 can display a pair of virtual turntables 842 and 843, on which two musical records 834 and 835 are being played, the records being either singles or LP records. The records 834 and 835 can be graphical representations of digital music files (e.g., song A and song B) that are being played back via the music application 830. In other words, the records can be graphical imprints of the music files, as if the music files were imprinted onto physical records.
Similar to a pair of physical turntables, stylus 844 and stylus 855 can be graphical icon indications of playback queues, the positions of which can be varied by touching the queue on the touch-sensitive display screen and dragging the icon to the desired location on the graphical record. Moving a stylus would cause a jump in the playback point of the corresponding song, just as on a physical turntable.
Also similar to a pair of physical turntables, start/stop buttons 838 and 839 can be touched by one or more fingers to toggle between starting and stopping/pausing the reproduction of a song. Speed variation bars 840 and 841 can be linearly adjusted to control the playback speed of the songs. Windows 831 and 833 can graphically reproduce the frequency representations of the songs being reproduced, while window 832 can display the frequency representation of the actual output of the music application, which can be simply one of the songs being reproduced, or a mix/combination of the songs. A mixing/panning bar 850 can be manipulated to modulate or demodulate the two songs being reproduced.
During the reproduction of a song, the records 834 and 835 can be manipulated in a manner similar to physical records. For instance, rapid back-and-forth movement of a record can cause the sound effect of record "scratching," as disc jockeys often do on physical turntables.
It should be noted that the methods described above can be implemented simultaneously during the same gestural stroke. That is, selecting, tracking, zooming, rotating, and panning can all be performed during one gestural stroke, which can include spreading, rotating, and sliding the fingers. For example, upon set-down of at least two fingers, the displayed object (map) can be associated with or locked to the two fingers. In order to zoom, the user can spread or close their fingers. In order to rotate, the user can rotate their fingers. In order to pan, the user can slide their fingers. Each of these actions can occur simultaneously in one continuous motion. For example, the user can spread and close their fingers while rotating and sliding them across the touch screen. Alternatively, the user can segment each of these motions without having to reset the gestural stroke. For example, the user can first spread their fingers, then rotate their fingers, then close their fingers, then slide their fingers, and so on.
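Performing zoom, rotate, and pan in one stroke works because each signal can be extracted independently from the same pair of tracked contact points. The decomposition below (pan from the midpoint, zoom from the distance ratio, rotation from the inter-finger angle) is one illustrative way to do this; the function name and conventions are assumptions.

```python
import math

def decompose_stroke(p1_start, p2_start, p1_end, p2_end):
    # Pan: displacement of the midpoint between the two fingers.
    mid0 = ((p1_start[0] + p2_start[0]) / 2, (p1_start[1] + p2_start[1]) / 2)
    mid1 = ((p1_end[0] + p2_end[0]) / 2, (p1_end[1] + p2_end[1]) / 2)
    pan = (mid1[0] - mid0[0], mid1[1] - mid0[1])
    # Zoom: ratio of the inter-finger distances (spread vs. pinch).
    scale = math.dist(p1_end, p2_end) / math.dist(p1_start, p2_start)
    # Rotate: change in the angle of the inter-finger line, in degrees.
    rotation = math.degrees(
        math.atan2(p2_end[1] - p1_end[1], p2_end[0] - p1_end[0])
        - math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0]))
    return pan, scale, rotation
```

Because the three components are computed separately, a single continuous two-finger motion can carry all three at once, or any one of them with the others remaining neutral.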
It should also be noted that it is not always necessary to use a human finger to effect the gesture input. Where possible, the use of a pointing device, such as a stylus, can also be sufficient to effect the gesture input.
Additional examples of gestural strokes that can be used as inputs for effecting interface commands, including interactions with UI elements, are shown and described in commonly assigned co-pending application No. 10/903,964, published as U.S. Patent Publication No. US2006/0026521, and application No. 11/038,590, published as U.S. Patent Publication No. US2006/0026535, the entirety of both of which are hereby incorporated by reference.
Many alterations and modifications may be made by those having ordinary skill in the art without departing from the spirit and scope of the invention. Therefore, it must be understood that the illustrated embodiments have been set forth only for the purposes of example, and that they should not be taken as limiting the invention as defined by the following claims. For example, although many embodiments of the invention are described herein with respect to personal computing devices, it should be understood that the invention is not limited to desktop or laptop computers, but is generally applicable to other computing applications, such as mobile communication devices, standalone multimedia reproduction devices, and the like.
The words used in this specification to describe the invention and its various embodiments are to be understood not only in the sense of their commonly defined meanings, but to include, by special definition in this specification, structure, material, or acts beyond the scope of the commonly defined meanings. Thus, if an element can be understood in the context of this specification as including more than one meaning, then its use in a claim must be understood as being generic to all possible meanings supported by the specification and by the word itself.
The definitions of the words or elements of the following claims are, therefore, defined to include not only the combination of elements which are literally set forth in this specification, but all equivalent structures, materials, or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense, it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements in the claims, or that a single element may be substituted for two or more elements in a claim.
Insubstantial changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalent within the scope of the claims. Therefore, obvious substitutions now or later known to one of ordinary skill in the art are defined to be within the scope of the defined claim elements.
The claims are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, and what can obviously be substituted. For instance, the terms "computer" or "computer system" as recited in the claims shall include at least a desktop computer, a laptop computer, or any mobile computing device such as a mobile communication device (e.g., a cellular or Wi-Fi/Skype phone, an e-mail communication device, a personal digital assistant device) and a multimedia reproduction device (e.g., an iPod, an MP3 player, or any digital graphics/photo reproduction device).

Claims (108)

1. A computer-implemented method for processing touch inputs using a touch-sensitive display device, the method comprising the steps of:
detecting a touch input on a surface of the display device, the touch input including at least one contact point on the surface of the display device;
associating the at least one contact point with at least one selectable file object displayed on the display device;
recognizing at least one gesture associated with the detected touch input; and
performing at least one operation on the associated selectable file object,
wherein the at least one gesture associated with the detected touch input is recognized in part by the number of sequential taps associated with the detected at least one contact point on the surface of the display device.
2. The method of claim 1, wherein, if the recognized gesture is a single tap of the at least one contact point on the selectable file object, the at least one operation performed is a file selection.
3. the method for claim 1, wherein, if the attitude of being discerned is followed by by the described lip-deep towing of described contact point at described display device by the single touch of described at least one contact point on described optional file object, then performed described at least one operation comprises that file is selected and towing, and described towing and described at least one contact point advancing on described display device is relevant.
4. the method for claim 1, wherein, if the attitude of being discerned is to be touched in turn by described at least one contact point on described optional file object twice, then described at least one operation comprises described optional file object is associated, initiates described software operation and utilizes described software application to activate described optional file object with software application.
5. the method for claim 1, wherein, if the attitude of being discerned be by the single touch of described at least one contact point on described optional file object followed by at least once touching the described lip-deep of described display device by another contact point in described at least one contact point, but then described at least one operation is included in and shows a plurality of selection operations that can carry out explicitly with described optional file object on the described display device.
6. A computer system, comprising:
a touch-sensitive display device;
means for detecting a touch input on a surface of the display device, the touch input comprising at least one contact point on the surface of the display device;
means for associating the at least one contact point with at least one selectable file object displayed on the display device;
means for recognizing at least one gesture associated with the detected touch input; and
means for performing at least one operation on the associated selectable file object.
7. The computer system of claim 6, wherein if the recognized gesture is a single tap of the at least one contact point on the selectable file object, the at least one operation performed is a file selection.
8. The computer system of claim 6, wherein if the recognized gesture is a single tap of the at least one contact point on the selectable file object followed by a drag of the contact point across the surface of the display device, the at least one operation performed comprises a file selection and drag, the drag tracking the movement of the at least one contact point across the display device.
9. The computer system of claim 6, wherein if the recognized gesture is two successive taps of the at least one contact point on the selectable file object, the at least one operation comprises associating the selectable file object with a software application, launching the software application, and activating the selectable file object with the software application.
10. The computer system of claim 6, wherein if the recognized gesture is a single tap of the at least one contact point on the selectable file object followed by at least one tap of another contact point of the at least one contact point on the surface of the display device, the at least one operation comprises displaying, on the display device, a plurality of selectable operations that can be performed in association with the selectable file object.
11. A program embodied in a computer-readable medium, the program comprising executable instructions for causing a computer to implement a method of processing touch inputs, the computer comprising a touch-sensitive display device, the method comprising the steps of:
detecting a touch input on a surface of the display device, the touch input comprising at least one contact point on the surface of the display device;
associating the at least one contact point with at least one selectable file object displayed on the display device;
recognizing at least one gesture associated with the detected touch input; and
performing at least one operation on the associated selectable file object.
12. The program of claim 11, wherein if the recognized gesture is a single tap of the at least one contact point on the selectable file object, the at least one operation performed is a file selection.
13. The program of claim 11, wherein if the recognized gesture is a single tap of the at least one contact point on the selectable file object followed by a drag of the contact point across the surface of the display device, the at least one operation performed comprises a file selection and drag, the drag tracking the movement of the at least one contact point across the display device.
14. The program of claim 11, wherein if the recognized gesture is two successive taps of the at least one contact point on the selectable file object, the at least one operation comprises associating the selectable file object with a software application, launching the software application, and activating the selectable file object with the software application.
15. The program of claim 11, wherein if the recognized gesture is a single tap of the at least one contact point on the selectable file object followed by at least one tap of another contact point of the at least one contact point on the surface of the display device, the at least one operation comprises displaying, on the display device, a plurality of selectable operations that can be performed in association with the selectable file object.
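Although no source code forms part of the claims, the tap-count dispatch recited in claims 1-15 (one tap selects, one tap plus a drag selects and drags, two successive taps open the file with its associated application, one tap plus a tap of a second contact point shows an operation menu) can be sketched as follows. All function and return-value names are illustrative assumptions, not language from the patent:

```python
def classify_file_gesture(tap_count: int, dragged: bool, second_point_taps: int = 0) -> str:
    """Map a tap sequence on a selectable file object to an operation.

    tap_count: successive taps of the first contact point on the object.
    dragged: whether the contact point was dragged after the tap.
    second_point_taps: taps of another contact point following the first tap.
    """
    if tap_count == 2:
        # Two successive taps: open with the associated application (claim 4).
        return "open_with_associated_application"
    if tap_count == 1 and second_point_taps > 0:
        # Single tap plus tap(s) of a second contact point: show menu (claim 5).
        return "show_operation_menu"
    if tap_count == 1 and dragged:
        # Single tap followed by a drag: select and drag (claim 3).
        return "select_and_drag"
    if tap_count == 1:
        # Single tap: file selection (claim 2).
        return "select"
    return "ignore"
```

The dispatch order matters: the two-tap and second-contact-point cases must be tested before the plain single-tap case, or they would never be reached.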
16. A computer-implemented method for processing touch inputs made with a touch-sensitive display device, the method comprising the steps of:
detecting a touch input on a surface of the display device, the touch input comprising at least one contact point on the surface of the display device;
recognizing the touch input as a scroll gesture associated with a scrollable area displayed on the display device, the scroll gesture comprising a drag of the at least one contact point in a desired scrolling direction; and
scrolling the scrollable area in the desired scrolling direction.
17. The method of claim 16, further comprising the steps of:
periodically determining the number of contact points associated with the scroll gesture; and
controlling the speed of the scrolling according to the determined number of contact points associated with the scroll gesture, wherein the scrolling speed increases as the determined number of contact points increases.
18. The method of claim 16, further comprising the step of:
displaying a scroll action icon on the display device.
19. A computer system, comprising:
a touch-sensitive display device;
means for detecting a touch input on a surface of the display device, the touch input comprising at least one contact point on the surface of the display device;
means for recognizing the touch input as a scroll gesture associated with a scrollable area displayed on the display device, the scroll gesture comprising a drag of the at least one contact point in a desired scrolling direction; and
means for scrolling the scrollable area in the desired scrolling direction.
20. The computer system of claim 19, further comprising:
means for periodically determining the number of contact points associated with the scroll gesture; and
means for controlling the speed of the scrolling according to the determined number of contact points associated with the scroll gesture, wherein the scrolling speed increases as the determined number of contact points increases.
21. The computer system of claim 19, further comprising means for displaying a scroll action icon on the display device.
22. A program embodied in a computer-readable medium, the program comprising executable instructions for causing a computer to implement a method of processing touch inputs, the computer comprising a touch-sensitive display device, the method comprising the steps of:
detecting a touch input on a surface of the display device, the touch input comprising at least one contact point on the surface of the display device;
recognizing the touch input as a scroll gesture associated with a scrollable area displayed on the display device, the scroll gesture comprising a drag of the at least one contact point in a desired scrolling direction; and
scrolling the scrollable area in the desired scrolling direction.
23. The program of claim 22, wherein the method further comprises the steps of:
periodically determining the number of contact points associated with the scroll gesture; and
controlling the speed of the scrolling according to the determined number of contact points associated with the scroll gesture, wherein the scrolling speed increases as the determined number of contact points increases.
24. The program of claim 22, wherein the method further comprises the step of:
displaying a scroll action icon on the display device.
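Claims 17, 20, and 23 only require that scrolling speed increase with the number of contact points; they do not specify the scaling law. A minimal sketch, assuming linear scaling for illustration (the function name and base-speed parameter are hypothetical):

```python
def scroll_speed(base_speed: float, contact_points: int) -> float:
    """Return a scrolling speed that increases with the number of
    contact points associated with the scroll gesture, e.g. a
    two-finger drag scrolls faster than a one-finger drag.

    Linear scaling is an assumption; any monotonically increasing
    mapping would satisfy the claim language.
    """
    if contact_points < 1:
        # No contact points means no scroll gesture, hence no scrolling.
        return 0.0
    return base_speed * contact_points
```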
25. A computer-implemented method for processing touch inputs made with a touch-sensitive display device, the method comprising the steps of:
displaying a plurality of thumbnail icons on the display device, each thumbnail icon representing a graphics file;
detecting a touch input associated with at least one of the plurality of thumbnail icons;
determining the touch input to be a gesture for rotating the at least one thumbnail icon; and
rotating the display orientation of the at least one thumbnail icon.
26. The method of claim 25, wherein the display orientation of the at least one thumbnail icon is rotated by 90 degrees.
27. The method of claim 25, further comprising the step of:
adjusting the display orientation of the graphics file associated with the at least one thumbnail icon.
28. The method of claim 25, further comprising the steps of:
detecting a proximity hover associated with the at least one thumbnail icon; and
temporarily enlarging the displayed size of the at least one thumbnail icon.
29. A computer system, comprising:
a touch-sensitive display device;
means for displaying a plurality of thumbnail icons on the display device, each thumbnail icon representing a graphics file;
means for detecting a touch input associated with at least one of the plurality of thumbnail icons;
means for determining the touch input to be a gesture for rotating the at least one thumbnail icon; and
means for rotating the display orientation of the at least one thumbnail icon.
30. The computer system of claim 29, wherein the display orientation of the at least one thumbnail icon is rotated by 90 degrees.
31. The computer system of claim 29, further comprising means for adjusting the display orientation of the graphics file associated with the at least one thumbnail icon.
32. The computer system of claim 29, further comprising:
means for detecting a proximity hover associated with the at least one thumbnail icon; and
means for temporarily enlarging the displayed size of the at least one thumbnail icon.
33. A program embodied in a computer-readable medium, the program comprising executable instructions for causing a computer to implement a method of processing touch inputs, the computer comprising a touch-sensitive display device, the method comprising the steps of:
displaying a plurality of thumbnail icons on the display device, each thumbnail icon representing a graphics file;
detecting a touch input associated with at least one of the plurality of thumbnail icons;
determining the touch input to be a gesture for rotating the at least one thumbnail icon; and
rotating the display orientation of the at least one thumbnail icon.
34. The program of claim 33, wherein the display orientation of the at least one thumbnail icon is rotated by 90 degrees.
35. The program of claim 33, wherein the method further comprises the step of:
adjusting the display orientation of the graphics file associated with the at least one thumbnail icon.
36. The program of claim 33, wherein the method further comprises the steps of:
detecting a proximity hover associated with the at least one thumbnail icon; and
temporarily enlarging the displayed size of the at least one thumbnail icon.
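The 90-degree rotation of claims 26, 30, and 34 amounts to stepping an orientation value around the circle. A sketch, with orientation represented as degrees in [0, 360) (an assumption; the claims do not specify a representation):

```python
def rotate_thumbnail(orientation_deg: int, steps: int = 1) -> int:
    """Rotate a thumbnail's display orientation in 90-degree steps,
    wrapping so the result stays in [0, 360).

    Each rotation gesture applies one step; repeated gestures
    accumulate, so four steps return to the original orientation.
    """
    return (orientation_deg + 90 * steps) % 360
```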
37. A computer-implemented method for processing touch inputs made with a touch-sensitive input device and a display device, the method comprising the steps of:
displaying a media file on the display device;
displaying a graphical user interface element on the display device;
detecting a touch input on the touch-sensitive input device;
associating the touch input with the graphical user interface element;
determining the touch input to be a level adjustment gesture; and
adjusting the level of a graphic characteristic of the displayed media file according to the level adjustment gesture.
38. The method of claim 37, wherein the graphical user interface element is one of a level adjustment bar and a wheel.
39. The method of claim 37, further comprising the steps of:
determining the number of contact points on the graphical user interface element; and
selecting, from a plurality of graphic characteristics, the graphic characteristic of the displayed media file to be adjusted.
40. The method of claim 37, wherein the media file is one of a photo file, a graphics file, and a video file.
41. The method of claim 39, wherein the plurality of graphic characteristics comprise one of brightness, contrast, contrast factor, and sharpness.
42. The method of claim 37, wherein the touch-sensitive device overlays the display device.
43. The method of claim 37, wherein the level adjustment gesture is one of a substantially straight-line motion and a substantially arc-shaped motion of one or more contact points on the graphical user interface element.
44. A computer system, comprising:
a touch-sensitive input device;
a display device;
means for displaying a media file on the display device;
means for displaying a graphical user interface element on the display device;
means for detecting a touch input on the touch-sensitive input device;
means for associating the touch input with the graphical user interface element;
means for determining the touch input to be a level adjustment gesture; and
means for linearly adjusting the level of a graphic characteristic of the displayed media file according to the level adjustment gesture.
45. The computer system of claim 44, wherein the graphical user interface element is one of a level adjustment bar and a wheel.
46. The computer system of claim 44, further comprising:
means for determining the number of contact points on the graphical user interface element; and
means for selecting, from a plurality of graphic characteristics, the graphic characteristic of the displayed media file to be adjusted.
47. The computer system of claim 44, wherein the media file is one of a photo file, a graphics file, and a video file.
48. The computer system of claim 46, wherein the plurality of graphic characteristics comprise one of brightness, contrast, contrast factor, and sharpness.
49. The computer system of claim 44, wherein the touch-sensitive device overlays the display device.
50. The computer system of claim 44, wherein the level adjustment gesture is one of a substantially straight-line motion and a substantially arc-shaped motion of one or more contact points on the graphical user interface element.
51. A program embodied in a computer-readable medium, the program comprising executable instructions for causing a computer to implement a method of processing touch inputs, the computer comprising a touch-sensitive input device and a display device, the method comprising the steps of:
displaying a media file on the display device;
displaying a graphical user interface element on the display device;
detecting a touch input on the touch-sensitive input device;
associating the touch input with the graphical user interface element;
determining the touch input to be a level adjustment gesture; and
linearly adjusting the level of a graphic characteristic of the displayed media file according to the level adjustment gesture.
52. The program of claim 51, wherein the graphical user interface element is one of a level adjustment bar and a wheel.
53. The program of claim 51, wherein the method further comprises the steps of:
determining the number of contact points on the graphical user interface element; and
selecting, from a plurality of graphic characteristics, the graphic characteristic of the displayed media file to be adjusted.
54. The program of claim 51, wherein the media file is one of a photo file, a graphics file, and a video file.
55. The program of claim 53, wherein the plurality of graphic characteristics comprise one of brightness, contrast, contrast factor, and sharpness.
56. The program of claim 51, wherein the touch-sensitive device overlays the display device.
57. The program of claim 51, wherein the level adjustment gesture is one of a substantially straight-line motion and a substantially arc-shaped motion of one or more contact points on the graphical user interface element.
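Claims 37-57 describe two pieces of state: which graphic characteristic is being adjusted (chosen by the number of contact points, claims 39/46/53) and its level, adjusted linearly with the gesture (claims 44/51). A sketch of both mappings; the one-finger-equals-brightness ordering, the [0, 1] level range, and the sensitivity constant are all illustrative assumptions:

```python
# Ordering of characteristics is hypothetical; claims 41/48/55 only
# name brightness, contrast, contrast factor, and sharpness.
CHARACTERISTICS = ["brightness", "contrast", "contrast_factor", "sharpness"]

def select_characteristic(contact_points: int) -> str:
    """Choose the characteristic to adjust from the number of contact
    points on the level adjustment UI element (claims 39, 46, 53)."""
    index = max(0, min(contact_points - 1, len(CHARACTERISTICS) - 1))
    return CHARACTERISTICS[index]

def adjust_level(level: float, drag_delta: float, sensitivity: float = 0.01) -> float:
    """Adjust a characteristic's level linearly with the drag distance
    of the gesture (claims 44, 51), clamped to the range [0, 1]."""
    return max(0.0, min(1.0, level + drag_delta * sensitivity))
```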
58. A computer-implemented method for processing touch inputs made with a touch-sensitive input device and a display device, the method comprising the steps of:
displaying a media file on the display device;
detecting a touch input on the touch-sensitive input device;
determining the touch input to be a media scroll gesture; and
replacing the display of the media file on the display device with a display of a second media file.
59. The method of claim 58, wherein the media scroll gesture consists of a downward vertical swipe of one contact point.
60. The method of claim 58, further comprising the steps of:
displaying a graphical user interface element; and
associating the touch input with the graphical user interface element.
61. The method of claim 60, wherein the graphical user interface element is a vertical bar.
62. The method of claim 60, wherein the graphical user interface element is a wheel, and the media scroll gesture consists of an arc-shaped motion of at least one contact point associated with the wheel.
63. The method of claim 58, wherein the touch input is detected at a pre-designated area of the touch-sensitive input device.
64. The method of claim 58, further comprising the steps of:
detecting a second touch input;
determining the second touch input to be one of a tag gesture and a delete gesture; and
tagging or deleting the media file according to the detected second touch input.
65. The method of claim 58, wherein the touch-sensitive input device overlays the display device.
66. A computer system, comprising:
a touch-sensitive input device;
a display device;
means for displaying a media file on the display device;
means for detecting a touch input on the touch-sensitive input device;
means for determining the touch input to be a media scroll gesture; and
means for displaying a second media file, the display of the second media file replacing the display of the media file.
67. The computer system of claim 66, wherein the media scroll gesture consists of a downward vertical swipe of one contact point.
68. The computer system of claim 66, further comprising:
means for displaying a graphical user interface element; and
means for associating the touch input with the graphical user interface element.
69. The computer system of claim 68, wherein the graphical user interface element is a vertical bar.
70. The computer system of claim 68, wherein the graphical user interface element is a wheel, and the media scroll gesture consists of an arc-shaped motion of at least one contact point associated with the wheel.
71. The computer system of claim 66, wherein the touch input is detected at a pre-designated area of the touch-sensitive input device.
72. The computer system of claim 66, further comprising:
means for detecting a second touch input on the touch-sensitive input device;
means for determining the second touch input to be one of a tag gesture and a delete gesture; and
means for tagging or deleting the media file according to the detected second touch input.
73. The computer system of claim 66, wherein the touch-sensitive input device overlays the display device.
74. A program embodied in a computer-readable medium, the program comprising executable instructions for causing a computer to implement a method of processing touch inputs, the computer comprising a touch-sensitive input device and a display device, the method comprising the steps of:
displaying a media file on the display device;
detecting a touch input on the touch-sensitive input device;
determining the touch input to be a media scroll gesture; and
replacing the display of the media file on the display device with a display of a second media file.
75. The program of claim 74, wherein the media scroll gesture consists of a downward vertical swipe of one contact point.
76. The program of claim 74, wherein the method further comprises the steps of:
displaying a graphical user interface element; and
associating the touch input with the graphical user interface element.
77. The program of claim 76, wherein the graphical user interface element is a vertical bar.
78. The program of claim 76, wherein the graphical user interface element is a wheel, and the media scroll gesture consists of an arc-shaped motion of at least one contact point associated with the wheel.
79. The program of claim 74, wherein the touch input is detected at a pre-designated area of the touch-sensitive input device.
80. The program of claim 74, wherein the method further comprises the steps of:
detecting a second touch input;
determining the second touch input to be one of a tag gesture and a delete gesture; and
tagging or deleting the media file according to the detected second touch input.
81. The program of claim 74, wherein the touch-sensitive input device overlays the display device.
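The media scroll gesture of claims 58-81 replaces the displayed media file with another: a downward vertical swipe advances through the collection. A sketch over an indexed list of media files; the upward-swipe direction for going back and the wrap-around at either end are assumptions added for illustration, not claim language:

```python
def next_media_index(current: int, count: int, swipe: str) -> int:
    """Return the index of the media file whose display replaces the
    current one after a media scroll gesture.

    A downward vertical swipe advances to the next file (claim 59);
    "up" going back and index wrap-around are illustrative choices.
    """
    if count == 0:
        # Nothing to display; keep the current index.
        return current
    if swipe == "down":
        return (current + 1) % count
    if swipe == "up":
        return (current - 1) % count
    # Any other input is not a media scroll gesture.
    return current
```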
82. A computer-implemented method for processing touch inputs made with a touch-sensitive display device, the method comprising the steps of:
playing a video file on the display device;
displaying a playback progress bar on the display device, the playback progress bar comprising a timeline indicating the duration of the video file;
displaying a playback progress indicator at a position on the playback progress bar to indicate the current playback time the video playback has advanced to;
detecting a touch input, the touch input being a time-shift gesture, wherein the time-shift gesture is a single contact point near the displayed position of the playback progress indicator followed by a dragging motion of the playback progress indicator along the playback progress bar; and
adjusting the current playback time of the video playback according to the time-shift gesture.
83. A computer system, comprising:
a touch-sensitive display device;
means for playing a video file on the display device;
means for displaying a playback progress bar on the display device, the playback progress bar comprising a timeline indicating the duration of the video file;
means for displaying a playback progress indicator at a position on the playback progress bar to indicate the current playback time the video playback has advanced to;
means for detecting a touch input, the touch input being a time-shift gesture, wherein the time-shift gesture is a single contact point near the displayed position of the playback progress indicator followed by a dragging motion of the playback progress indicator along the playback progress bar; and
means for adjusting the current playback time of the video playback according to the time-shift gesture.
84. A program embodied in a computer-readable medium, the program comprising executable instructions for causing a computer to implement a method of processing touch inputs, the computer comprising a touch-sensitive display device, the method comprising the steps of:
playing a video file on the display device;
displaying a playback progress bar on the display device, the playback progress bar comprising a timeline indicating the duration of the video file;
displaying a playback progress indicator at a position on the playback progress bar to indicate the current playback time the video playback has advanced to;
detecting a touch input, the touch input being a time-shift gesture, wherein the time-shift gesture is a single contact point near the displayed position of the playback progress indicator followed by a dragging motion of the playback progress indicator along the playback progress bar; and
adjusting the current playback time of the video playback according to the time-shift gesture.
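The time-shift gesture of claims 82-84 scrubs playback by dragging the progress indicator along the bar. Since the bar is a timeline of the file's duration, the indicator's horizontal position maps linearly to a playback time. A sketch, with pixel coordinates and clamping behavior as illustrative assumptions:

```python
def scrub_time(duration_s: float, bar_width_px: float, indicator_x_px: float) -> float:
    """Translate the progress indicator's horizontal position on the
    playback progress bar into a playback time in seconds, assuming
    the bar is a linear timeline spanning the file's full duration.

    Positions dragged past either end of the bar clamp to the start
    or end of the video.
    """
    if bar_width_px <= 0:
        return 0.0
    fraction = max(0.0, min(1.0, indicator_x_px / bar_width_px))
    return duration_s * fraction
```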
85. A computer-implemented method for processing touch inputs made with a touch-sensitive display device, the method comprising the steps of:
playing a video file on the display device;
displaying a playback progress bar on the display device, the playback progress bar comprising a linear timeline indicating the linear time span of the video file being played;
detecting a touch input on a designated portion of the playback progress bar, the touch input being one of an expansion gesture and a contraction gesture, the expansion gesture being two contact points moving apart from each other in a straight line along the playback progress bar, and the contraction gesture being two contact points moving toward each other in a straight line along the playback progress bar; and
expanding or contracting the designated portion of the playback progress bar according to the detected touch input,
wherein the endpoints of the designated portion of the playback progress bar are defined by the initial positions of the two contact points.
86. The method of claim 85, further comprising the steps of:
if the detected touch input is an expansion gesture, contracting the remainder of the playback progress bar; and
if the detected touch input is a contraction gesture, expanding the remainder of the playback progress bar.
87. The method of claim 86, wherein the remainder of the playback progress bar comprises the entire playback progress bar except the designated portion of the playback progress bar.
88. The method of claim 85, further comprising the step of:
variably controlling the playback speed of the video file so that the playback speed is slowed if the video playback occurs within the time range corresponding to an expanded portion of the playback progress bar, and the playback speed is accelerated if the video playback occurs within the time range corresponding to a contracted portion of the playback progress bar.
89. The method of claim 85, further comprising the steps of:
displaying a panning UI element; and
displaying a wheel UI element.
90. The method of claim 85, further comprising the step of:
displaying a plurality of audio channel level adjustment UI elements.
91. a computer system comprises:
The touch-sensitive display device;
Be used on described display device, reproducing the device of video file;
Be used to make playing progress bar to be presented at device on the display apparatus, described playing progress bar comprises the linear session line of the linear session span of the described video file that indication is being reproduced;
Be used to detect the device of the touch input on the specified portions of described playing progress bar, described touch input is to launch one of attitude or retracted posture, described expansion attitude is that two contact points are moved away from each other on described playing progress bar as the crow flies, and described retracted posture is that court is mobile each other as the crow flies on described playing progress bar for two contact points; And
Be used for launching or shrink the device of the described specified portions of described playing progress bar according to detected touch input,
The end points of the described specified portions of wherein said playing progress bar is by the initial position definition of described two contact points.
92. The computer system of claim 91, further comprising means for expanding or contracting the remaining portion of the playback progress bar.
93. The computer system of claim 92, wherein the remaining portion of the playback progress bar comprises the entire playback progress bar except the designated portion of the playback progress bar.
94. The computer system of claim 91, further comprising means for variably controlling the playback speed of the video file, such that if the video playback falls within the time range corresponding to an expanded portion of the playback progress bar, the playback speed is slowed, and if the video playback falls within the time range corresponding to a contracted portion of the playback progress bar, the playback speed is accelerated.
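Claim 94 (and the matching method and program claims) recites playback speed varying inversely with how the timeline is warped: expanded regions play slower, contracted regions faster. A hedged sketch of one plausible mapping — the single expansion factor and the inverse-proportional rule are assumptions, not the patent's formula:

```python
def playback_speed(t, segment, expand_factor, base_speed=1.0):
    """Playback speed at media time t after the bar is warped.

    segment: (t0, t1) media-time range shown expanded on the bar.
    expand_factor: > 1 means the segment occupies proportionally
    more of the bar, so playback there is slowed; the remainder of
    the bar is compressed, so playback there is accelerated.
    """
    t0, t1 = segment
    if t0 <= t <= t1:
        return base_speed / expand_factor   # expanded region: slower
    return base_speed * expand_factor       # contracted region: faster
```

For example, doubling the visual width of seconds 4–6 would halve the playback speed inside that range and double it elsewhere.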
95. The computer system of claim 91, further comprising:
means for causing a panning UI element to be displayed; and
means for causing a scroll-wheel UI element to be displayed.
96. The computer system of claim 91, further comprising means for causing a plurality of audio channel level adjustment UI elements to be displayed.
97. A program embodied in a computer-readable medium, the program comprising executable instructions that cause a computer to perform a computer-implemented method for processing touch inputs, the computer comprising a touch-sensitive display device, the method comprising the steps of:
playing back a video file on the display device;
displaying a playback progress bar on the display device, the playback progress bar comprising a linear timeline indicating the linear time span of the video file being played;
detecting a touch input on a designated portion of the playback progress bar, the touch input being one of an expansion gesture or a contraction gesture, the expansion gesture being two contact points moving linearly away from each other along the playback progress bar, and the contraction gesture being two contact points moving linearly toward each other along the playback progress bar; and
expanding or contracting the designated portion of the playback progress bar according to the detected touch input,
wherein the endpoints of the designated portion of the playback progress bar are defined by the initial positions of the two contact points.
98. The program of claim 97, wherein the method further comprises the steps of:
if the detected touch input is an expansion gesture, contracting the remaining portion of the playback progress bar; and
if the detected touch input is a contraction gesture, expanding the remaining portion of the playback progress bar.
99. The program of claim 98, wherein the remaining portion of the playback progress bar comprises the entire playback progress bar except the designated portion of the playback progress bar.
100. The program of claim 97, wherein the method further comprises the step of:
variably controlling the playback speed of the video file, such that if the video playback falls within the time range corresponding to an expanded portion of the playback progress bar, the playback speed is slowed, and if the video playback falls within the time range corresponding to a contracted portion of the playback progress bar, the playback speed is accelerated.
101. The program of claim 97, wherein the method further comprises the steps of:
displaying a panning UI element; and
displaying a scroll-wheel UI element.
102. The program of claim 97, wherein the method further comprises the step of:
displaying a plurality of audio channel level adjustment UI elements.
103. A computer-implemented method for processing touch inputs using a touch-sensitive display device, the method comprising the steps of:
playing a music file;
displaying a music-disc graphical icon for the music file data, the music-disc graphical icon representing the music file data; and
graphically rotating the music-disc graphical icon during playback of the music file.
104. The method of claim 103, further comprising the steps of:
detecting a touch input, the touch input being a disc-motion gesture, the disc-motion gesture being a rotational gesture performed on the music-disc graphical icon by at least one contact point;
rotating the music-disc graphical icon according to the rotational gesture; and
variably controlling the playback of the music file according to the rotational motion of the music-disc graphical icon.
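Claim 104's disc-motion gesture amounts to "scratching": playback tracks how fast the user spins the disc icon relative to its nominal rotation. A rough sketch under that assumption — the 33.3 rpm nominal speed and the linear angle-to-rate mapping are illustrative choices, not taken from the patent:

```python
import math

def scratch_rate(prev_angle, curr_angle, dt, rpm=33.3):
    """Playback-rate multiplier derived from rotating the disc icon.

    prev_angle, curr_angle: icon angle in radians at the previous
    and current frame; dt: elapsed seconds. Spinning the icon at the
    nominal rpm yields rate 1.0; spinning it faster scales playback
    up, and dragging it backwards yields a negative (reverse) rate.
    """
    nominal = 2 * math.pi * rpm / 60      # nominal angular speed, rad/s
    delta = curr_angle - prev_angle       # angle swept this frame
    return (delta / dt) / nominal
```

A DJ-style application would feed this multiplier into its audio resampler each frame while the contact point stays on the disc.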
105. The method of claim 103, further comprising the step of:
displaying a stylus graphical icon, the stylus graphical icon representing the playback queue of the music file.
106. The method of claim 105, further comprising the steps of:
detecting a touch input, the touch input being a stylus-motion gesture, the stylus-motion gesture being a contact point on the stylus graphical icon dragging the stylus graphical icon from one position on the music-disc graphical icon to another position, the other position representing a different time point in the playback of the music file; and
playing back the music file at the other time point in the playback of the music file.
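Claim 106 maps the stylus position on the disc icon to a time point in the file. One plausible mapping, assuming the vinyl convention that playback runs from the outer edge inward (the linear radius-to-time rule and the function name are assumptions for illustration):

```python
def stylus_seek(radius, r_outer, r_inner, duration):
    """Map the stylus's radial position on the disc to a media time.

    Like a vinyl record, playback starts at the outer edge (r_outer)
    and ends at the inner edge (r_inner); dragging the stylus to a
    radius in between seeks linearly in time. Returns seconds.
    """
    if not r_inner <= radius <= r_outer:
        raise ValueError("stylus is off the disc")
    frac = (r_outer - radius) / (r_outer - r_inner)
    return frac * duration
```

Dropping the stylus halfway between the edges of a four-minute track would thus seek to the two-minute mark.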
107. The method of claim 105, further comprising the steps of:
playing a second music file;
displaying a second music-disc graphical icon for the music file data, the second music-disc graphical icon representing the second music file data;
displaying a second stylus graphical icon, the second stylus graphical icon representing the playback queue of the second music file; and
mixing the music file data and the second music file data to produce mixed music file data.
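The mixing step of claim 107 can be sketched as a basic two-channel PCM mixer — the per-channel gains and the 16-bit clamp are illustrative assumptions, not taken from the patent:

```python
def mix(samples_a, samples_b, gain_a=0.5, gain_b=0.5):
    """Mix two equal-length PCM sample streams into one.

    Each output sample is the gain-weighted sum of the inputs,
    clamped to the signed 16-bit range as a typical mixer would.
    """
    out = []
    for a, b in zip(samples_a, samples_b):
        v = int(gain_a * a + gain_b * b)
        out.append(max(-32768, min(32767, v)))
    return out
```

In a two-disc UI like the one claimed, the gains would typically be driven by a crossfader element rather than fixed constants.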
108. The computer system of claim 6, 19, 29, 44, 66, 83, or 91, wherein the computer system is one of a mobile phone and a digital audio player.
CN200780051755.2A 2007-01-05 2007-12-28 Controlling, manipulating, and editing gestures of media files using touch sensitive devices Active CN101611373B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310719094.3A CN103631496B (en) 2007-01-05 2007-12-28 Gestures for controlling, manipulating, and editing of media files using touch sensitive devices

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US87875407P 2007-01-05 2007-01-05
US60/878,754 2007-01-05
US11/818,342 2007-06-13
US11/818,342 US7956847B2 (en) 2007-01-05 2007-06-13 Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
PCT/US2007/089162 WO2008083360A1 (en) 2007-01-05 2007-12-28 Gestures for controlling, manipulating, and editing of media files using touch sensitive devices

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201310719094.3A Division CN103631496B (en) 2007-01-05 2007-12-28 Gestures for controlling, manipulating, and editing of media files using touch sensitive devices

Publications (2)

Publication Number Publication Date
CN101611373A true CN101611373A (en) 2009-12-23
CN101611373B CN101611373B (en) 2014-01-29

Family

ID=38860004

Family Applications (2)

Application Number Title Priority Date Filing Date
CN 200720194296 Expired - Lifetime CN201181467Y (en) 2007-01-05 2007-12-05 Handheld mobile communication device
CN200780051755.2A Active CN101611373B (en) 2007-01-05 2007-12-28 Controlling, manipulating, and editing gestures of media files using touch sensitive devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN 200720194296 Expired - Lifetime CN201181467Y (en) 2007-01-05 2007-12-05 Handheld mobile communication device

Country Status (2)

Country Link
CN (2) CN201181467Y (en)
DE (1) DE202007014957U1 (en)

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101945499A (en) * 2010-09-06 2011-01-12 深圳市同洲电子股份有限公司 Method, terminal and system for transferring files
CN101957718A (en) * 2010-06-22 2011-01-26 宇龙计算机通信科技(深圳)有限公司 Method and device for moving icons and digital terminal
CN101986249A (en) * 2010-07-14 2011-03-16 上海无戒空间信息技术有限公司 Method for controlling computer by using gesture object and corresponding computer system
CN102025831A (en) * 2010-11-18 2011-04-20 华为终端有限公司 Multimedia playing method and terminal
WO2011140956A1 (en) * 2010-05-08 2011-11-17 杭州惠道科技有限公司 Multipoint touch method on circular touch-sensitive man-machine interface
CN102281349A (en) * 2010-06-14 2011-12-14 Lg电子株式会社 Mobile terminal and controlling method thereof
CN102331902A (en) * 2010-06-02 2012-01-25 洛克威尔自动控制技术股份有限公司 System and method for the operation of a touch screen
CN102402286A (en) * 2010-09-03 2012-04-04 微软公司 Dynamic gesture parameters
CN102446060A (en) * 2010-09-30 2012-05-09 Lg电子株式会社 Mobile terminal and method of controlling mobile terminal
CN102467327A (en) * 2010-11-10 2012-05-23 上海无戒空间信息技术有限公司 Method for generating and editing gesture object and operation method of audio data
CN102609143A (en) * 2012-02-15 2012-07-25 张群 Handheld electronic equipment and video playing and controlling method thereof
CN102609184A (en) * 2010-12-29 2012-07-25 三星电子株式会社 Method and apparatus for providing mouse right click function in touch screen terminal
CN102681748A (en) * 2011-03-09 2012-09-19 联想(北京)有限公司 Information processing equipment and information processing method
CN102714514A (en) * 2010-01-06 2012-10-03 三星电子株式会社 Method and apparatus for setting section of a multimedia file in mobile device
CN102890694A (en) * 2011-09-22 2013-01-23 北京师科阳光信息技术有限公司 Time shaft system and implementation method thereof
WO2013037259A1 (en) * 2011-09-15 2013-03-21 北京同步科技有限公司 Interactive multimedia information distribution terminal and information distribution method thereof
CN103035273A (en) * 2012-12-12 2013-04-10 宁波高新区百瑞音响科技有限公司 Device by utilizing knob type digital coding switch for switching audio files
CN103076985A (en) * 2013-01-31 2013-05-01 北京魔力时间科技有限公司 Video playing progress precisely controlling and displaying device based on touch screen and use method thereof
CN103092389A (en) * 2011-11-04 2013-05-08 德尔福技术有限公司 Touch screen device and method for achieving virtual mouse action
CN103168284A (en) * 2010-08-27 2013-06-19 苹果公司 Touch and hover switching
CN103218151A (en) * 2011-11-25 2013-07-24 三星电子株式会社 Device and method for displaying object in terminal
CN103247310A (en) * 2012-02-14 2013-08-14 索尼爱立信移动通讯有限公司 Multimedia playing control method, playing control module and playing terminal
CN103324433A (en) * 2012-03-23 2013-09-25 佳能株式会社 Display control apparatus and control method for the same
CN103327247A (en) * 2013-06-17 2013-09-25 山东神思电子技术股份有限公司 Portrait collection operation device and method
WO2013178152A1 (en) * 2012-08-28 2013-12-05 中兴通讯股份有限公司 Method for copying and pasting text through dragging for terminal, and terminal
CN103582865A (en) * 2011-05-31 2014-02-12 三星电子株式会社 Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same
CN103794228A (en) * 2012-10-29 2014-05-14 仁宝电脑工业股份有限公司 Electronic apparatus with proximity sensing capability and proximity sensing control method therefor
CN103885623A (en) * 2012-12-24 2014-06-25 腾讯科技(深圳)有限公司 Mobile terminal, system and method for processing sliding event into editing gesture
CN103902173A (en) * 2012-12-26 2014-07-02 联想(北京)有限公司 Portable terminal and information processing method and displaying processing method of portable terminal
CN104077028A (en) * 2014-05-28 2014-10-01 天津三星通信技术研究有限公司 Equipment and method for controlling display item in electronic equipment
CN104123088A (en) * 2013-04-24 2014-10-29 华为技术有限公司 Mouse operation implementing method and device and touch screen terminal
CN104185831A (en) * 2012-04-02 2014-12-03 辛纳普蒂克斯公司 Systems and methods for dynamically modulating a user interface parameter using an input device
CN104216625A (en) * 2013-05-31 2014-12-17 华为技术有限公司 Display object display position adjusting method and terminal equipment
CN104423799A (en) * 2013-08-23 2015-03-18 夏普株式会社 Interface device and interface method
CN104571871A (en) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Method and system for selecting files
CN104581378A (en) * 2013-10-16 2015-04-29 三星电子株式会社 Apparatus and method for editing synchronous media
CN104737112A (en) * 2012-10-16 2015-06-24 微软公司 Thumbnail and document map based navigation in a document
CN104838340A (en) * 2012-12-14 2015-08-12 三星电子株式会社 Method and apparatus for controlling haptic feedback of input tool for mobile terminal
CN104902331A (en) * 2014-03-07 2015-09-09 联想(北京)有限公司 Play progress regulating method and electronic equipment
CN105009059A (en) * 2013-02-27 2015-10-28 阿尔卑斯电气株式会社 Operation detection device
CN105045513A (en) * 2015-08-27 2015-11-11 广东欧珀移动通信有限公司 Touch operation method and handheld device
CN105247463A (en) * 2013-03-03 2016-01-13 微软技术许可有限责任公司 Enhanced canvas environments
CN105573631A (en) * 2015-12-14 2016-05-11 联想(北京)有限公司 Touch display electronic device and control method thereof
CN105573616A (en) * 2015-12-10 2016-05-11 广东欧珀移动通信有限公司 Playlist control method and mobile terminal
CN105653111A (en) * 2014-11-14 2016-06-08 神讯电脑(昆山)有限公司 Touch control input method and electronic device thereof
CN106527917A (en) * 2016-09-23 2017-03-22 北京仁光科技有限公司 Multi-finger touch operation identification method for screen interactive system
CN106612425A (en) * 2015-10-23 2017-05-03 腾讯科技(深圳)有限公司 Image adjusting method and terminal equipment
CN107092431A (en) * 2011-06-03 2017-08-25 索尼公司 Display control unit and display control method
CN107438839A (en) * 2016-10-25 2017-12-05 深圳市大疆创新科技有限公司 Multimedia editing method, device, and intelligent terminal
CN108349082A (en) * 2015-11-11 2018-07-31 库卡德国有限公司 Method and computer program for the graphic user interface for generating executor program
CN108509134A (en) * 2011-09-28 2018-09-07 搜诺思公司 Method and apparatus for managing zones of a multi-zone media playback system

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
TWI363983B (en) * 2008-04-25 2012-05-11 Benq Corp Interactive electronic apparatus and interaction method thereof
DE102008032451C5 (en) * 2008-07-10 2017-10-19 Rational Ag Display method and cooking appliance therefor
DE102008032448B4 (en) 2008-07-10 2023-11-02 Rational Ag Display method and cooking device therefor
EP2863289A1 (en) 2008-11-18 2015-04-22 Studer Professional Audio GmbH Input device and method of detecting a user input with an input device
US8957865B2 (en) 2009-01-05 2015-02-17 Apple Inc. Device, method, and graphical user interface for manipulating a user interface object
US8330733B2 (en) * 2009-01-21 2012-12-11 Microsoft Corporation Bi-modal multiscreen interactivity
KR20100086678A (en) * 2009-01-23 2010-08-02 삼성전자주식회사 Apparatus and method for playing of multimedia item
CN101799727B (en) * 2009-02-11 2012-11-07 晨星软件研发(深圳)有限公司 Signal processing device and method of multipoint touch interface and selecting method of user interface image
CN101847055A (en) * 2009-03-24 2010-09-29 鸿富锦精密工业(深圳)有限公司 Input method based on touch screen
KR101640463B1 (en) 2009-05-19 2016-07-18 삼성전자 주식회사 Operation Method And Apparatus For Portable Device
KR101646922B1 (en) 2009-05-19 2016-08-23 삼성전자 주식회사 Operation Method of associated with a communication function And Portable Device supporting the same
US8373669B2 (en) * 2009-07-21 2013-02-12 Cisco Technology, Inc. Gradual proximity touch screen
KR101608770B1 (en) * 2009-08-03 2016-04-04 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR20120085783A (en) * 2009-09-23 2012-08-01 딩난 한 Method and interface for man-machine interaction
TWI407359B (en) 2009-10-09 2013-09-01 Egalax Empia Technology Inc Method and device for position detection
CN102043506B (en) 2009-10-09 2013-07-17 禾瑞亚科技股份有限公司 Method and device for analyzing positions
CN102043510A (en) 2009-10-09 2011-05-04 禾瑞亚科技股份有限公司 Method and device for analyzing two-dimension sensing information
US8643613B2 (en) 2009-10-09 2014-02-04 Egalax—Empia Technology Inc. Method and device for dual-differential sensing
CN102063216B (en) * 2009-10-09 2012-12-12 禾瑞亚科技股份有限公司 Device and method for parallel-scanning differential touch detection
EP2500799A4 (en) 2009-10-09 2014-07-23 Egalax Empia Technology Inc Method and apparatus for converting sensing information
US9864471B2 (en) 2009-10-09 2018-01-09 Egalax_Empia Technology Inc. Method and processor for analyzing two-dimension information
CN102043552B (en) 2009-10-09 2016-04-20 禾瑞亚科技股份有限公司 The method and apparatus of capacitive position detection
JP2011107912A (en) * 2009-11-16 2011-06-02 Sony Corp Apparatus, method and program for processing information
US8786559B2 (en) 2010-01-06 2014-07-22 Apple Inc. Device, method, and graphical user interface for manipulating tables using multi-contact gestures
US9411504B2 (en) 2010-01-28 2016-08-09 Microsoft Technology Licensing, Llc Copy and staple gestures
US8261213B2 (en) 2010-01-28 2012-09-04 Microsoft Corporation Brush, carbon-copy, and fill gestures
US9519356B2 (en) 2010-02-04 2016-12-13 Microsoft Technology Licensing, Llc Link gestures
US9274682B2 (en) 2010-02-19 2016-03-01 Microsoft Technology Licensing, Llc Off-screen gestures to create on-screen input
US9965165B2 (en) 2010-02-19 2018-05-08 Microsoft Technology Licensing, Llc Multi-finger gestures
US9310994B2 (en) 2010-02-19 2016-04-12 Microsoft Technology Licensing, Llc Use of bezel as an input mechanism
US9367205B2 (en) 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
US9454304B2 (en) 2010-02-25 2016-09-27 Microsoft Technology Licensing, Llc Multi-screen dual tap gesture
US8539384B2 (en) * 2010-02-25 2013-09-17 Microsoft Corporation Multi-screen pinch and expand gestures
US9075522B2 (en) 2010-02-25 2015-07-07 Microsoft Technology Licensing, Llc Multi-screen bookmark hold gesture
CN102193714A (en) * 2010-03-11 2011-09-21 龙旗科技(上海)有限公司 Man-machine interactive mode for data grouping management of mobile terminal
JP5557314B2 (en) * 2010-03-24 2014-07-23 Necカシオモバイルコミュニケーションズ株式会社 Terminal device and program
US9990062B2 (en) * 2010-03-26 2018-06-05 Nokia Technologies Oy Apparatus and method for proximity based input
CN102270081B (en) * 2010-06-03 2015-09-23 腾讯科技(深圳)有限公司 A kind of method and device adjusting size of list element
US8773370B2 (en) 2010-07-13 2014-07-08 Apple Inc. Table editing systems with gesture-based insertion and deletion of columns and rows
KR20120020247A (en) * 2010-08-27 2012-03-08 삼성전자주식회사 Portable electronic device, apparatus and method for playing contents
US8767019B2 (en) 2010-08-31 2014-07-01 Sovanta Ag Computer-implemented method for specifying a processing operation
US8972467B2 (en) 2010-08-31 2015-03-03 Sovanta Ag Method for selecting a data set from a plurality of data sets by means of an input device
CA2823388A1 (en) * 2011-01-06 2012-07-12 Tivo Inc. Method and apparatus for gesture based controls
CN102799299B (en) * 2011-05-27 2015-11-25 华硕电脑股份有限公司 Computer system with touch screen and gesture processing method thereof
CN102243573A (en) * 2011-07-27 2011-11-16 北京风灵创景科技有限公司 Method and device for managing element attribute in application program
CN102243662A (en) * 2011-07-27 2011-11-16 北京风灵创景科技有限公司 Method for displaying browser interface on mobile equipment
KR101262700B1 (en) * 2011-08-05 2013-05-08 삼성전자주식회사 Method for Controlling Electronic Apparatus based on Voice Recognition and Motion Recognition, and Electric Apparatus thereof
EP2555536A1 (en) 2011-08-05 2013-02-06 Samsung Electronics Co., Ltd. Method for controlling electronic apparatus based on voice recognition and motion recognition, and electronic apparatus applying the same
JP6004716B2 (en) 2012-04-13 2016-10-12 キヤノン株式会社 Information processing apparatus, control method therefor, and computer program
CN102750096A (en) * 2012-06-15 2012-10-24 深圳乐投卡尔科技有限公司 Vehicle-mounted Android platform multi-point gesture control method
US9582122B2 (en) 2012-11-12 2017-02-28 Microsoft Technology Licensing, Llc Touch-sensitive bezel techniques
CN104035696B (en) * 2013-03-04 2017-12-19 观致汽车有限公司 Display methods and device of the vehicle-mounted message center in touch display interface
US10180728B2 (en) 2013-05-17 2019-01-15 Citrix Systems, Inc. Remoting or localizing touch gestures at a virtualization client agent
US9477337B2 (en) 2014-03-14 2016-10-25 Microsoft Technology Licensing, Llc Conductive trace routing for display and bezel sensors
CN105224220A (en) * 2015-09-08 2016-01-06 深圳市金立通信设备有限公司 Media playback control method and device
CN108616771B (en) * 2018-04-25 2021-01-15 维沃移动通信有限公司 Video playing method and mobile terminal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5900875A (en) * 1997-01-29 1999-05-04 3Com Corporation Method and apparatus for interacting with a portable computer system

Cited By (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8839108B2 (en) 2010-01-06 2014-09-16 Samsung Electronics Co., Ltd. Method and apparatus for selecting a section of a multimedia file with a progress indicator in a mobile device
CN102714514B (en) * 2010-01-06 2014-12-03 三星电子株式会社 Method and apparatus for setting section of a multimedia file in mobile device
CN102714514A (en) * 2010-01-06 2012-10-03 三星电子株式会社 Method and apparatus for setting section of a multimedia file in mobile device
WO2011140956A1 (en) * 2010-05-08 2011-11-17 杭州惠道科技有限公司 Multipoint touch method on circular touch-sensitive man-machine interface
CN102331902A (en) * 2010-06-02 2012-01-25 洛克威尔自动控制技术股份有限公司 System and method for the operation of a touch screen
CN102331902B (en) * 2010-06-02 2015-02-11 洛克威尔自动控制技术股份有限公司 System and method for the operation of a touch screen
US9659015B2 (en) 2010-06-14 2017-05-23 Lg Electronics Inc. Mobile terminal and controlling method thereof
CN102281349A (en) * 2010-06-14 2011-12-14 Lg电子株式会社 Mobile terminal and controlling method thereof
CN102281349B (en) * 2010-06-14 2015-03-18 Lg电子株式会社 Mobile terminal and controlling method thereof
CN101957718A (en) * 2010-06-22 2011-01-26 宇龙计算机通信科技(深圳)有限公司 Method and device for moving icons and digital terminal
CN101986249A (en) * 2010-07-14 2011-03-16 上海无戒空间信息技术有限公司 Method for controlling computer by using gesture object and corresponding computer system
CN103168284A (en) * 2010-08-27 2013-06-19 苹果公司 Touch and hover switching
CN103168284B (en) * 2010-08-27 2016-08-31 苹果公司 Touch and hover switching
US9710154B2 (en) 2010-09-03 2017-07-18 Microsoft Technology Licensing, Llc Dynamic gesture parameters
US9983784B2 (en) 2010-09-03 2018-05-29 Microsoft Technology Licensing, Llc Dynamic gesture parameters
CN102402286A (en) * 2010-09-03 2012-04-04 微软公司 Dynamic gesture parameters
CN101945499A (en) * 2010-09-06 2011-01-12 深圳市同洲电子股份有限公司 Method, terminal and system for transferring files
US9699286B2 (en) 2010-09-30 2017-07-04 Lg Electronics Inc. Mobile terminal and method of controlling mobile terminal to display image upon receiving proximity touch input
CN102446060B (en) * 2010-09-30 2016-01-06 Lg电子株式会社 Mobile terminal and method of controlling mobile terminal
CN102446060A (en) * 2010-09-30 2012-05-09 Lg电子株式会社 Mobile terminal and method of controlling mobile terminal
CN102467327A (en) * 2010-11-10 2012-05-23 上海无戒空间信息技术有限公司 Method for generating and editing gesture object and operation method of audio data
CN102025831A (en) * 2010-11-18 2011-04-20 华为终端有限公司 Multimedia playing method and terminal
CN102609184A (en) * 2010-12-29 2012-07-25 三星电子株式会社 Method and apparatus for providing mouse right click function in touch screen terminal
CN102681748B (en) * 2011-03-09 2015-01-28 联想(北京)有限公司 Information processing equipment and information processing method
CN102681748A (en) * 2011-03-09 2012-09-19 联想(北京)有限公司 Information processing equipment and information processing method
CN103582865A (en) * 2011-05-31 2014-02-12 三星电子株式会社 Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same
CN103582865B (en) * 2011-05-31 2017-05-03 三星电子株式会社 Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same
US9281010B2 (en) 2011-05-31 2016-03-08 Samsung Electronics Co., Ltd. Timeline-based content control method and apparatus using dynamic distortion of timeline bar, and method and apparatus for controlling video and audio clips using the same
CN107092431A (en) * 2011-06-03 2017-08-25 索尼公司 Display control unit and display control method
WO2013037259A1 (en) * 2011-09-15 2013-03-21 北京同步科技有限公司 Interactive multimedia information distribution terminal and information distribution method thereof
CN103001933A (en) * 2011-09-15 2013-03-27 北京同步科技有限公司 Interactive multimedia information distribution terminal and information distribution method thereof
CN102890694A (en) * 2011-09-22 2013-01-23 北京师科阳光信息技术有限公司 Time shaft system and implementation method thereof
CN108509134B (en) * 2011-09-28 2020-06-26 搜诺思公司 Method and apparatus for managing zones of a multi-zone media playback system
CN108509134A (en) * 2011-09-28 2018-09-07 搜诺思公司 Method and apparatus for managing zones of a multi-zone media playback system
US11520464B2 (en) 2011-09-28 2022-12-06 Sonos, Inc. Playback zone management
CN103092389A (en) * 2011-11-04 2013-05-08 德尔福技术有限公司 Touch screen device and method for achieving virtual mouse action
CN103218151A (en) * 2011-11-25 2013-07-24 三星电子株式会社 Device and method for displaying object in terminal
CN103218151B (en) * 2011-11-25 2018-06-29 三星电子株式会社 Device and method for displaying object in terminal
CN103247310A (en) * 2012-02-14 2013-08-14 索尼爱立信移动通讯有限公司 Multimedia playing control method, playing control module and playing terminal
CN102609143A (en) * 2012-02-15 2012-07-25 张群 Handheld electronic equipment and video playing and controlling method thereof
CN103324433A (en) * 2012-03-23 2013-09-25 佳能株式会社 Display control apparatus and control method for the same
US9519365B2 (en) 2012-03-23 2016-12-13 Canon Kabushiki Kaisha Display control apparatus and control method for the same
CN104185831B (en) * 2012-04-02 2017-07-04 辛纳普蒂克斯公司 Systems and methods for dynamically modulating a user interface parameter using an input device
CN104185831A (en) * 2012-04-02 2014-12-03 辛纳普蒂克斯公司 Systems and methods for dynamically modulating a user interface parameter using an input device
WO2013178152A1 (en) * 2012-08-28 2013-12-05 中兴通讯股份有限公司 Method for copying and pasting text through dragging for terminal, and terminal
CN104737112A (en) * 2012-10-16 2015-06-24 微软公司 Thumbnail and document map based navigation in a document
CN104737112B (en) * 2012-10-16 2018-01-05 微软技术许可有限责任公司 Thumbnail and document map based navigation in a document
CN103794228A (en) * 2012-10-29 2014-05-14 仁宝电脑工业股份有限公司 Electronic apparatus with proximity sensing capability and proximity sensing control method therefor
CN103035273A (en) * 2012-12-12 2013-04-10 宁波高新区百瑞音响科技有限公司 Device by utilizing knob type digital coding switch for switching audio files
CN103035273B (en) * 2012-12-12 2016-05-25 宁波高新区百瑞音响科技有限公司 A kind of device that utilizes knob type digital code switch to switch audio file
CN104838340A (en) * 2012-12-14 2015-08-12 三星电子株式会社 Method and apparatus for controlling haptic feedback of input tool for mobile terminal
CN104838340B (en) * 2012-12-14 2018-10-19 三星电子株式会社 Method and apparatus for controlling haptic feedback of input tool for mobile terminal
CN103885623A (en) * 2012-12-24 2014-06-25 腾讯科技(深圳)有限公司 Mobile terminal, system and method for processing sliding event into editing gesture
WO2014101512A1 (en) * 2012-12-24 2014-07-03 腾讯科技(深圳)有限公司 Method and system for processing swipe event, and mobile terminal
CN103902173B (en) * 2012-12-26 2017-12-26 联想(北京)有限公司 Portable terminal and its information processing method and display processing method
CN103902173A (en) * 2012-12-26 2014-07-02 联想(北京)有限公司 Portable terminal and information processing method and displaying processing method of portable terminal
CN103076985B (en) * 2013-01-31 2016-03-02 北京魔力时间科技有限公司 Video playing progress precisely controlling and displaying device based on touch screen and use method thereof
CN103076985A (en) * 2013-01-31 2013-05-01 北京魔力时间科技有限公司 Video playing progress precisely controlling and displaying device based on touch screen and use method thereof
CN105009059A (en) * 2013-02-27 2015-10-28 阿尔卑斯电气株式会社 Operation detection device
CN105009059B (en) * 2013-02-27 2018-11-02 阿尔卑斯电气株式会社 Operation detection device
CN105247463B (en) * 2013-03-03 2018-11-06 微软技术许可有限责任公司 Enhanced canvas environments
CN105247463A (en) * 2013-03-03 2016-01-13 微软技术许可有限责任公司 Enhanced canvas environments
US11209975B2 (en) 2013-03-03 2021-12-28 Microsoft Technology Licensing, Llc Enhanced canvas environments
CN104123088B (en) * 2013-04-24 2018-01-02 华为技术有限公司 Mouse operation implementing method and device and touch screen terminal
CN104123088A (en) * 2013-04-24 2014-10-29 华为技术有限公司 Mouse operation implementing method and device and touch screen terminal
CN104216625A (en) * 2013-05-31 2014-12-17 华为技术有限公司 Display object display position adjusting method and terminal equipment
CN103327247A (en) * 2013-06-17 2013-09-25 山东神思电子技术股份有限公司 Portrait collection operation device and method
CN104423799A (en) * 2013-08-23 2015-03-18 夏普株式会社 Interface device and interface method
CN104423799B (en) * 2013-08-23 2019-09-24 夏普株式会社 Interface device and interface method
CN104581378A (en) * 2013-10-16 2015-04-29 三星电子株式会社 Apparatus and method for editing synchronous media
CN104581378B (en) * 2013-10-16 2018-09-11 三星电子株式会社 Device and method for editing synchronized multimedia
CN104902331A (en) * 2014-03-07 2015-09-09 联想(北京)有限公司 Play progress regulating method and electronic equipment
CN104902331B (en) * 2014-03-07 2018-08-10 联想(北京)有限公司 Play progress adjusting method and electronic device
CN104077028A (en) * 2014-05-28 2014-10-01 天津三星通信技术研究有限公司 Equipment and method for controlling display item in electronic equipment
CN105653111A (en) * 2014-11-14 2016-06-08 神讯电脑(昆山)有限公司 Touch control input method and electronic device thereof
CN104571871A (en) * 2015-01-26 2015-04-29 深圳市中兴移动通信有限公司 Method and system for selecting files
CN105045513B (en) * 2015-08-27 2019-02-12 Oppo广东移动通信有限公司 Touch operation method and handheld device
CN105045513A (en) * 2015-08-27 2015-11-11 广东欧珀移动通信有限公司 Touch operation method and handheld device
CN106612425B (en) * 2015-10-23 2019-04-12 腾讯科技(深圳)有限公司 Image adjusting method and terminal device
CN106612425A (en) * 2015-10-23 2017-05-03 腾讯科技(深圳)有限公司 Image adjusting method and terminal equipment
CN108349082A (en) * 2015-11-11 2018-07-31 库卡德国有限公司 Method and computer program for the graphic user interface for generating executor program
US10940583B2 (en) 2015-11-11 2021-03-09 Kuka Deutschland Gmbh Method and computer program for producing a graphical user interface of a manipulator program
CN108349082B (en) * 2015-11-11 2022-02-08 库卡德国有限公司 Method and computer program for generating a graphical user interface for a manipulator program
CN105573616A (en) * 2015-12-10 2016-05-11 广东欧珀移动通信有限公司 Playlist control method and mobile terminal
CN105573631A (en) * 2015-12-14 2016-05-11 联想(北京)有限公司 Touch display electronic device and control method thereof
CN106527917A (en) * 2016-09-23 2017-03-22 北京仁光科技有限公司 Multi-finger touch operation identification method for screen interactive system
CN106527917B (en) * 2016-09-23 2020-09-29 北京仁光科技有限公司 Multi-finger touch operation identification method for screen interaction system
CN107438839A (en) * 2016-10-25 2017-12-05 深圳市大疆创新科技有限公司 Multimedia editing method, device, and intelligent terminal

Also Published As

Publication number Publication date
CN201181467Y (en) 2009-01-14
DE202007014957U1 (en) 2007-12-27
CN101611373B (en) 2014-01-29

Similar Documents

Publication Publication Date Title
CN101611373B (en) Controlling, manipulating, and editing gestures of media files using touch sensitive devices
CN103631496B (en) Attitude using touch-sensitive device control, manipulation and editing media file
US20220334689A1 (en) Music now playing user interface
KR102174225B1 (en) Devices and methods for navigating between user interfaces
CN201266371Y (en) Handhold mobile communication equipment
CN101482794B (en) Mode-based graphical user interfaces for touch sensitive input devices
US10331297B2 (en) Device, method, and graphical user interface for navigating a content hierarchy
US10613732B2 (en) Selecting content items in a user interface display
AU2011253700B2 (en) Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
TWI432015B (en) Intelligent input method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant