CN205942662U - Electronic device and apparatus for grouping a plurality of images - Google Patents
- Publication number
- CN205942662U (publication) · CN201620470063.8U (application)
- Authority
- CN
- China
- Prior art keywords
- image
- time
- image sequence
- activation
- release button
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Studio Devices (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure provides an electronic device and an apparatus for grouping a plurality of images. An electronic device includes a camera. While in a first media capture mode for the camera, the device displays a live preview on the display. While the live preview is displayed, the device detects activation of the shutter button. In response to detecting activation of the shutter button, the device groups a plurality of images acquired by the camera at times proximate to the activation of the shutter button into a sequence of images. The sequence of images comprises: a plurality of images acquired by the camera prior to detecting activation of the shutter button; a representative image that represents the sequence of images and was acquired by the camera after one or more of the other images in the sequence of images; and a plurality of images acquired by the camera after acquiring the representative image.
Description
Technical field
The present disclosure relates generally to electronic devices with touch-sensitive surfaces, including but not limited to electronic devices with touch-sensitive surfaces that capture, display, and/or otherwise manipulate digital content taken or recorded by cameras.
Background technology
The use of electronic devices for capturing, viewing, editing, and sharing digital content has increased significantly in recent years. Users frequently record digital content (e.g., images and/or video) with their portable electronic devices (e.g., smart phones, tablet computers, and dedicated digital cameras); view and edit their digital content in image management applications (e.g., "Photos" from Apple Inc. of Cupertino, California) and/or digital content management applications (e.g., iTunes from Apple Inc. of Cupertino, California); and share their digital content with others through instant messages, email, social media applications, and other communication applications.

Portable electronic devices typically capture two types of digital content: still images and video. A still image is typically captured by simply pressing a shutter button. The still image freezes a moment in time, but details of the moments surrounding that frozen moment are lost. Video records an extended period of time, which may include both interesting moments and not-so-interesting moments, and significant editing is typically required to remove the less interesting moments.
Utility model content
Accordingly, there is a need for electronic devices with improved interfaces for capturing, and interacting with, moments taken or recorded by a camera. Such interfaces optionally complement or replace conventional methods for capturing, and interacting with, still images and video.

The disclosed devices expand photography beyond still images by providing new and improved ways for users to capture moments and interact with them. In some embodiments, the device is a desktop computer. In some embodiments, the device is portable (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the device is a personal electronic device (e.g., a wearable electronic device, such as a watch). In some embodiments, the device has a touchpad. In some embodiments, the device has a touch-sensitive display (also known as a "touch screen" or "touch-screen display"). In some embodiments, the device has a graphical user interface (GUI), one or more processors, memory, and one or more modules, programs, or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through stylus and/or finger contacts and gestures on the touch-sensitive surface. In some embodiments, the functions optionally include image editing, drawing, presenting, word processing, spreadsheet making, game playing, telephoning, video conferencing, emailing, instant messaging, workout support, digital photography, digital video recording, web browsing, digital music playing, note taking, and/or digital video playing. Executable instructions for performing these functions are optionally included in a non-transitory computer-readable storage medium or other computer program product configured for execution by one or more processors.
An electronic device is provided, characterized by including: a display unit configured to display a live preview; a camera unit configured to acquire images; and a processing unit coupled with the display unit and the camera unit. The processing unit is configured to, while in a first media acquisition mode for the camera unit: display the live preview on the display unit; while displaying the live preview, detect activation of a shutter button at a first time; and, in response to detecting the activation of the shutter button at the first time, group a plurality of images acquired by the camera unit in temporal proximity to the activation of the shutter button at the first time into a first image sequence, where the first image sequence includes: a plurality of images acquired by the camera unit prior to detecting the activation of the shutter button at the first time; a representative image that represents the first image sequence and was acquired by the camera unit after one or more of the other images in the first image sequence; and a plurality of images acquired by the camera unit after acquiring the representative image.
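To make the grouping just described concrete, the following is a minimal sketch (not the patent's implementation) of one way such a capture session could work: frames are buffered continuously while the live preview is shown, and, when activation of the shutter button is detected, pre-activation frames, a representative image, and post-activation frames are grouped into a single sequence. All type and member names (Frame, ImageSequence, EnhancedCaptureSession, and the 15-frame counts) are hypothetical.

```swift
import Foundation

// Hypothetical frame record: captured image data plus its acquisition time.
struct Frame {
    let imageData: Data
    let timestamp: Date
}

// A grouped sequence: images acquired before the shutter activation,
// a representative image, and images acquired after the representative image.
struct ImageSequence {
    let preActivationImages: [Frame]
    let representativeImage: Frame
    let postActivationImages: [Frame]
}

final class EnhancedCaptureSession {
    private var rollingBuffer: [Frame] = []   // filled while the live preview is shown
    private let preCount = 15                 // predefined number of pre-activation images
    private let postCount = 15                // predefined number of post-activation images

    // Called for every frame delivered while the live preview is displayed,
    // independently of any shutter-button activation.
    func didCaptureFrame(_ frame: Frame) {
        rollingBuffer.append(frame)
        if rollingBuffer.count > preCount * 4 {
            // Frames that never end up grouped into a sequence are eventually discarded.
            rollingBuffer.removeFirst(rollingBuffer.count - preCount * 4)
        }
    }

    // Called when activation of the shutter button is detected at `activationTime`.
    // `framesAfterActivation` holds the frames the camera keeps acquiring afterwards.
    func shutterActivated(at activationTime: Date,
                          framesAfterActivation: [Frame]) -> ImageSequence? {
        let before = Array(rollingBuffer
            .filter { $0.timestamp <= activationTime }
            .suffix(preCount))
        // Representative image: acquired after the other pre-activation images;
        // here simply the first frame at or after the activation time.
        guard let representative = framesAfterActivation.first else { return nil }
        let after = Array(framesAfterActivation.dropFirst().prefix(postCount))
        return ImageSequence(preActivationImages: before,
                             representativeImage: representative,
                             postActivationImages: after)
    }
}
```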
Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is a predefined number of images.

Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is images within a predefined time before the first time.

Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is images within a predefined time before the time at which the representative image was acquired.

Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is from a time range between the first time and a second time that precedes the first time, and the plurality of images is acquired prior to detecting activation of the shutter button at the first time independently of detecting any interaction with the shutter button temporally proximate to the second time.

Optionally, the plurality of images in the first image sequence acquired prior to detecting activation of the shutter button at the first time meets one or more predefined grouping criteria.

Optionally, the predefined grouping criteria include selecting a predefined number of images prior to detecting activation of the shutter button.

Optionally, the predefined grouping criteria include selecting a predefined number of images prior to the representative image.

Optionally, the predefined grouping criteria include selecting images in a predefined range of time immediately preceding detection of the activation of the shutter button.

Optionally, the predefined grouping criteria include selecting images in a predefined range of time immediately preceding the time at which the representative image was acquired.
Optionally, the device begins acquiring and storing images upon entering the first media acquisition mode, and afterwards deletes those images that, while in the first media acquisition mode, were not grouped into a respective plurality of images corresponding to an activation of the shutter button temporally proximate to a respective time.

Optionally, the device begins acquiring and storing images upon displaying the live preview, and afterwards deletes those images that, while in the first media acquisition mode, were not grouped into a respective plurality of images corresponding to an activation of the shutter button temporally proximate to a respective time.

Optionally, the device acquires and stores images while displaying the live preview, independently of detecting activation of the shutter button, and afterwards deletes those acquired and stored images that, while in the first media acquisition mode, were not grouped into a respective plurality of images corresponding to an activation of the shutter button temporally proximate to a respective time.
Optionally, the first image sequence is stored in memory as a first distinct set of images.

Optionally, the live preview displays images at a first resolution, and the first image sequence includes images, at the first resolution, that were shown in the live preview.

Optionally, the representative image acquired by the camera unit has a second resolution that is higher than the first resolution.

Optionally, the processing unit is configured to: in response to detecting the activation of the shutter button at the first time, associate audio that corresponds to the first image sequence with the first image sequence.

Optionally, the processing unit is configured to: in response to detecting the activation of the shutter button at the first time, associate metadata that corresponds to the first image sequence with the first image sequence.

Optionally, the first media acquisition mode is configured to be enabled or disabled by a user of the device.

Optionally, the live preview is displayed as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode; the affordance is animated when the first media acquisition mode is enabled; and the affordance is not animated when the first media acquisition mode is disabled.

Optionally, parameters of a respective image sequence that is grouped in response to detecting a respective activation of the shutter button are configurable by a user of the device.
Optionally, the live preview is displayed as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode; the shutter button is a software button displayed in the media capture user interface; and the processing unit is configured to: in response to detecting the activation of the shutter button, display on the display unit an animation associated with the shutter button, the animation lasting for an amount of time that corresponds to the amount of time during which the camera unit acquires images for the first image sequence after the activation of the shutter button.

Optionally, the plurality of images acquired by the camera unit prior to detecting the activation of the shutter button at the first time is stored in memory in a first form prior to detecting the activation of the shutter button at the first time, and is stored in the memory in a second form in response to detecting the activation of the shutter button at the first time.

Optionally, the processing unit is configured to: after detecting the activation of the shutter button at the first time, detect a next activation of the shutter button at a second time; and, in response to detecting the next activation of the shutter button at the second time, group a plurality of images acquired by the camera unit in temporal proximity to the activation of the shutter button at the second time into a second image sequence, where the second image sequence includes: a plurality of images acquired by the camera unit prior to detecting the activation of the shutter button at the second time; and a representative image that represents the second image sequence and was acquired by the camera unit after one or more of the other images in the second image sequence.
Optionally, the processing unit is configured to automatically exclude blurred images from the first image sequence.
Optionally, the first image sequence includes: an initial image in the first image sequence, a first number of images acquired between the initial image and the representative image, a final image in the first image sequence, and a second number of images acquired between the representative image and the final image; and the processing unit is configured to: detect an input that corresponds to a request to change the representative image in the first image sequence; and, in response to detecting the input that corresponds to the request to change the representative image in the first image sequence: change the representative image to a corrected representative image in accordance with the detected input; and change the grouped plurality of images in the first image sequence, by adding images at one end of the first image sequence and deleting images at the other end of the first image sequence in accordance with the detected input, so that the first image sequence has a corrected initial image and a corrected final image.
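One plausible reading of this window shift, sketched below with hypothetical names and reusing the Frame type from the earlier sketch: when the user chooses a new representative image, the grouped window over the retained frames slides so that images are added at one end and deleted at the other, yielding a corrected initial image and a corrected final image.

```swift
// Hypothetical helper. `retainedFrames` holds, in capture order, every frame
// retained around the shutter activation; `currentRange` is the slice that is
// currently grouped as the first image sequence.
func shiftSequence(retainedFrames: [Frame],
                   currentRange: Range<Int>,
                   currentRepresentativeIndex: Int,
                   newRepresentativeIndex: Int) -> Range<Int> {
    // Moving the representative image later by `delta` adds images at the end
    // of the sequence and deletes the same number at the beginning (and the
    // reverse when moving it earlier), so the sequence keeps its length but
    // gains a corrected initial image and a corrected final image.
    let delta = newRepresentativeIndex - currentRepresentativeIndex
    let length = currentRange.count
    var newStart = currentRange.lowerBound + delta
    newStart = max(0, min(newStart, retainedFrames.count - length))
    return newStart ..< newStart + length
}
```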
Optionally, the display unit is a touch-sensitive display unit, and the processing unit is configured to: receive a request to display the representative image from the first image sequence; in response to receiving the request to display the representative image, display the representative image on the touch-sensitive display unit; while displaying the representative image, receive a touch input on the representative image on the touch-sensitive display unit, the touch input including a characteristic that changes over time; and, in response to receiving the touch input on the representative image on the touch-sensitive display unit, display the images in the first image sequence on the touch-sensitive display unit at a rate that is determined based on the characteristic of the touch input as it changes over time.
An apparatus for grouping a plurality of images is provided, characterized by including: means, enabled while in a first media capture mode for a camera of an electronic device that includes a display, including: means for displaying a live preview on the display; means for detecting activation of a shutter button at a first time while displaying the live preview; and means, responsive to detecting the activation of the shutter button at the first time, for grouping a plurality of images acquired by the camera in temporal proximity to the activation of the shutter button at the first time into a first image sequence, where the first image sequence includes: a plurality of images acquired by the camera prior to detecting the activation of the shutter button at the first time; a representative image that represents the first image sequence and was acquired by the camera after one or more of the other images in the first image sequence; and a plurality of images acquired by the camera after acquiring the representative image.
Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is a predefined number of images.

Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is images within a predefined time before the first time.

Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is images within a predefined time before the time at which the representative image was acquired.

Optionally, the plurality of images acquired prior to detecting activation of the shutter button at the first time is from a time range between the first time and a second time that precedes the first time, and the plurality of images is acquired prior to detecting activation of the shutter button at the first time independently of detecting any interaction with the shutter button temporally proximate to the second time.

Optionally, the plurality of images in the first image sequence acquired prior to detecting activation of the shutter button at the first time meets one or more predefined grouping criteria.

Optionally, the predefined grouping criteria include selecting a predefined number of images prior to detecting activation of the shutter button.

Optionally, the predefined grouping criteria include selecting a predefined number of images prior to the representative image.

Optionally, the predefined grouping criteria include selecting images in a predefined range of time immediately preceding detection of the activation of the shutter button.

Optionally, the predefined grouping criteria include selecting images in a predefined range of time immediately preceding the time at which the representative image was acquired.
Optionally, the device begins acquiring and storing images upon entering the first media acquisition mode, and afterwards deletes those images that, while in the first media acquisition mode, were not grouped into a respective plurality of images corresponding to an activation of the shutter button temporally proximate to a respective time.

Optionally, the device begins acquiring and storing images upon displaying the live preview, and afterwards deletes those images that, while in the first media acquisition mode, were not grouped into a respective plurality of images corresponding to an activation of the shutter button temporally proximate to a respective time.

Optionally, the device acquires and stores images while displaying the live preview, independently of detecting activation of the shutter button, and afterwards deletes those acquired and stored images that, while in the first media acquisition mode, were not grouped into a respective plurality of images corresponding to an activation of the shutter button temporally proximate to a respective time.
Optionally, the first image sequence is stored in memory as a first distinct set of images.

Optionally, the live preview displays images at a first resolution, and the first image sequence includes images, at the first resolution, that were shown in the live preview.

Optionally, the representative image acquired by the camera has a second resolution that is higher than the first resolution.

Optionally, the apparatus includes: means, responsive to detecting the activation of the shutter button at the first time, for associating audio that corresponds to the first image sequence with the first image sequence.

Optionally, the apparatus includes: means, responsive to detecting the activation of the shutter button at the first time, for associating metadata that corresponds to the first image sequence with the first image sequence.
Optionally, the first media acquisition mode is configured to be enabled or disabled by a user of the device.

Optionally, the live preview is displayed as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode; the affordance is animated when the first media acquisition mode is enabled; and the affordance is not animated when the first media acquisition mode is disabled.

Optionally, parameters of a respective image sequence that is grouped in response to detecting a respective activation of the shutter button are configurable by a user of the device.

Optionally, the live preview is displayed as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode; the shutter button is a software button displayed in the media capture user interface; and the apparatus includes: means, responsive to detecting the activation of the shutter button, for displaying an animation associated with the shutter button, the animation lasting for an amount of time that corresponds to the amount of time during which the camera acquires images for the first image sequence after the activation of the shutter button.
Optionally, the plurality of images acquired by the camera prior to detecting the activation of the shutter button at the first time is stored in memory in a first form prior to detecting the activation of the shutter button at the first time, and is stored in the memory in a second form in response to detecting the activation of the shutter button at the first time.

Optionally, the apparatus includes: means for detecting, after detecting the activation of the shutter button at the first time, a next activation of the shutter button at a second time; and means, responsive to detecting the next activation of the shutter button at the second time, for grouping a plurality of images acquired by the camera in temporal proximity to the activation of the shutter button at the second time into a second image sequence, where the second image sequence includes: a plurality of images acquired by the camera prior to detecting the activation of the shutter button at the second time; and a representative image that represents the second image sequence and was acquired by the camera after one or more of the other images in the second image sequence.
Optionally, the apparatus includes means for automatically excluding blurred images from the first image sequence.
Optionally, the first image sequence includes: an initial image in the first image sequence, a first number of images acquired between the initial image and the representative image, a final image in the first image sequence, and a second number of images acquired between the representative image and the final image; and the apparatus includes: means for detecting an input that corresponds to a request to change the representative image in the first image sequence; and, responsive to detecting the input that corresponds to the request to change the representative image in the first image sequence: means for changing the representative image to a corrected representative image in accordance with the detected input; and means for changing the grouped plurality of images in the first image sequence, by adding images at one end of the first image sequence and deleting images at the other end of the first image sequence in accordance with the detected input, so that the first image sequence has a corrected initial image and a corrected final image.
Optionally, the display is a touch-sensitive display, and the apparatus includes: means for receiving a request to display the representative image from the first image sequence; means, responsive to receiving the request to display the representative image, for displaying the representative image on the touch-sensitive display; means for receiving, while displaying the representative image, a touch input on the representative image on the touch-sensitive display, the touch input including a characteristic that changes over time; and means, responsive to receiving the touch input on the representative image on the touch-sensitive display, for displaying the images in the first image sequence at a rate that is determined based on the characteristic of the touch input as it changes over time.
According to some embodiments, an electronic device includes: a display unit configured to display a live preview; a camera unit configured to acquire images; and a processing unit coupled with the display unit and the camera unit. The processing unit is configured to display the live preview on the display unit while in a first media acquisition mode for the camera unit. The processing unit is further configured to detect activation of a shutter button at a first time while displaying the live preview, and, in response to detecting the activation of the shutter button at the first time, to group a plurality of images acquired by the camera unit in temporal proximity to the activation of the shutter button at the first time into a first image sequence. The first image sequence includes: a plurality of images acquired by the camera unit prior to detecting the activation of the shutter button at the first time; a representative image that represents the first image sequence and was acquired by the camera unit after one or more of the other images in the first image sequence; and a plurality of images acquired by the camera unit after acquiring the representative image.
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to display a representative image on the display unit. The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. The processing unit is further configured to detect a first part of a first input while displaying the representative image on the display unit. The processing unit is further configured to, in response to detecting the first part of the first input, replace the display of the representative image with a sequential display, on the display unit, of the one or more images acquired by the camera after acquiring the representative image. The processing unit is further configured to detect a second part of the first input after detecting the first part of the first input. The processing unit is further configured to, in response to detecting the second part of the first input, sequentially display on the display unit the one or more images acquired by the camera before acquiring the representative image, the representative image, and the one or more images acquired by the camera after acquiring the representative image.
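As a rough illustration of this two-part playback, the sketch below reuses the hypothetical ImageSequence and Frame types from the earlier sketch; frame timing and animation are omitted, and the class and method names are invented.

```swift
// Hypothetical playback helper. `display` is an injected closure that draws one image.
final class SequencePlayer {
    private let sequence: ImageSequence
    private let display: (Frame) -> Void

    init(sequence: ImageSequence, display: @escaping (Frame) -> Void) {
        self.sequence = sequence
        self.display = display
    }

    // First part of the input: replace display of the representative image with
    // sequential display of the images acquired after it.
    func handleFirstPartOfInput() {
        sequence.postActivationImages.forEach(display)
    }

    // Second part of the input: sequentially display the whole sequence --
    // the images acquired before the representative image, the representative
    // image itself, and then the images acquired after it.
    func handleSecondPartOfInput() {
        (sequence.preActivationImages
            + [sequence.representativeImage]
            + sequence.postActivationImages).forEach(display)
    }
}
```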
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to enable display of a representative image on the display unit. The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. The processing unit is further configured to detect a first part of a first input while enabling display of the representative image on the display unit. The processing unit is further configured to, in response to detecting the first part of the first input: transition from displaying the representative image to displaying a respective previous image in the sequence of images, where the respective previous image was acquired by the camera before the representative image; and, after transitioning from displaying the representative image to displaying the respective previous image, enable sequential display, starting with the respective previous image, of at least some of the one or more images acquired by the camera before acquiring the representative image and at least some of the one or more images acquired by the camera after acquiring the representative image.
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; one or more sensor units configured to detect the intensity of contacts with the touch-sensitive surface unit; and a processing unit coupled with the display unit, the touch-sensitive surface unit, and the one or more sensor units. The processing unit is configured to enable display of a representative image on the display unit. The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The processing unit is further configured to, while enabling display of the representative image on the display unit, detect a first input, which includes detecting an increase in the characteristic intensity of a contact on the touch-sensitive surface unit to a first intensity that is greater than a first intensity threshold. The processing unit is further configured to, in response to detecting the increase in the characteristic intensity of the contact, advance, in a first direction, through the one or more images acquired by the camera after acquiring the representative image at a rate that is determined based at least in part on the first intensity. The processing unit is further configured to, after advancing through the one or more images acquired by the camera after acquiring the representative image at the rate determined based at least in part on the first intensity, detect a decrease in the intensity of the contact to a second intensity that is less than the first intensity. The processing unit is further configured to, in response to detecting the decrease in the characteristic intensity of the contact to the second intensity: in accordance with a determination that the second intensity is above the first intensity threshold, continue to advance, in the first direction, through the one or more images acquired by the camera after acquiring the representative image at a second rate, where the second rate is determined based at least in part on the second intensity and is slower than the first rate; and, in accordance with a determination that the second intensity is below the first intensity threshold, move, in a second direction that is opposite to the first direction, through the one or more images acquired by the camera after acquiring the representative image at a rate that is determined based at least in part on the second intensity.
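The intensity-to-rate mapping described above could look roughly like the following sketch; the threshold, maximum rate, and scaling are assumptions for illustration, not values taken from the patent. Positive return values mean advancing forward through the images, negative values mean moving backward.

```swift
// Hypothetical mapping from contact intensity to a playback rate, expressed
// as frames per second (positive = forward, negative = backward).
func playbackRate(forIntensity intensity: Double,
                  threshold: Double,
                  maximumIntensity: Double,
                  maximumRate: Double = 30) -> Double {
    if intensity >= threshold {
        // Above the threshold: advance forward at a rate determined at least
        // in part by the intensity (a harder press advances faster, a lighter
        // but still above-threshold press advances more slowly).
        let fraction = (intensity - threshold) / (maximumIntensity - threshold)
        return max(1, fraction * maximumRate)
    } else {
        // Below the threshold: move in the opposite direction at a rate
        // determined at least in part by the (now lower) intensity.
        let fraction = (threshold - intensity) / threshold
        return -max(1, fraction * maximumRate)
    }
}
```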
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; a memory unit configured to store images; and a processing unit coupled with the display unit, the memory unit, and the touch-sensitive surface unit. The processing unit is configured to store a plurality of image sequences in the memory unit. A respective image sequence includes a respective representative image taken by a camera, one or more images acquired by the camera after acquiring the respective representative image, and one or more images acquired by the camera before acquiring the respective representative image. The processing unit is further configured to display a first representative image for a first image sequence in a movable first area on the display unit. The processing unit is further configured to detect a drag gesture on the touch-sensitive surface unit. The processing unit is further configured to, in accordance with a determination that the drag gesture is in a first direction on the touch-sensitive surface unit: in the movable first area, replace display of the first representative image for the first image sequence with chronological display of at least some of the one or more images for the first image sequence acquired by the camera after acquiring the first representative image for the first image sequence; and move the first area in the first direction on the display unit.
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to store a plurality of image sequences. A respective image sequence includes a respective representative image taken by a camera and one or more images acquired by the camera before acquiring the respective representative image. The processing unit is further configured to enable display of a first representative image for a first image sequence in a movable first area on the display unit. The processing unit is further configured to detect a gesture on the touch-sensitive surface unit, the gesture including movement by a contact that corresponds to movement in a first direction on the display unit. The processing unit is further configured to, in response to detecting the gesture on the touch-sensitive surface unit: move the first area in the first direction on the display unit; move a movable second area in the first direction on the display unit; and, in accordance with a determination that sequence-display criteria are met, while moving the second area in the first direction, enable chronological display in the second area of at least some of the one or more images for a second image sequence acquired by the camera before acquiring a second representative image for the second image sequence.
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; a memory unit configured to store images; and a processing unit coupled with the display unit, the memory unit, and the touch-sensitive surface unit. The processing unit is configured to store a plurality of image sequences in the memory unit. A respective image sequence includes a respective representative image taken by a camera, one or more images acquired by the camera after acquiring the respective representative image, and one or more images acquired by the camera before acquiring the respective representative image. The processing unit is further configured to store, in the memory unit, a plurality of images that are distinct from the images in the plurality of image sequences; a respective image in the plurality of images is not part of an image sequence in the plurality of image sequences. The processing unit is configured to display a first image on the display unit. The processing unit is further configured to detect a first input while displaying the first image on the display unit. The processing unit is further configured to, in response to detecting the first input: in accordance with a determination that the first image is an image in a first image sequence, perform a first operation that includes displaying, on the display unit, at least some of the images in the first image sequence besides the first image; and, in accordance with a determination that the first image is an image in the plurality of images that are distinct from the images in the plurality of image sequences, perform a second operation, distinct from the first operation, involving the first image.
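A minimal sketch of this branching behavior, assuming the hypothetical Frame and ImageSequence types from the earlier sketches and treating the two operations as injected closures; membership is checked by timestamp only because the sketch's Frame type is not Equatable.

```swift
// Hypothetical branch between sequence images and standalone still images.
func handleFirstInput(on firstImage: Frame,
                      imageSequences: [ImageSequence],
                      standaloneImages: [Frame],
                      performFirstOperation: (ImageSequence) -> Void,
                      performSecondOperation: (Frame) -> Void) {
    let belongs: (ImageSequence) -> Bool = { seq in
        seq.representativeImage.timestamp == firstImage.timestamp
            || seq.preActivationImages.contains { $0.timestamp == firstImage.timestamp }
            || seq.postActivationImages.contains { $0.timestamp == firstImage.timestamp }
    }
    if let containingSequence = imageSequences.first(where: belongs) {
        // The first image is part of an image sequence: perform the first operation,
        // e.g. display at least some of the other images in that sequence.
        performFirstOperation(containingSequence)
    } else if standaloneImages.contains(where: { $0.timestamp == firstImage.timestamp }) {
        // The first image is a standalone still image: perform a distinct second operation.
        performSecondOperation(firstImage)
    }
}
```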
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; and a processing unit coupled with the display unit and the touch-sensitive surface unit. The processing unit is configured to enable display of a representative image on the display unit. The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. The processing unit is further configured to, while enabling display of the representative image on the display unit, detect an input to modify the representative image. The processing unit is further configured to, in response to detecting the input to modify the representative image: in accordance with a determination that the device is in a first editing mode, modify the representative image, the one or more images acquired by the camera after acquiring the representative image, and the one or more images acquired by the camera before acquiring the representative image; and, in accordance with a determination that the device is in a second editing mode, distinct from the first editing mode, modify the representative image without modifying the one or more images acquired by the camera after acquiring the representative image and without modifying the one or more images acquired by the camera before acquiring the representative image.
According to some embodiments, an electronic device includes: a display unit configured to display images; and a processing unit coupled with the display unit. The processing unit is configured to enable display of a representative image on the display unit, in a user interface of an application that is configured to communicate with other electronic devices. The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. The processing unit is further configured to, while enabling display of the representative image on the display unit, detect an input that corresponds to a request to send the representative image, or a request to select the representative image for sending, to a second electronic device that is remote from the electronic device, using the application. The processing unit is further configured to, in response to detecting the input that corresponds to the request to send the representative image, or the request to select the representative image for sending, to the second electronic device: in accordance with a determination that the second electronic device is configured to interact with the sequence of images as a group, enable display of a first set of options for sending at least part of the sequence of images to the second electronic device; and, in accordance with a determination that the second electronic device is not configured to interact with the sequence of images as a group, enable display of a second set of options for sending at least part of the sequence of images to the second electronic device, where the second set of options is different from the first set of options.
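A small sketch of how the two option sets might be chosen; the option names and the capability flag are hypothetical, since the patent does not enumerate the options or the capability exchange.

```swift
// Hypothetical option sets for sending a grouped sequence to a remote device.
enum SendOption {
    case sendWholeSequence        // send the full grouped sequence of images
    case sendAsAnimatedLoop       // send a flattened, looping animation instead
    case sendRepresentativeOnly   // send just the representative still image
}

func sendOptions(remoteDeviceHandlesSequencesAsGroup: Bool) -> [SendOption] {
    if remoteDeviceHandlesSequencesAsGroup {
        // First set of options: the recipient can interact with the sequence as a group.
        return [.sendWholeSequence, .sendRepresentativeOnly]
    } else {
        // Second, different set of options for recipients that cannot.
        return [.sendAsAnimatedLoop, .sendRepresentativeOnly]
    }
}
```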
According to some embodiments, an electronic device includes: a display unit configured to display images; a camera unit configured to acquire images; and a processing unit coupled with the display unit and the camera unit. The processing unit is configured to, while in a first media acquisition mode for the camera unit, enable display of a live preview of a scene on the display unit and perform scene recognition on the scene. The processing unit is further configured to, while enabling display of the live preview, detect a single activation of a shutter button at a first time. The processing unit is further configured to, in response to detecting the single activation of the shutter button at the first time, in accordance with a determination, based at least in part on the scene recognition performed on the scene, that the scene meets action capture criteria: retain a plurality of images acquired by the camera unit in temporal proximity to the activation of the shutter button at the first time and group the plurality of images into a first image sequence. The first image sequence includes: a plurality of images acquired by the camera unit prior to detecting the activation of the shutter button at the first time; a representative image that represents the first image sequence and was acquired by the camera after one or more of the other images in the first image sequence; and a plurality of images acquired by the camera unit after acquiring the representative image. The processing unit is further configured to, in response to detecting the single activation of the shutter button at the first time, in accordance with a determination that the scene does not meet the action capture criteria, retain a single image temporally proximate to the activation of the shutter button at the first time.
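The following sketch illustrates this gating. The concrete action-capture criteria (motion score, face count) are invented for illustration; the patent only requires that the determination be based at least in part on scene recognition. It reuses the hypothetical EnhancedCaptureSession, ImageSequence, and Frame types from the first sketch.

```swift
// Hypothetical scene-recognition result and action-capture criteria.
struct SceneRecognitionResult {
    let motionScore: Double   // e.g. how much movement the scene contains
    let faceCount: Int        // e.g. number of faces detected in the live preview
}

enum CaptureResult {
    case imageSequence(ImageSequence)   // the scene met the action capture criteria
    case singleImage(Frame)             // the scene did not meet the criteria
}

func handleSingleShutterActivation(scene: SceneRecognitionResult,
                                   session: EnhancedCaptureSession,
                                   activationTime: Date,
                                   framesAfterActivation: [Frame]) -> CaptureResult? {
    // Invented criteria: enough motion, or people in the frame.
    let meetsActionCaptureCriteria = scene.motionScore > 0.5 || scene.faceCount > 0

    if meetsActionCaptureCriteria,
       let sequence = session.shutterActivated(at: activationTime,
                                               framesAfterActivation: framesAfterActivation) {
        // Retain the images acquired around the activation, grouped as a sequence.
        return .imageSequence(sequence)
    }
    // Otherwise retain only a single image temporally proximate to the activation.
    if let single = framesAfterActivation.first {
        return .singleImage(single)
    }
    return nil
}
```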
According to some embodiments, an electronic device includes: a display unit configured to display images; a touch-sensitive surface unit configured to detect user inputs; and a processing unit coupled with the display unit and the camera unit. The processing unit is configured to enable display of an image on the display unit. The image is one image in a sequence of images taken by a camera. The sequence of images includes a representative image. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. The processing unit is further configured to detect a first input while enabling display of the image in the sequence of images on the display unit. The processing unit is further configured to, in response to detecting the first input, enable display of a user interface for trimming the sequence of images to a subset that is less than all of the sequence of images. The user interface includes: an area that contains representations of the images in the sequence of images; a user-adjustable begin-trim icon that defines a beginning image in the subset of the sequence of images via the position of the begin-trim icon in the area containing representations of the images in the sequence of images; and a user-adjustable end-trim icon that defines an ending image in the subset of the sequence of images via the position of the end-trim icon in the area containing representations of the images in the sequence of images. The begin-trim icon is positioned at a first position, automatically selected by the device, in the area containing representations of the images in the sequence of images. The end-trim icon is positioned at a second position, automatically selected by the device, in the area containing representations of the images in the sequence of images. The processing unit is further configured to detect a second input while enabling display of the user interface for trimming the sequence of images. The processing unit is further configured to, in response to detecting the second input, trim the sequence of images to the subset of the sequence of images in accordance with the current position of the begin-trim icon and the current position of the end-trim icon.
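A minimal model of this trimming interface, with hypothetical names; the automatically selected handle positions are assumed, for illustration, to default to the whole sequence.

```swift
// Hypothetical trim state: handle positions over the representations area.
struct TrimState {
    let imageCount: Int          // number of representations shown in the trim area
    var beginIndex: Int          // position of the user-adjustable begin-trim icon
    var endIndex: Int            // position of the user-adjustable end-trim icon

    // Automatically selected initial handle positions, here the whole sequence.
    init(imageCount: Int) {
        self.imageCount = imageCount
        self.beginIndex = 0
        self.endIndex = max(0, imageCount - 1)
    }

    mutating func moveBeginHandle(to index: Int) {
        beginIndex = min(max(0, index), endIndex)
    }

    mutating func moveEndHandle(to index: Int) {
        endIndex = max(min(imageCount - 1, index), beginIndex)
    }

    // On the second input (confirming the trim), the sequence is trimmed to the
    // subset defined by the current handle positions.
    func trimmedSubset<T>(of images: [T]) -> [T] {
        guard !images.isEmpty else { return [] }
        let upper = min(endIndex, images.count - 1)
        let lower = min(beginIndex, upper)
        return Array(images[lower ... upper])
    }
}
```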
According to some embodiments, an electronic device includes a display, a touch-sensitive surface, one or more optional sensors to detect the intensity of contacts with the touch-sensitive surface, one or more processors, memory, and one or more programs; the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for performing, or causing performance of, the operations of any of the methods described herein. According to some embodiments, a computer-readable storage medium has instructions stored therein that, when executed by an electronic device with a display, a touch-sensitive surface, and one or more optional sensors to detect the intensity of contacts with the touch-sensitive surface, cause the device to perform, or cause performance of, the operations of any of the methods described herein. According to some embodiments, a graphical user interface on an electronic device with a display, a touch-sensitive surface, one or more optional sensors to detect the intensity of contacts with the touch-sensitive surface, memory, and one or more processors to execute one or more programs stored in the memory includes one or more of the elements displayed in any of the methods described herein, which are updated in response to inputs, as described in any of the methods described herein. According to some embodiments, an electronic device includes: a display, a touch-sensitive surface, and one or more optional sensors to detect the intensity of contacts with the touch-sensitive surface; and means for performing, or causing performance of, the operations of any of the methods described herein. According to some embodiments, an information processing apparatus, for use in an electronic device with a display, a touch-sensitive surface, and one or more optional sensors to detect the intensity of contacts with the touch-sensitive surface, includes means for performing, or causing performance of, the operations of any of the methods described herein.
Thus, electronic devices are provided with improved methods and interfaces for capturing moments photographed or recorded by a camera and for interacting with those moments. Such methods and interfaces may complement or replace conventional methods for capturing, and interacting with, still images and video.
Brief Description of the Drawings
For a better understanding of the various described embodiments, reference should be made to the description of embodiments below, in conjunction with the following drawings, in which like reference numerals refer to corresponding parts throughout the figures.
Figure 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display, in accordance with some embodiments.
Figure 1B is a block diagram illustrating example components for event handling, in accordance with some embodiments.
Figure 2 illustrates a portable multifunction device having a touch screen, in accordance with some embodiments.
Figure 3 is a block diagram of an example multifunction device with a display and a touch-sensitive surface, in accordance with some embodiments.
Figure 4A illustrates an example user interface for a menu of applications on a portable multifunction device, in accordance with some embodiments.
Figure 4B illustrates an example user interface for a multifunction device with a touch-sensitive surface that is separate from the display, in accordance with some embodiments.
Figures 4C-4E illustrate example dynamic intensity thresholds, in accordance with some embodiments.
Figures 5A-5K illustrate example user interfaces for capturing a grouped sequence of related images, in accordance with some embodiments.
Figures 6A-6FF illustrate example user interfaces for displaying (or replaying) a grouped sequence of related images, in accordance with some embodiments.
Figures 7A-7CC illustrate example user interfaces for navigating through sequences of related images, in accordance with some embodiments.
Figures 8A-8L illustrate example user interfaces that perform distinct operations on sequences of related images, as compared with individual images, in accordance with some embodiments.
Figures 9A-9G are flow diagrams illustrating a method of capturing a grouped sequence of related images, in accordance with some embodiments.
Figures 10A-10E are flow diagrams illustrating a method of displaying (replaying) related image sequences, in accordance with some embodiments.
Figures 10F-10I are flow diagrams illustrating a method of displaying (replaying) related image sequences, in accordance with some embodiments.
Figures 10J-10M are flow diagrams illustrating a method of displaying (replaying) related image sequences, in accordance with some embodiments.
Figures 11A-11E are flow diagrams illustrating a method of navigating through sequences of related images, in accordance with some embodiments.
Figures 11F-11I are flow diagrams illustrating a method of navigating through sequences of related images, in accordance with some embodiments.
Figures 12A-12B are flow diagrams illustrating a method of performing distinct operations on sequences of related images, as compared with individual images, in accordance with some embodiments.
Figures 13-19 are functional block diagrams of electronic devices, in accordance with some embodiments.
Figures 20A-20L illustrate example user interfaces for modifying images in an image sequence, in accordance with some embodiments.
Figures 21A-21J illustrate example user interfaces for sending images from an image sequence to a second electronic device, in accordance with some embodiments.
Figures 22A-22D illustrate example user interfaces for acquiring photos (e.g., enhanced photos or still photos) using scene recognition, in accordance with some embodiments.
Figures 23A-23E illustrate example user interfaces for trimming an image sequence (e.g., an enhanced photo), in accordance with some embodiments.
Figures 24A-24E are flow diagrams illustrating a method of modifying images in an image sequence, in accordance with some embodiments.
Figures 25A-25C are flow diagrams illustrating a method of sending images from an image sequence to a second electronic device, in accordance with some embodiments.
Figures 26A-26D are flow diagrams illustrating a method of acquiring photos (e.g., enhanced photos or still photos) using scene recognition, in accordance with some embodiments.
Figures 27A-27E are flow diagrams illustrating a method of trimming an image sequence (e.g., an enhanced photo), in accordance with some embodiments.
Figures 28-31 are functional block diagrams of electronic devices, in accordance with some embodiments.
Description of Embodiments
As noted above, portable electronic devices typically capture two types of digital content: still images and video. A still image freezes an instant in time, usually captured by a simple press of a shutter button, but the moments just before and after that instant are lost. Video records an extended period of time, which may include both interesting moments and less interesting ones; substantial editing is typically required to remove the less interesting moments.
Accordingly, new and improved devices and methods for capturing moments and interacting with moments are described herein.
In some embodiments, in response to a press of the shutter button, the device groups together an image sequence that includes a representative image (analogous to the instant captured in a conventional still image), images acquired before the shutter button was pressed, and images acquired after the representative image. The press of the shutter button therefore captures the moment around an instant in time, rather than just the instant itself. In some embodiments, additional information about the moment, such as sound and metadata, is also captured. From the user's perspective, this process makes capturing a moment (a grouped image sequence that includes a representative image) as easy as capturing an instant (a single still image): the user just presses the shutter button. To distinguish it from a single still image, the term "enhanced photo" is sometimes used for brevity to refer to a grouped image sequence.
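As a concrete illustration of this grouping behavior, the sketch below shows how a camera pipeline might keep a rolling buffer of recent frames and, on a shutter press, assemble an enhanced photo from frames captured before and after the representative image. This is a minimal sketch: the type names, buffer window, and selection rule are assumptions made for illustration, not details taken from the disclosure.

```swift
import Foundation

/// One captured camera frame (illustrative stand-in for real image data).
struct Frame {
    let timestamp: TimeInterval
    let pixels: Data
}

/// A grouped image sequence: frames before the shutter press, the
/// representative image, and frames acquired after it.
struct EnhancedPhoto {
    let preShutterFrames: [Frame]
    let representativeImage: Frame
    let postShutterFrames: [Frame]
}

final class MomentCapturePipeline {
    private var rollingBuffer: [Frame] = []      // frames kept while the camera UI is open
    private let bufferWindow: TimeInterval = 1.5 // seconds of history to retain (assumed)

    /// Called for every frame the camera produces while the viewfinder is visible.
    func ingest(_ frame: Frame) {
        rollingBuffer.append(frame)
        // Drop frames older than the retention window.
        let cutoff = frame.timestamp - bufferWindow
        rollingBuffer.removeAll { $0.timestamp < cutoff }
    }

    /// Called when the shutter button is pressed; `futureFrames` are frames the
    /// camera continues to deliver for a short period after the press.
    func makeEnhancedPhoto(shutterTime: TimeInterval, futureFrames: [Frame]) -> EnhancedPhoto? {
        // Treat the frame closest in time to the shutter press as the representative image.
        let candidates = rollingBuffer + futureFrames
        guard let representative = candidates.min(by: {
            abs($0.timestamp - shutterTime) < abs($1.timestamp - shutterTime)
        }) else { return nil }

        let before = candidates.filter { $0.timestamp < representative.timestamp }
        let after  = candidates.filter { $0.timestamp > representative.timestamp }
        return EnhancedPhoto(preShutterFrames: before,
                             representativeImage: representative,
                             postShutterFrames: after)
    }
}
```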
In some embodiments, when the representative image is being viewed, the enhanced photo can "come alive" and replay the moment in response to a user input (for example, a press-and-hold gesture or a deep press gesture on the enhanced photo).
In some embodiments, when navigating between enhanced photos, for a respective enhanced photo, images taken just before the representative image of the enhanced photo are shown as the enhanced photo comes into view on the display, and/or images taken just after the representative image are shown as the enhanced photo leaves the display, which enhances the display of the moment.
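A minimal sketch of the replay behavior, reusing the assumed EnhancedPhoto type from the sketch above: on a press-and-hold, the frames are stepped through in order and playback settles back on the representative image. The frame rate and the `show` callback are illustrative assumptions.

```swift
import Foundation

/// Replays an enhanced photo's moment, e.g. in response to a press-and-hold
/// gesture, by stepping through its frames in order. `show` stands in for
/// whatever actually draws a frame on screen.
func replayMoment(_ photo: EnhancedPhoto,
                  frameInterval: TimeInterval = 1.0 / 30.0,  // assumed playback rate
                  show: @escaping (Frame) -> Void) {
    let frames = photo.preShutterFrames
               + [photo.representativeImage]
               + photo.postShutterFrames
               + [photo.representativeImage]   // settle back on the representative image
    for (index, frame) in frames.enumerated() {
        DispatchQueue.main.asyncAfter(deadline: .now() + Double(index) * frameInterval) {
            show(frame)
        }
    }
}
```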
In some embodiments, when navigating between enhanced photos and conventional still images, the enhanced photos "replay" as they come into view and/or leave the display, while for conventional still images, additional information (for example, location data) and/or animation within the still image is shown as the still image is displayed.
In some embodiments, the user can modify the representative image of an enhanced photo and choose whether the modifications are applied only to the representative image or to all of the images in the enhanced photo (for example, the user can toggle between an apply-to-all mode and a still-image editing mode).
In some embodiments, when a respective user sends an enhanced photo to another user, the respective user's device presents different options for sending the enhanced photo depending on whether the other user's device is compatible with enhanced photos (for example, presenting an option for sending the enhanced photo as an enhanced photo when the other user's device is compatible with enhanced photos, and presenting an option for sending only the representative image when the other user's device is not compatible with enhanced photos).
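That branching can be summarized in a small sketch, under the assumption that the sender already knows whether the recipient supports enhanced photos; the flag and option names are illustrative, not taken from the disclosure.

```swift
enum SendOption {
    case sendAsEnhancedPhoto      // full grouped image sequence
    case sendRepresentativeOnly   // single still image
}

/// Returns the send options to present, given whether the recipient's device
/// is known to support enhanced photos.
func sendOptions(recipientSupportsEnhancedPhotos: Bool) -> [SendOption] {
    if recipientSupportsEnhancedPhotos {
        // Offer both: the full moment or just the still image.
        return [.sendAsEnhancedPhoto, .sendRepresentativeOnly]
    } else {
        // The recipient cannot display an enhanced photo, so offer only the still.
        return [.sendRepresentativeOnly]
    }
}
```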
In some embodiments, the device performs scene recognition (for example, while in an image capture mode). When the scene is conducive to retaining an enhanced photo (for example, the scene includes movement or faces), the device retains an enhanced photo in response to a press of the shutter button. When the scene is not conducive to retaining an enhanced photo (for example, the scene is a picture of a receipt), the device retains a single still image in response to a press of the shutter button.
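The decision can be reduced to something like the following sketch, where the scene features and the thresholds are assumed stand-ins for whatever a real scene-recognition step would produce.

```swift
/// Features produced by a scene-recognition pass (assumed shape).
struct SceneFeatures {
    let detectedFaces: Int
    let motionScore: Double    // 0.0 (static scene) ... 1.0 (lots of motion)
    let looksLikeDocument: Bool
}

/// Decides, in response to a shutter press, whether to keep a grouped image
/// sequence (enhanced photo) or just a single still image.
func shouldRetainEnhancedPhoto(for scene: SceneFeatures) -> Bool {
    if scene.looksLikeDocument {
        return false           // e.g., a photo of a receipt: keep only the still image
    }
    // Faces or meaningful motion suggest the surrounding moment is worth keeping.
    return scene.detectedFaces > 0 || scene.motionScore > 0.2
}
```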
In some embodiments, the user can trim an image sequence down to a subset of the image sequence. The device provides handles for trimming the sequence, placed at positions in the sequence that are picked automatically (for example, based on scene recognition). The handles can also be used to trim the sequence manually.
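As an illustration, trimming can be modeled as keeping the frames between two handle positions. In this sketch the automatic initial placement (a fixed fraction of the sequence) is an assumption standing in for the scene-recognition-based placement described above.

```swift
/// Trims an image sequence (here just an array of frames) to the subset
/// between the begin-trim and end-trim handles, inclusive.
func trim<Frame>(_ sequence: [Frame], beginIndex: Int, endIndex: Int) -> [Frame] {
    guard !sequence.isEmpty else { return [] }
    let begin = max(0, min(beginIndex, sequence.count - 1))
    let end = max(begin, min(endIndex, sequence.count - 1))
    return Array(sequence[begin...end])
}

/// Initial handle positions proposed by the device. Real placement would be
/// driven by scene recognition; a fixed fraction is used here as a placeholder.
func suggestedHandles(forCount count: Int) -> (begin: Int, end: Int) {
    guard count > 1 else { return (0, 0) }
    let begin = count / 10             // drop roughly the first 10%
    let end = count - 1 - count / 10   // and roughly the last 10%
    return (begin, max(begin, end))
}
```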
Below, Figures 1A-1B, 2, and 3 provide a description of example devices. Figures 4A-4B, 5A-5K, 6A-6FF, 7A-7CC, 8A-8L, 20A-20L, 21A-21J, 22A-22D, and 23A-23E illustrate example user interfaces for capturing sequences of related images, navigating sequences of related images, and performing operations on, or related to, sequences of related images. Figures 9A-9G are flow diagrams illustrating a method of capturing a grouped sequence of related images, in accordance with some embodiments. Figures 10A-10M are flow diagrams illustrating methods of displaying (replaying) related image sequences, in accordance with some embodiments. Figures 11A-11I are flow diagrams illustrating methods of navigating through photos that include sequences of related images, in accordance with some embodiments. Figures 12A-12B are flow diagrams illustrating a method of performing distinct operations on photos that include sequences of related images, as compared with still photos, in accordance with some embodiments. Figures 24A-24E are flow diagrams illustrating a method of modifying images in an image sequence, in accordance with some embodiments. Figures 25A-25C are flow diagrams illustrating a method of sending images from an image sequence to a second electronic device, in accordance with some embodiments. Figures 26A-26D are flow diagrams illustrating a method of acquiring photos (e.g., enhanced photos or still photos) using scene recognition, in accordance with some embodiments. Figures 27A-27E are flow diagrams illustrating a method of trimming an image sequence (e.g., an enhanced photo), in accordance with some embodiments. The user interfaces in Figures 5A-5K, 6A-6FF, 7A-7CC, 8A-8L, 20A-20L, 21A-21J, 22A-22D, and 23A-23E are used to illustrate the processes in Figures 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E.
Example devices
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. are in some instances used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the various described embodiments. The first contact and the second contact are both contacts, but they are not the same contact, unless the context clearly indicates otherwise.
The terminology used in the description of the various described embodiments herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises," and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" is, optionally, construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" is, optionally, construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone, iPod Touch, and iPad devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptop or tablet computers with touch-sensitive surfaces (e.g., touch-screen displays and/or touchpads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a note-taking application, a drawing application, a presentation application, a word-processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video-conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that are executed on the device optionally use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface, as well as corresponding information displayed on the device, are, optionally, adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture of the device (such as the touch-sensitive surface) optionally supports the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Figure 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive display system 112, in accordance with some embodiments. Touch-sensitive display system 112 is sometimes called a "touch screen" for convenience, and is sometimes simply called a touch-sensitive display. Device 100 includes memory 102 (which optionally includes one or more computer-readable storage media), memory controller 122, one or more processing units (CPUs) 120, peripherals interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input or control devices 116, and external port 124. Device 100 optionally includes one or more optical sensors 164. Device 100 optionally includes one or more intensity sensors 165 for detecting intensity of contacts on device 100 (e.g., on a touch-sensitive surface such as touch-sensitive display system 112 of device 100). Device 100 optionally includes one or more tactile output generators 167 for generating tactile outputs on device 100 (e.g., generating tactile outputs on a touch-sensitive surface such as touch-sensitive display system 112 of device 100 or touchpad 335 of device 300). These components optionally communicate over one or more communication buses or signal lines 103.
As used in the specification and claims, the term "tactile output" refers to physical displacement of a device relative to a previous position of the device, physical displacement of a component (e.g., a touch-sensitive surface) of a device relative to another component (e.g., the housing) of the device, or displacement of the component relative to a center of mass of the device, that will be detected by a user with the user's sense of touch. For example, in situations where the device or the component of the device is in contact with a surface of a user that is sensitive to touch (e.g., a finger, palm, or other part of a user's hand), the tactile output generated by the physical displacement will be interpreted by the user as a tactile sensation corresponding to a perceived change in physical characteristics of the device or the component of the device. For example, movement of a touch-sensitive surface (e.g., a touch-sensitive display or trackpad) is, optionally, interpreted by the user as a "down click" or "up click" of a physical actuator button. In some cases, a user will feel a tactile sensation such as a "down click" or "up click" even when there is no movement of a physical actuator button associated with the touch-sensitive surface that is physically pressed (e.g., displaced) by the user's movements. As another example, movement of the touch-sensitive surface is, optionally, interpreted or sensed by the user as "roughness" of the touch-sensitive surface, even when there is no change in smoothness of the touch-sensitive surface. While such interpretations of touch by a user will be subject to the individualized sensory perceptions of the user, there are many sensory perceptions of touch that are common to a large majority of users. Thus, when a tactile output is described as corresponding to a particular sensory perception of a user (e.g., an "up click," a "down click," "roughness"), unless otherwise stated, the generated tactile output corresponds to physical displacement of the device or a component thereof that will generate the described sensory perception for a typical (or average) user.
It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in Figure 1A are implemented in hardware, software, or a combination of both hardware and software, including one or more signal-processing and/or application-specific integrated circuits.
Memory 102 optionally includes high-speed random access memory and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU(s) 120 and peripherals interface 118, is, optionally, controlled by memory controller 122.
Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU(s) 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.
In some embodiments, peripherals interface 118, CPU(s) 120, and memory controller 122 are, optionally, implemented on a single chip, such as chip 104. In some other embodiments, they are, optionally, implemented on separate chips.
RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 optionally includes well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 optionally communicates by wireless communication with networks, such as the Internet (also referred to as the World Wide Web (WWW)), an intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN), and with other devices. The wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), Evolution, Data-Only (EV-DO), HSPA, HSPA+, Dual-Cell HSPA (DC-HSPDA), long term evolution (LTE), near field communication (NFC), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11ac, IEEE 802.11ax, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, protocols for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data is, optionally, retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212, Figure 2). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input or control devices 116, to peripherals interface 118. I/O subsystem 106 optionally includes display controller 156, optical sensor controller 158, intensity sensor controller 159, haptic feedback controller 161, and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input or control devices 116 optionally include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternative embodiments, input controller(s) 160 are, optionally, coupled to any (or none) of the following: a keyboard, an infrared port, a USB port, and a pointing device such as a mouse. The one or more buttons (e.g., 208, Figure 2) optionally include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons optionally include a push button (e.g., 206, Figure 2).
Touch-sensitive display system 112 provides an input interface and an output interface between the device and a user. Display controller 156 receives and/or sends electrical signals from/to touch-sensitive display system 112. Touch-sensitive display system 112 displays visual output to the user. The visual output optionally includes graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output corresponds to user-interface objects. As used herein, the term "affordance" refers to a user-interactive graphical user interface object (e.g., a graphical user interface object that is configured to respond to inputs directed toward the graphical user interface object). Examples of user-interactive graphical user interface objects include, without limitation, a button, slider, icon, selectable menu item, switch, or other user interface control.
Touch-sensitive display system 112 has a touch-sensitive surface, sensor, or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch-sensitive display system 112 and display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on touch-sensitive display system 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages, or images) that are displayed on touch-sensitive display system 112. In an exemplary embodiment, a point of contact between touch-sensitive display system 112 and the user corresponds to a finger of the user or a stylus.
Touch-sensitive display system 112 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. Touch-sensitive display system 112 and display controller 156 optionally detect contact, and any movement or breaking thereof, using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch-sensitive display system 112. In an exemplary embodiment, projected mutual capacitance sensing technology is used, such as that found in the iPhone, iPod Touch, and iPad from Apple Inc. of Cupertino, California.
Touch-sensitive display system 112 optionally has a video resolution in excess of 100 dpi. In some embodiments, the touch screen video resolution is in excess of 400 dpi (e.g., 500 dpi, 800 dpi, or greater). The user optionally makes contact with touch-sensitive display system 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work with finger-based contacts and gestures, which can be less precise than stylus-based input because of the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 100 optionally includes a touchpad for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad is, optionally, a touch-sensitive surface that is separate from touch-sensitive display system 112, or an extension of the touch-sensitive surface formed by the touch screen.
Device 100 also includes power system 162 for powering the various components. Power system 162 optionally includes a power management system, one or more power sources (e.g., a battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)), and any other components associated with the generation, management, and distribution of power in portable devices.
Device 100 optionally also includes one or more optical sensors 164. Figure 1A shows an optical sensor coupled with optical sensor controller 158 in I/O subsystem 106. Optical sensor(s) 164 optionally include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. Optical sensor(s) 164 receive light from the environment, projected through one or more lenses, and convert the light to data representing an image. In conjunction with imaging module 143 (also called a camera module), optical sensor(s) 164 optionally capture still images, enhanced photos, and/or video. In some embodiments, an optical sensor is located on the back of device 100, opposite touch-sensitive display system 112 on the front of the device, so that the touch screen is enabled for use as a viewfinder for still and/or video image acquisition. In some embodiments, another optical sensor is located on the front of the device so that the user's image is obtained (e.g., for selfies, or for videoconferencing while the user views the other video conference participants on the touch screen).
Device 100 optionally also includes one or more contact intensity sensors 165. Figure 1A shows a contact intensity sensor coupled with intensity sensor controller 159 in I/O subsystem 106. Contact intensity sensor(s) 165 optionally include one or more piezoresistive strain gauges, capacitive force sensors, electric force sensors, piezoelectric force sensors, optical force sensors, capacitive touch-sensitive surfaces, or other intensity sensors (e.g., sensors used to measure the force (or pressure) of a contact on a touch-sensitive surface). Contact intensity sensor(s) 165 receive contact intensity information (e.g., pressure information or a proxy for pressure information) from the environment. In some embodiments, at least one contact intensity sensor is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112). In some embodiments, at least one contact intensity sensor is located on the back of device 100, opposite touch-sensitive display system 112, which is located on the front of device 100.
Device 100 optionally also includes one or more proximity sensors 166. Figure 1A shows proximity sensor 166 coupled with peripherals interface 118. Alternatively, proximity sensor 166 is coupled with input controller 160 in I/O subsystem 106. In some embodiments, the proximity sensor turns off and disables touch-sensitive display system 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call).
Device 100 optionally also includes one or more tactile output generators 167. Figure 1A shows a tactile output generator coupled with haptic feedback controller 161 in I/O subsystem 106. Tactile output generator(s) 167 optionally include one or more electroacoustic devices (such as speakers or other audio components) and/or electromechanical devices that convert energy into linear motion (such as a motor, solenoid, electroactive polymer, piezoelectric actuator, electrostatic actuator, or other tactile output generating component (e.g., a component that converts electrical signals into tactile outputs on the device)). Tactile output generator(s) 167 receive tactile feedback generation instructions from haptic feedback module 133 and generate tactile outputs on device 100 that are capable of being sensed by a user of device 100. In some embodiments, at least one tactile output generator is collocated with, or proximate to, a touch-sensitive surface (e.g., touch-sensitive display system 112) and, optionally, generates a tactile output by moving the touch-sensitive surface vertically (e.g., in/out of a surface of device 100) or laterally (e.g., back and forth in the same plane as a surface of device 100). In some embodiments, at least one tactile output generator sensor is located on the back of device 100, opposite touch-screen display 112, which is located on the front of device 100.
Device 100 optionally also includes one or more accelerometers 168. Figure 1A shows accelerometer 168 coupled with peripherals interface 118. Alternatively, accelerometer 168 is, optionally, coupled with an input controller 160 in I/O subsystem 106. In some embodiments, information is displayed on the touch-screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers. Device 100 optionally includes, in addition to accelerometer(s) 168, a magnetometer (not shown) and a GPS (or GLONASS or other global navigation system) receiver (not shown) for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 100.
In some embodiments, the software components stored in memory 102 include operating system 126, communication module (or set of instructions) 128, contact/motion module (or set of instructions) 130, graphics module (or set of instructions) 132, haptic feedback module (or set of instructions) 133, text input module (or set of instructions) 134, Global Positioning System (GPS) module (or set of instructions) 135, and applications (or sets of instructions) 136. Furthermore, in some embodiments, memory 102 stores device/global internal state 157, as shown in Figures 1A and 3. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views, or other information occupy various regions of touch-sensitive display system 112; sensor state, including information obtained from the device's various sensors and other input or control devices 116; and location and/or positional information concerning the device's location and/or attitude.
Operating system 126 (e.g., iOS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between the various hardware and software components.
Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, a wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on some iPhone, iPod Touch, and iPad devices from Apple Inc. of Cupertino, California. In some embodiments, the external port is a Lightning connector that is the same as, or similar to and/or compatible with, the Lightning connector used on some iPhone, iPod Touch, and iPad devices from Apple Inc. of Cupertino, California.
Contact/motion module 130 optionally detects contact with touch-sensitive display system 112 (in conjunction with display controller 156) and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact (e.g., by a finger or by a stylus), such as determining if contact has occurred (e.g., detecting a finger-down event), determining an intensity of the contact (e.g., the force or pressure of the contact, or a substitute for the force or pressure of the contact), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, optionally includes determining speed (magnitude), velocity (magnitude and direction), and/or acceleration (a change in magnitude and/or direction) of the point of contact. These operations are, optionally, applied to single contacts (e.g., one-finger contacts or stylus contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple-finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.
Contact/motion module 130 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns (e.g., different motions, timings, and/or intensities of detected contacts). Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift-off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift-off) event. Similarly, tap, swipe, drag, and other gestures are, optionally, detected for a stylus by detecting a particular contact pattern for the stylus.
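To make the contact-pattern idea concrete, the sketch below classifies a completed contact from its finger-down and finger-up samples; the distance and duration thresholds are illustrative assumptions rather than values taken from the disclosure.

```swift
import Foundation

/// One sample of a contact on the touch-sensitive surface.
struct ContactSample {
    let x: Double
    let y: Double
    let timestamp: TimeInterval
}

enum Gesture {
    case tap
    case swipe
    case longPress
    case unknown
}

/// Classifies a completed contact (finger-down ... finger-up) into a gesture
/// from its contact pattern: how far the contact moved and how long it lasted.
func classifyGesture(down: ContactSample, up: ContactSample) -> Gesture {
    let dx = up.x - down.x
    let dy = up.y - down.y
    let distance = (dx * dx + dy * dy).squareRoot()
    let duration = up.timestamp - down.timestamp

    let tapRadius = 10.0          // points; assumed "substantially the same position"
    let longPressDuration = 0.5   // seconds; assumed

    if distance <= tapRadius {
        return duration < longPressDuration ? .tap : .longPress
    }
    if duration < 1.0 {
        return .swipe             // finger-down, movement, then lift-off
    }
    return .unknown
}
```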
Graphics module 132 includes various known software components for rendering and displaying graphics on touch-sensitive display system 112 or other displays, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast, or other visual property) of graphics that are displayed. As used herein, the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations, and the like.
In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed, along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.
Haptic feedback module 133 includes various software components for generating instructions used by tactile output generator(s) 167 to produce tactile outputs at one or more locations on device 100 in response to user interactions with device 100.
Text input module 134, which is, optionally, a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).
GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 136 optionally include the following modules (or sets of instructions), or a subset or superset thereof:
Contact module 137 (sometimes referred to as address book or contacts list);
Phone module 138;
Video conference module 139;
E-mail client module 140;
Instant message (IM) module 141;
Workout support module 142;
Camera module 143 for still and/or video images;
Image management module 144;
Browser module 147;
Calendar module 148;
Widget modules 149, which optionally include one or more of: weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
Widget creator module 150 for making user-created widgets 149-6;
Search module 151;
Video and music player module 152, which is, optionally, made up of a video player module and a music player module;
Notes module 153;
Map module 154; and/or
Online Video module 155.
Examples of other applications 136 that are, optionally, stored in memory 102 include other word-processing applications, other image-editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, contacts module 137 includes executable instructions to manage an address book or contact list (e.g., stored in application internal state 192 of contacts module 137 in memory 102 or memory 370), including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es), or other information with a name; associating an image with a name; categorizing and sorting names; and providing telephone numbers and/or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, telephone module 138 includes executable instructions to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation, and disconnect or hang up when the conversation is completed. As noted above, the wireless communication optionally uses any of a plurality of communications standards, protocols, and technologies.
In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list 137, and telephone module 138, videoconferencing module 139 includes executable instructions to initiate, conduct, and terminate a video conference between a user and one or more other participants in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, e-mail client module 140 includes executable instructions to create, send, receive, and manage e-mail in response to user instructions. In conjunction with image management module 144, e-mail client module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, instant messaging module 141 includes executable instructions to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages, or using XMPP, SIMPLE, Apple Push Notification Service (APNs), or IMPS for Internet-based instant messages), to receive instant messages, and to view received instant messages. In some embodiments, transmitted and/or received instant messages optionally include graphics, photos (e.g., still images), enhanced photos, audio files, video files, and/or other attachments as are supported in an MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, APNs, or IMPS).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, map module 154, and music player module 146, workout support module 142 includes executable instructions to create workouts (e.g., with time, distance, and/or calorie-burning goals); to communicate with workout sensors (in sports devices and smart watches); to receive workout sensor data; to calibrate sensors used to monitor a workout; to select and play music for a workout; and to display, store, and transmit workout data.
In conjunction with touch-sensitive display system 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, camera module 143 includes executable instructions to capture still images or video (including a video stream) and store them in memory 102, to modify characteristics of a still image or video, and/or to delete a still image or video from memory 102.
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to-do lists, etc.) in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that are, optionally, downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
In conjunction with RF circuitry 108, touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget creator module 150 includes executable instructions to create widgets (e.g., turning a user-specified portion of a web page into a widget).
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module 151 includes executable instructions to search for text, music, sound, images, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, video and music player module 152 includes executable instructions that allow the user to download and play back recorded music and other sound files stored in one or more file formats (such as MP3 or AAC files), and executable instructions to display, present, or otherwise play back videos (e.g., on touch-sensitive display system 112, or on an external display connected wirelessly or via external port 124). In some embodiments, device 100 optionally includes the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).
In conjunction with touch-sensitive display system 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to-do lists, and the like in accordance with user instructions.
In conjunction with RF circuitry 108, touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 includes executable instructions to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.
In conjunction with touch-sensitive display system 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes executable instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on touch screen 112, or on an external display connected wirelessly or via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
Each of the above-identified modules and applications corresponds to a set of instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise rearranged in various embodiments. In some embodiments, memory 102 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 102 optionally stores additional modules and data structures not described above.
In some embodiments, device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 100, the number of physical input control devices (such as push buttons, dials, and the like) on device 100 is, optionally, reduced.
The predefined set of functions that are performed exclusively through a touch screen and/or a touchpad optionally includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that is displayed on device 100. In such embodiments, a "menu button" is implemented using a touchpad. In some other embodiments, the menu button is a physical push button or other physical input control device instead of a touchpad.
Figure 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments. In some embodiments, memory 102 (in Figure 1A) or memory 370 (Figure 3) includes event sorter 170 (for example, in operating system 126) and a respective application 136-1 (for example, any of the aforementioned applications 136, 137-155, 380-390).
Event sorter 170 receives event information and determines the application 136-1, and the application view 191 of application 136-1, to which the event information is to be delivered. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display system 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine the application views 191 to which event information is to be delivered.
In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.
Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (for example, a user touch on touch-sensitive display system 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or from a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display system 112 or a touch-sensitive surface.
In some embodiments, event monitor 171 sends requests to peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (for example, receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).
In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.
Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display system 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.
Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected optionally correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest-level view in which a touch is detected is optionally called the hit view, and the set of events that are recognized as proper inputs is optionally determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.
Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest-level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.
Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events are entirely confined to the area associated with one particular view, views higher in the hierarchy still remain as actively involved views.
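As a rough illustration of the hit view and actively involved view logic just described, the following Swift sketch models a small view hierarchy and walks it to find the lowest view containing the initial sub-event location, then collects that view together with its ancestors as the actively involved views. The geometry types, the View class, and the traversal are hypothetical simplifications made for illustration; they are not the device's actual data structures.

// Minimal geometry stand-ins (hypothetical; the device uses its own graphics types).
struct Point { var x: Double; var y: Double }
struct Rect {
    var x: Double, y: Double, width: Double, height: Double
    func contains(_ p: Point) -> Bool {
        p.x >= x && p.x < x + width && p.y >= y && p.y < y + height
    }
}

final class View {
    let name: String
    let frame: Rect              // in window coordinates, for simplicity
    var subviews: [View] = []
    init(name: String, frame: Rect) { self.name = name; self.frame = frame }
}

// Lowest-level view containing the initial sub-event location: the "hit view".
func hitView(for location: Point, in root: View) -> View? {
    guard root.frame.contains(location) else { return nil }
    for subview in root.subviews {
        if let hit = hitView(for: location, in: subview) { return hit }
    }
    return root
}

// The hit view plus every ancestor containing the location, as the "actively involved" views.
func activelyInvolvedViews(for location: Point, in root: View) -> [View] {
    guard root.frame.contains(location) else { return [] }
    for subview in root.subviews {
        let deeper = activelyInvolvedViews(for: location, in: subview)
        if !deeper.isEmpty { return deeper + [root] }
    }
    return [root]
}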
Event dispatcher module 174 dispatches the event information to an event recognizer (for example, event recognizer 180). In embodiments that include active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores the event information in an event queue, from which it is retrieved by a respective event receiver module 182.
In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or part of another module stored in memory 102, such as contact/motion module 130.
In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of the event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher-level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 optionally utilizes or calls data updater 176, object updater 177, or GUI updater 178 to update application internal state 192. Alternatively, one or more of the application views 191 include one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.
A respective event recognizer 180 receives event information (for example, event data 179) from event sorter 170 and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183 and event delivery instructions 188 (which optionally include sub-event delivery instructions).
Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as the location of the sub-event. When the sub-event concerns motion of a touch, the event information optionally also includes the speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (for example, from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.
Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (for example, predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display system 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.
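As an informal sketch of how an event comparator might match a stream of sub-events against a predefined definition such as the double tap described above, consider the following Swift fragment. The sub-event names, the per-phase time limit, and the matching function are hypothetical simplifications for illustration only, not the actual event definitions 186.

enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

// A hypothetical event definition: an ordered sequence of sub-events plus a per-phase time limit.
struct EventDefinition {
    let name: String
    let sequence: [SubEvent]
    let maxPhaseDuration: Double   // seconds allowed between consecutive sub-events (assumed value)
}

let doubleTap = EventDefinition(
    name: "double tap",
    sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd],
    maxPhaseDuration: 0.3)

// Compares timestamped sub-events against the definition; returns true when the whole
// sequence is matched within the per-phase limits, analogous to a recognizer reaching an
// "event recognized" state rather than an "event failed" state.
func matches(_ events: [(time: Double, kind: SubEvent)],
             definition: EventDefinition) -> Bool {
    guard events.count == definition.sequence.count else { return false }
    for (index, expected) in definition.sequence.enumerated() {
        if events[index].kind != expected { return false }
        if index > 0,
           events[index].time - events[index - 1].time > definition.maxPhaseDuration {
            return false
        }
    }
    return true
}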
In some embodiments, event definitions 187 include a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display system 112, when a touch is detected on touch-sensitive display system 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.
In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.
When a respective event recognizer 180 determines that the series of sub-events do not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of the ongoing touch-based gesture.
In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate how event recognizers interact, or are enabled to interact, with one another. In some embodiments, metadata 183 includes configurable properties, flags, and/or lists that indicate whether sub-events are delivered to varying levels in the view or programmatic hierarchy.
In some embodiments, a respective event recognizer 180 activates the event handler 190 associated with an event when one or more particular sub-events of the event are recognized. In some embodiments, a respective event recognizer 180 delivers event information associated with the event to event handler 190. Activating an event handler 190 is distinct from sending (or deferring sending of) sub-events to a respective hit view. In some embodiments, event recognizer 180 throws a flag associated with the recognized event, and the event handler 190 associated with the flag catches the flag and performs a predefined process.
In some embodiments, event delivery instructions 188 include sub-event delivery instructions that deliver event information about a sub-event without activating an event handler. Instead, the sub-event delivery instructions deliver event information to event handlers associated with the series of sub-events or to actively involved views. Event handlers associated with the series of sub-events or with actively involved views receive the event information and perform a predetermined process.
In some embodiments, data updater 176 creates and updates data used in application 136-1. For example, data updater 176 updates the telephone number used in contacts module 137, or stores a video file used in video player module 145. In some embodiments, object updater 177 creates and updates objects used in application 136-1. For example, object updater 177 creates a new user-interface object or updates the position of a user-interface object. GUI updater 178 updates the GUI. For example, GUI updater 178 prepares display information and sends it to graphics module 132 for display on a touch-sensitive display.
In some embodiments, event handler(s) 190 includes or has access to data updater 176, object updater 177, and GUI updater 178. In some embodiments, data updater 176, object updater 177, and GUI updater 178 are included in a single module of a respective application 136-1 or application view 191. In other embodiments, they are included in two or more software modules.
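The division of labor among data updater 176, object updater 177, and GUI updater 178 can be pictured with a short sketch. The protocol names and the handle method below are invented for illustration only; they are not the actual module interfaces.

// Hypothetical updater roles, mirroring data updater 176, object updater 177, and GUI updater 178.
protocol DataUpdating   { func updateData(for event: String) }
protocol ObjectUpdating { func updateObjects(for event: String) }
protocol GUIUpdating    { func refreshDisplay() }

// A hypothetical event handler that, once activated by a recognizer, uses the
// updaters to bring the application's internal state and its GUI up to date.
struct EventHandler {
    let data: DataUpdating
    let objects: ObjectUpdating
    let gui: GUIUpdating

    func handle(recognizedEvent name: String) {
        data.updateData(for: name)       // e.g., store a new phone number or video file
        objects.updateObjects(for: name) // e.g., create or reposition a user-interface object
        gui.refreshDisplay()             // prepare display information for the graphics module
    }
}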
It should be understood that the foregoing discussion regarding event handling of user touches on touch-sensitive displays also applies to other forms of user input for operating multifunction devices 100 with input devices, not all of which are initiated on touch screens. For example, mouse movements and mouse button presses, optionally coordinated with single or multiple keyboard presses or holds; contact movements such as taps, drags, and scrolls on touchpads; pen stylus inputs; movement of the device; oral instructions; detected eye movements; biometric inputs; and/or any combination thereof are optionally utilized as inputs corresponding to sub-events which define an event to be recognized.
Figure 2 illustrates a portable multifunction device 100 having a touch screen (for example, touch-sensitive display system 112, Figure 1A) in accordance with some embodiments. The touch screen optionally displays one or more graphics within user interface (UI) 200. In this embodiment, as well as in other embodiments described below, a user is enabled to select one or more of the graphics by making a gesture on the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure) or one or more styluses (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the gesture optionally includes one or more taps, one or more swipes (from left to right, from right to left, upward and/or downward), and/or a rolling of a finger (from right to left, from left to right, upward and/or downward) that has made contact with device 100. In some implementations or circumstances, inadvertent contact with a graphic does not select the graphic. For example, when the gesture corresponding to selection is a tap, a swipe gesture that sweeps over an application icon optionally does not select the corresponding application.
Device 100 optionally also includes one or more physical buttons, such as a "home" or menu button 204. As described previously, menu button 204 is optionally used to navigate to any application 136 in a set of applications that are optionally executed on device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI displayed on the touch-screen display.
In some embodiments, device 100 includes the touch-screen display, menu button 204, push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, Subscriber Identity Module (SIM) card slot 210, headset jack 212, and docking/charging external port 124. Push button 206 is optionally used to turn the power of the device on/off by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In some embodiments, device 100 also accepts verbal input for activation or deactivation of some functions through microphone 113. Device 100 also, optionally, includes one or more contact intensity sensors 165 for detecting intensities of contacts on touch-sensitive display system 112 and/or one or more tactile output generators 167 for generating tactile outputs for a user of device 100.
Figure 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments. Device 300 need not be portable. In some embodiments, device 300 is a laptop computer, a desktop computer, a tablet computer, a multimedia player device, a navigation device, an educational device (such as a child's learning toy), a gaming system, or a control device (for example, a home or industrial controller). Device 300 typically includes one or more processing units (CPUs) 310, one or more network or other communications interfaces 360, memory 370, and one or more communication buses 320 for interconnecting these components. Communication buses 320 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. Device 300 includes input/output (I/O) interface 330, which comprises display 340, typically a touch-screen display. I/O interface 330 also optionally includes a keyboard and/or mouse (or other pointing device) 350 and touchpad 355, tactile output generator 357 for generating tactile outputs on device 300 (for example, similar to tactile output generator(s) 167 described above with reference to Figure 1A), and sensors 359 (for example, optical, acceleration, proximity, touch-sensitive, and/or contact intensity sensors similar to contact intensity sensor(s) 165 described above with reference to Figure 1A). Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 optionally includes one or more storage devices located remotely from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (Figure 1A), or a subset thereof. Furthermore, memory 370 optionally stores additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 optionally stores drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (Figure 1A) optionally does not store these modules.
Each of the above-identified elements in Figure 3 is optionally stored in one or more of the previously mentioned memory devices. Each of the above-identified modules corresponds to a set of instructions for performing a function described above. The above-identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are optionally combined or otherwise rearranged in various embodiments. In some embodiments, memory 370 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 370 optionally stores additional modules and data structures not described above.
Attention is now directed toward embodiments of user interfaces ("UI") that are, optionally, implemented on, for example, portable multifunction device 100.
Figure 4A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces are, optionally, implemented on device 300. In some embodiments, user interface 400 includes the following elements, or a subset or superset thereof:
Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
Time 404;
Bluetooth indicator 405;
Battery status indicator 406;
Tray 408 with icons for frequently used applications, such as:
ο Icon 416 for telephone module 138, labeled "Phone", which optionally includes an indicator 414 of the number of missed calls or voicemail messages;
ο Icon 418 for e-mail client module 140, labeled "Mail", which optionally includes an indicator 410 of the number of unread e-mails;
ο Icon 420 for browser module 147, labeled "Browser"; and
ο Icon 422 for video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152, labeled "iPod"; and
Icons for other applications, such as:
ο Icon 424 for IM module 141, labeled "Messages";
ο Icon 426 for calendar module 148, labeled "Calendar";
ο Icon 428 for image management module 144, labeled "Photos";
ο Icon 430 for camera module 143, labeled "Camera";
ο Icon 432 for online video module 155, labeled "Online Video";
ο Icon 434 for stocks widget 149-2, labeled "Stocks";
ο Icon 436 for map module 154, labeled "Maps";
ο Icon 438 for weather widget 149-1, labeled "Weather";
ο Icon 440 for alarm clock widget 149-4, labeled "Clock";
ο Icon 442 for workout support module 142, labeled "Workout Support";
ο Icon 444 for notes module 153, labeled "Notes"; and
ο Icon 446 for a settings application or module, which provides access to settings for device 100 and its various applications 136.
It should be noted that the icon labels illustrated in Figure 4A are merely exemplary. For example, in some embodiments, icon 422 for video and music player module 152 is labeled "Music" or "Music Player". Other labels are, optionally, used for various application icons. In some embodiments, a label for a respective application icon includes a name of the application corresponding to the respective application icon. In some embodiments, a label for a particular application icon is distinct from the name of the application corresponding to the particular application icon.
Figure 4B illustrates an exemplary user interface on a device (for example, device 300, Figure 3) with a touch-sensitive surface 451 (for example, a tablet or touchpad 355, Figure 3) that is separate from the display 450 (for example, touch-screen display 112). Although many of the examples that follow are given with reference to inputs on touch-screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in Figure 4B. In some embodiments, the touch-sensitive surface (for example, 451 in Figure 4B) has a primary axis (for example, 452 in Figure 4B) that corresponds to a primary axis (for example, 453 in Figure 4B) on the display (for example, 450). In accordance with these embodiments, the device detects contacts (for example, 460 and 462 in Figure 4B) with the touch-sensitive surface 451 at locations that correspond to respective locations on the display (for example, in Figure 4B, 460 corresponds to 468 and 462 corresponds to 470). In this way, user inputs (for example, contacts 460 and 462, and movements thereof) detected by the device on the touch-sensitive surface (for example, 451 in Figure 4B) are used by the device to manipulate the user interface on the display (for example, 450 in Figure 4B) of the multifunction device when the touch-sensitive surface is separate from the display. It should be understood that similar methods are, optionally, used for other user interfaces described herein.
Additionally, while the following examples are given primarily with reference to finger inputs (for example, finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (for example, a mouse-based input or a stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (for example, instead of a contact) followed by movement of the cursor along the path of the swipe (for example, instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (for example, instead of detecting the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or that a mouse and finger contacts are, optionally, used simultaneously.
As used herein, the term "focus selector" refers to an input element that indicates a current part of a user interface with which a user is interacting. In some implementations that include a cursor or other location marker, the cursor acts as a "focus selector", so that when an input (for example, a press input) is detected on a touch-sensitive surface (for example, touchpad 355 in Figure 3 or touch-sensitive surface 451 in Figure 4B) while the cursor is over a particular user interface element (for example, a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations that include a touch-screen display (for example, touch-sensitive display system 112 in Figure 1A or the touch screen in Figure 4A) enabling direct interaction with user interface elements on the touch-screen display, a detected contact on the touch screen acts as a "focus selector", so that when an input (for example, a press input by the contact) is detected on the touch-screen display at a location of a particular user interface element (for example, a button, window, slider, or other user interface element), the particular user interface element is adjusted in accordance with the detected input. In some implementations, focus is moved from one region of a user interface to another region of the user interface without corresponding movement of a cursor or movement of a contact on a touch-screen display (for example, by using a tab key or arrow keys to move focus from one button to another button); in these implementations, the focus selector moves in accordance with movement of focus between different regions of the user interface. Without regard to the specific form taken by the focus selector, the focus selector is generally the user interface element (or contact on a touch-screen display) that is controlled by the user so as to communicate the user's intended interaction with the user interface (for example, by indicating, to the device, the element of the user interface with which the user intends to interact). For example, the location of a focus selector (for example, a cursor, a contact, or a selection box) over a respective button while a press input is detected on the touch-sensitive surface (for example, a touchpad or touch screen) will indicate that the user intends to activate the respective button (as opposed to other user interface elements shown on a display of the device).
As used in the specification and claims, the term "intensity" of a contact on a touch-sensitive surface refers to the force or pressure (force per unit area) of a contact (for example, a finger contact or a stylus contact) on the touch-sensitive surface, or to a substitute (proxy) for the force or pressure of a contact on the touch-sensitive surface. The intensity of a contact has a range of values that includes at least four distinct values and more typically includes hundreds of distinct values (for example, at least 256). The intensity of a contact is, optionally, determined (or measured) using various approaches and various sensors or combinations of sensors. For example, one or more force sensors underneath or adjacent to the touch-sensitive surface are, optionally, used to measure force at various points on the touch-sensitive surface. In some implementations, force measurements from multiple force sensors are combined (for example, a weighted average or a sum) to determine an estimated force of a contact. Similarly, a pressure-sensitive tip of a stylus is, optionally, used to determine the pressure of the stylus on the touch-sensitive surface. Alternatively, the size of the contact area detected on the touch-sensitive surface and/or changes thereto, the capacitance of the touch-sensitive surface proximate to the contact and/or changes thereto, and/or the resistance of the touch-sensitive surface proximate to the contact and/or changes thereto are, optionally, used as a substitute for the force or pressure of the contact on the touch-sensitive surface. In some implementations, the substitute measurements for contact force or pressure are used directly to determine whether an intensity threshold has been exceeded (for example, the intensity threshold is described in units corresponding to the substitute measurements). In some implementations, the substitute measurements for contact force or pressure are converted to an estimated force or pressure, and the estimated force or pressure is used to determine whether an intensity threshold has been exceeded (for example, the intensity threshold is a pressure threshold measured in units of pressure). Using the intensity of a contact as an attribute of a user input allows user access to additional device functionality that may otherwise not be accessible by the user on a reduced-size device with limited real estate for displaying affordances and/or receiving user input (for example, via a touch-sensitive display, a touch-sensitive surface, or a physical/mechanical control such as a knob or a button).
In some embodiments, contact/motion module 130 uses a set of one or more intensity thresholds to determine whether an operation has been performed by a user (for example, to determine whether a user has "clicked" on an icon). In some embodiments, at least a subset of the intensity thresholds are determined in accordance with software parameters (for example, the intensity thresholds are not determined by the activation thresholds of particular physical actuators and can be adjusted without changing the physical hardware of device 100). For example, a mouse "click" threshold of a trackpad or touch-screen display can be set to any of a large range of predefined threshold values without changing the trackpad or touch-screen display hardware. Additionally, in some embodiments, a user of the device is provided with software settings for adjusting one or more of the set of intensity thresholds (for example, by adjusting individual intensity thresholds and/or by adjusting a plurality of intensity thresholds at once with a system-level click "intensity" parameter).
As used in the specification and claims, the term "characteristic intensity" of a contact refers to a characteristic of the contact based on one or more intensities of the contact. In some embodiments, the characteristic intensity is based on multiple intensity samples. The characteristic intensity is, optionally, based on a predefined number of intensity samples, or on a set of intensity samples collected during a predetermined time period (for example, 0.05, 0.1, 0.2, 0.5, 1, 2, 5, or 10 seconds) relative to a predefined event (for example, after detecting the contact, prior to detecting liftoff of the contact, before or after detecting a start of movement of the contact, prior to detecting an end of the contact, before or after detecting an increase in intensity of the contact, and/or before or after detecting a decrease in intensity of the contact). A characteristic intensity of a contact is, optionally, based on one or more of: a maximum value of the intensities of the contact, a median value of the intensities of the contact, an average value of the intensities of the contact, a top-10-percent value of the intensities of the contact, a value at the half maximum of the intensities of the contact, a value at 90 percent of the maximum of the intensities of the contact, or the like. In some embodiments, the duration of the contact is used in determining the characteristic intensity (for example, when the characteristic intensity is an average of the intensity of the contact over time). In some embodiments, the characteristic intensity is compared to a set of one or more intensity thresholds to determine whether an operation has been performed by a user. For example, the set of one or more intensity thresholds may include a first intensity threshold and a second intensity threshold. In this example, a contact with a characteristic intensity that does not exceed the first threshold results in a first operation, a contact with a characteristic intensity that exceeds the first intensity threshold and does not exceed the second intensity threshold results in a second operation, and a contact with a characteristic intensity that exceeds the second intensity threshold results in a third operation. In some embodiments, the comparison between the characteristic intensity and one or more intensity thresholds is used to determine whether or not to perform one or more operations (for example, whether to perform a respective operation or forgo performing the respective operation), rather than being used to determine whether to perform a first operation or a second operation.
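A rough numeric sketch of a characteristic intensity, computed from intensity samples collected over a predefined period and compared against two thresholds, might look as follows in Swift. The choice of the maximum as the characteristic, the sample values, and the threshold values are assumptions for illustration; the embodiments above list several other candidate statistics (median, average, top-10-percent value, and so on).

// Intensity samples collected over a predetermined time period (e.g., 0.1 s) for one contact.
let intensitySamples: [Double] = [0.05, 0.21, 0.38, 0.52, 0.47, 0.30]

// One possible characteristic intensity: the maximum of the samples.
// (A median, average, or other statistic could be used instead.)
let characteristicIntensity = intensitySamples.max() ?? 0.0

// Hypothetical first and second intensity thresholds.
let firstThreshold = 0.25
let secondThreshold = 0.60

// The comparison selects among operations, as in the example above.
if characteristicIntensity <= firstThreshold {
    print("perform first operation")
} else if characteristicIntensity <= secondThreshold {
    print("perform second operation")
} else {
    print("perform third operation")
}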
In some embodiments, a portion of a gesture is identified for purposes of determining a characteristic intensity. For example, a touch-sensitive surface may receive a continuous swipe contact transitioning from a start location and reaching an end location (for example, a drag gesture), at which point the intensity of the contact increases. In this example, the characteristic intensity of the contact at the end location may be based on only a portion of the continuous swipe contact, and not the entire swipe contact (for example, only the portion of the swipe contact at the end location). In some embodiments, a smoothing algorithm may be applied to the intensities of the swipe contact prior to determining the characteristic intensity of the contact. For example, the smoothing algorithm optionally includes one or more of: an unweighted sliding-average smoothing algorithm, a triangular smoothing algorithm, a median filter smoothing algorithm, and/or an exponential smoothing algorithm. In some circumstances, these smoothing algorithms eliminate narrow spikes or dips in the intensities of the swipe contact for purposes of determining a characteristic intensity.
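The following Swift fragment sketches one of the smoothing options mentioned above, an unweighted sliding-average filter applied to the swipe contact's intensities before the characteristic intensity is determined. The window size, the sample values, and the use of the end of the swipe are arbitrary assumptions for illustration.

// Unweighted sliding-average smoothing over a window of intensity samples.
// A narrow spike contributes only a fraction of its height to any smoothed value.
func slidingAverage(_ samples: [Double], window: Int = 3) -> [Double] {
    guard window > 0, !samples.isEmpty else { return samples }
    return samples.indices.map { i in
        let lower = max(0, i - window + 1)
        let slice = samples[lower...i]
        return slice.reduce(0, +) / Double(slice.count)
    }
}

// Example: the brief spike to 0.9 is damped before a characteristic intensity is taken.
let swipeIntensities = [0.10, 0.12, 0.90, 0.14, 0.15, 0.40, 0.45]
let smoothed = slidingAverage(swipeIntensities)
let endOfSwipeCharacteristic = smoothed.suffix(3).max() ?? 0.0   // based only on the end of the swipe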
The user interface figures described herein optionally include various intensity diagrams that show the current intensity of the contact on the touch-sensitive surface relative to one or more intensity thresholds (for example, a contact detection intensity threshold IT0, a light press intensity threshold ITL, a deep press intensity threshold ITD (for example, that is at least initially higher than IL), and/or one or more other intensity thresholds (for example, an intensity threshold IH that is lower than IL)). This intensity diagram is typically not part of the displayed user interface, but is provided to aid in the interpretation of the figures. In some embodiments, the light press intensity threshold corresponds to an intensity at which the device will perform operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, the deep press intensity threshold corresponds to an intensity at which the device will perform operations that are different from operations typically associated with clicking a button of a physical mouse or a trackpad. In some embodiments, when a contact is detected with a characteristic intensity below the light press intensity threshold (for example, and above a nominal contact-detection intensity threshold IT0, below which the contact is no longer detected), the device will move a focus selector in accordance with movement of the contact on the touch-sensitive surface without performing an operation associated with the light press intensity threshold or the deep press intensity threshold. Generally, unless otherwise stated, these intensity thresholds are consistent between different sets of user interface figures.
In some embodiments, the response of the device to inputs detected by the device depends on criteria based on the contact intensity during the input. For example, for some "light press" inputs, the intensity of a contact exceeding a first intensity threshold during the input triggers a first response. In some embodiments, the response of the device to inputs detected by the device depends on criteria that include both the contact intensity during the input and time-based criteria. For example, for some "deep press" inputs, the intensity of a contact exceeding a second intensity threshold during the input, greater than the first intensity threshold for a light press, triggers a second response only if a delay time has elapsed between meeting the first intensity threshold and meeting the second intensity threshold. This delay time is typically less than 200 ms in duration (for example, 40, 100, or 120 ms, depending on the magnitude of the second intensity threshold, with the delay time increasing as the second intensity threshold increases). This delay time helps to avoid accidental deep press inputs. As another example, for some "deep press" inputs, there is a reduced-sensitivity time period that occurs after the time at which the first intensity threshold is met. During the reduced-sensitivity time period, the second intensity threshold is increased. This temporary increase in the second intensity threshold also helps to avoid accidental deep press inputs. For other deep press inputs, the response to detecting a deep press input does not depend on time-based criteria.
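The time-based criteria for a "deep press" described above, a delay time between meeting the first and second intensity thresholds plus a reduced-sensitivity period during which the second threshold is raised, can be sketched as follows in Swift. All constants and the shape of the functions are illustrative assumptions, not values taken from the embodiments.

// Hypothetical constants for illustration only.
let lightPressThreshold = 0.3        // first intensity threshold
let baseDeepPressThreshold = 0.6     // second intensity threshold
let delayTime = 0.10                 // seconds that must elapse after the first threshold is met
let reducedSensitivityPeriod = 0.25  // seconds during which the second threshold is raised
let sensitivityBoost = 0.15          // temporary increase of the second threshold

// Effective second threshold at t seconds after the first threshold was met.
func effectiveDeepPressThreshold(timeSinceFirstMet t: Double) -> Double {
    t < reducedSensitivityPeriod ? baseDeepPressThreshold + sensitivityBoost : baseDeepPressThreshold
}

// A second (deep press) response triggers only after the delay time has elapsed
// and the current intensity exceeds the possibly raised second threshold.
func deepPressTriggered(intensity: Double, timeSinceFirstMet t: Double) -> Bool {
    t >= delayTime && intensity > effectiveDeepPressThreshold(timeSinceFirstMet: t)
}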
In some embodiments, one or more of the input intensity thresholds and/or the corresponding outputs vary based on one or more factors, such as user settings, contact motion, input timing, the application that is running, the rate at which the intensity is applied, the number of concurrent inputs, user history, environmental factors (for example, ambient noise), focus selector position, and the like. Exemplary factors are described in U.S. Patent Application Serial Nos. 14/399,606 and 14/624,296, which are incorporated by reference herein in their entireties.
For example, Figure 4C illustrates a dynamic intensity threshold 480 that changes over time based in part on the intensity of touch input 476 over time. Dynamic intensity threshold 480 is a sum of two components: a first component 474 that decays over time after a predefined delay time p1 from when touch input 476 is initially detected, and a second component 478 that trails the intensity of touch input 476 over time. The initial high intensity threshold of first component 474 reduces accidental triggering of a "deep press" response, while still allowing an immediate "deep press" response if touch input 476 provides sufficient intensity. Second component 478 reduces unintentional triggering of a "deep press" response by gradual intensity fluctuations in a touch input, while still allowing an immediate "deep press" response if touch input 476 provides sufficient intensity. In some embodiments, when touch input 476 satisfies dynamic intensity threshold 480 (for example, at point 481 in Figure 4C), the "deep press" response is triggered.
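A loose numerical model of the dynamic threshold of Figure 4C, the sum of a first component that decays after a predefined delay p1 and a second component that trails the touch intensity, might be written as below in Swift. The decay rate, the trailing fraction, and the specific values are invented for illustration and do not correspond to the figure's actual curve.

import Foundation

// Illustrative parameters (not taken from the embodiments).
let p1 = 0.1                 // predefined delay before the first component starts to decay, in seconds
let initialHighThreshold = 0.8
let decayRate = 4.0          // exponential decay rate of the first component
let trailingFraction = 0.6   // how strongly the second component follows the touch intensity

func firstComponent(at t: Double) -> Double {
    t <= p1 ? initialHighThreshold : initialHighThreshold * exp(-decayRate * (t - p1))
}

// recentIntensity stands in for a lagged/filtered copy of the touch input's intensity.
func secondComponent(recentIntensity: Double) -> Double {
    trailingFraction * recentIntensity
}

// The dynamic threshold is the sum of both components; a "deep press" response is
// triggered when the current intensity meets it.
func deepPressResponds(intensity: Double, recentIntensity: Double, at t: Double) -> Bool {
    intensity >= firstComponent(at: t) + secondComponent(recentIntensity: recentIntensity)
}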
Figure 4D illustrates another dynamic intensity threshold 486 (for example, intensity threshold ID). Figure 4D also illustrates two other intensity thresholds: a first intensity threshold IH and a second intensity threshold IL. In Figure 4D, although touch input 484 satisfies the first intensity threshold IH and the second intensity threshold IL prior to time p2, no response is provided until delay time p2 has elapsed at time 482. Also in Figure 4D, dynamic intensity threshold 486 decays over time, with the decay starting at time 488, after a predefined delay time p1 has elapsed from time 482 (when the response associated with the second intensity threshold IL was triggered). This type of dynamic intensity threshold reduces accidental triggering of a response associated with the dynamic intensity threshold ID immediately after, or concurrently with, triggering a response associated with a lower intensity threshold, such as the first intensity threshold IH or the second intensity threshold IL.
Figure 4E illustrates yet another dynamic intensity threshold 492 (for example, intensity threshold ID). In Figure 4E, a response associated with intensity threshold IL is triggered after delay time p2 has elapsed from when touch input 490 is initially detected. Concurrently, dynamic intensity threshold 492 decays after predefined delay time p1 has elapsed from when touch input 490 is initially detected. Thus, a decrease in the intensity of touch input 490 after triggering the response associated with intensity threshold IL, followed by an increase in the intensity of touch input 490, without releasing touch input 490, can trigger a response associated with intensity threshold ID (for example, at time 494), even when the intensity of touch input 490 is below another intensity threshold, for example, intensity threshold IL.
An increase of the characteristic intensity of a contact from an intensity below the light press intensity threshold ITL to an intensity between the light press intensity threshold ITL and the deep press intensity threshold ITD is sometimes referred to as a "light press" input. An increase of the characteristic intensity of a contact from an intensity below the deep press intensity threshold ITD to an intensity above the deep press intensity threshold ITD is sometimes referred to as a "deep press" input. An increase of the characteristic intensity of a contact from an intensity below the contact-detection intensity threshold IT0 to an intensity between the contact-detection intensity threshold IT0 and the light press intensity threshold ITL is sometimes referred to as detecting the contact on the touch surface. A decrease of the characteristic intensity of a contact from an intensity above the contact-detection intensity threshold IT0 to an intensity below the contact-detection intensity threshold IT0 is sometimes referred to as detecting liftoff of the contact from the touch surface. In some embodiments, IT0 is zero. In some embodiments, IT0 is greater than zero. In some illustrations, a shaded circle or oval is used to represent the intensity of a contact on the touch-sensitive surface. In some illustrations, a circle or oval without shading is used to represent a respective contact on the touch-sensitive surface without specifying the intensity of the respective contact.
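The transition vocabulary just defined (contact detection, light press, deep press, liftoff) can be summarized as a small classifier over consecutive characteristic intensities, sketched below in Swift. The numeric values for IT0, ITL, and ITD are placeholders; only the threshold names come from the text.

// Placeholder values for the named thresholds.
let IT0 = 0.02   // contact-detection intensity threshold
let ITL = 0.30   // light press intensity threshold
let ITD = 0.60   // deep press intensity threshold

enum IntensityTransition {
    case contactDetected, liftoff, lightPress, deepPress, none
}

// Classifies the transition implied by a previous and a current characteristic intensity.
func classify(previous: Double, current: Double) -> IntensityTransition {
    if previous < ITD && current >= ITD { return .deepPress }
    if previous < ITL && current >= ITL && current < ITD { return .lightPress }
    if previous < IT0 && current >= IT0 && current < ITL { return .contactDetected }
    if previous >= IT0 && current < IT0 { return .liftoff }
    return .none
}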
In some embodiments described herein, one or more operations are performed in response to detecting a gesture that includes a respective press input, or in response to detecting the respective press input performed with a respective contact (or a plurality of contacts), where the respective press input is detected based at least in part on detecting an increase in intensity of the contact (or plurality of contacts) above a press-input intensity threshold. In some embodiments, the respective operation is performed in response to detecting the increase in intensity of the respective contact above the press-input intensity threshold (for example, the respective operation is performed on a "down stroke" of the respective press input). In some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the press-input threshold (for example, the respective operation is performed on an "up stroke" of the respective press input).
In some embodiments, the device employs intensity hysteresis to avoid accidental inputs sometimes termed "jitter", where the device defines or selects a hysteresis intensity threshold with a predefined relationship to the press-input intensity threshold (for example, the hysteresis intensity threshold is X intensity units lower than the press-input intensity threshold, or the hysteresis intensity threshold is 75%, 90%, or some reasonable proportion of the press-input intensity threshold). Thus, in some embodiments, the press input includes an increase in intensity of the respective contact above the press-input intensity threshold and a subsequent decrease in intensity of the contact below the hysteresis intensity threshold that corresponds to the press-input intensity threshold, and the respective operation is performed in response to detecting the subsequent decrease in intensity of the respective contact below the hysteresis intensity threshold (for example, the respective operation is performed on an "up stroke" of the respective press input). Similarly, in some embodiments, the press input is detected only when the device detects an increase in intensity of the contact from an intensity at or below the hysteresis intensity threshold to an intensity at or above the press-input intensity threshold and, optionally, a subsequent decrease in intensity of the contact to an intensity at or below the hysteresis intensity, and the respective operation is performed in response to detecting the press input (for example, the increase in intensity of the contact or the decrease in intensity of the contact, depending on the circumstances).
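The hysteresis behavior described above, where the press is recognized on crossing the press-input threshold and the "up stroke" is recognized only on dropping below a lower hysteresis threshold, is sketched below as a two-state tracker in Swift. The 75% relationship is one of the example proportions from the text; the threshold value and the intensity trace are invented for illustration.

// Press-input threshold and a hysteresis threshold at a predefined proportion of it (75% here).
struct PressDetector {
    let pressThreshold: Double
    var hysteresisThreshold: Double { pressThreshold * 0.75 }
    var pressed = false

    enum Edge { case downStroke, upStroke, none }

    // Feed successive contact intensities; jitter between the two thresholds produces no edges.
    mutating func feed(intensity: Double) -> Edge {
        if !pressed && intensity >= pressThreshold {
            pressed = true
            return .downStroke          // respective operation optionally performed here
        }
        if pressed && intensity <= hysteresisThreshold {
            pressed = false
            return .upStroke            // or here, on the "up stroke"
        }
        return .none
    }
}

var detector = PressDetector(pressThreshold: 0.5)
let edges = [0.2, 0.55, 0.48, 0.52, 0.3, 0.1].map { detector.feed(intensity: $0) }
// edges: [none, downStroke, none, none, upStroke, none]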
For ease of explanation, the descriptions of operations performed in response to a press input associated with a press-input intensity threshold, or in response to a gesture including the press input, are optionally triggered in response to detecting any of the following: an increase in intensity of a contact above the press-input intensity threshold, an increase in intensity of a contact from an intensity below the hysteresis intensity threshold to an intensity above the press-input intensity threshold, a decrease in intensity of the contact below the press-input intensity threshold, or a decrease in intensity of the contact below the hysteresis intensity threshold corresponding to the press-input intensity threshold. Additionally, in examples where an operation is described as being performed in response to detecting a decrease in intensity of a contact below the press-input intensity threshold, the operation is, optionally, performed in response to detecting a decrease in intensity of the contact below a hysteresis intensity threshold corresponding to, and lower than, the press-input intensity threshold. As described above, in some embodiments, the triggering of these responses also depends on time-based criteria being met (for example, a delay time has elapsed between a first intensity threshold being met and a second intensity threshold being met).
User Interfaces and Associated Processes
Attention is now directed toward embodiments of user interfaces ("UI") and associated processes that may be implemented on an electronic device, such as portable multifunction device 100 or device 300, with a display, a touch-sensitive surface, and, optionally, one or more sensors to detect intensities of contacts with the touch-sensitive surface.
Figures 5A-5K illustrate exemplary user interfaces for capturing a grouped sequence of related images in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. For ease of explanation, some of the embodiments are discussed with reference to operations performed on a device with a touch-sensitive display system 112. In such embodiments, the focus selector is, optionally: a respective finger or stylus contact, a representative point corresponding to a finger or stylus contact (for example, a centroid of a respective contact or a point associated with a respective contact), or a centroid of two or more contacts detected on touch-sensitive display system 112. However, analogous operations are, optionally, performed on a device with a display 450 and a separate touch-sensitive surface 451, in response to detecting the contacts on touch-sensitive surface 451 while displaying the user interfaces shown in the figures on display 450, along with a focus selector.
Figure 5A illustrates a media capture user interface 500 for displaying a live preview on touch screen 112. The live preview illustrated in Figure 5A is a preview of images obtained from a camera of portable multifunction device 100 (for example, camera module 143 together with optical sensor 164). The live preview in media capture user interface 500 displays the images obtained from the camera in real time or near real time (for example, within an amount of processing time required for portable multifunction device 100 to produce the displayed image). Thus, in the example illustrated in Figure 5A, the user is looking at a scene in which a seagull 502 is flying in the sky above a tree 504, and portable multifunction device 100 reproduces the scene on touch screen 112 in real time or near real time. In some embodiments, the live preview displays images at a first resolution (for example, lower than an upper resolution limit of the camera).
In this example, portable multifunction device 100, while in the live preview, is configured to be in an enhanced media acquisition mode (for example, in which portable multifunction device 100 is configured to obtain enhanced photos) or in another media acquisition mode (for example, in which portable multifunction device 100 is configured to capture still images, videos, bursts of images, or any other type of image). In some embodiments, media capture user interface 500 includes an affordance 506 for enabling the enhanced media acquisition mode (for example, for switching the enhanced media acquisition mode on/off). In some embodiments, media capture user interface 500 includes a visual indication that the enhanced media acquisition mode is off. For example, in Figure 5A, affordance 506 displays the word "OFF".
In some embodiments, when the enhanced media acquisition mode is on, portable multifunction device 100 provides a visual indication that the enhanced media acquisition mode is on (for example, to indicate that image and/or audio data is being captured while media capture user interface 500 is displayed). For example, as shown in Figures 5C-5H, while the enhanced media acquisition mode is on, affordance 506 is animated with an animation showing a clock with a hand 508 advancing around the clock.
In some embodiments, as shown in Figure 5B, portable multifunction device 100 detects selection of affordance 506 (for example, detects a tap gesture 510 on affordance 506) while the enhanced media acquisition mode is disabled. In response, portable multifunction device 100 enables the enhanced media acquisition mode (as indicated by the animation of affordance 506 in Figures 5C-5H).
Portable multifunction device 100 captures media (for example, images and/or audio) while the enhanced media acquisition mode is on. For example, because the enhanced media acquisition mode is on in Figures 5C-5E, image 512-1 (Figure 5C), image 512-2 (Figure 5D), and image 512-3 (Figure 5E) are captured (for example, stored in persistent memory). In some embodiments, audio corresponding to the images is also captured (for example, with microphone 113) and associated with the images (for example, for subsequent playback along with the images, as shown in Figures 6E-6I). In some embodiments, other information (for example, metadata such as time, location, or event data) is also obtained and associated with the captured images (for example, for subsequent display, as shown in Figures 6J-6M).
Media capture user interface 500 includes a shutter button 514 (illustrated as a shutter-release icon). As illustrated in Fig. 5F, media capture user interface 500 is configured to detect activation of shutter button 514 (e.g., via a tap gesture 518). In response to detecting the activation of shutter button 514, portable multifunction device 100 groups a plurality of images 512, acquired by the camera at times proximate to the activation of shutter button 514, into an image sequence (e.g., a so-called "enhanced photo"). The enhanced photo includes some images 512 taken before tap gesture 518 (e.g., at least some of image 512-1, image 512-2, and image 512-3, which, as noted above, are stored in persistent memory), a representative image (e.g., image 512-4 of Fig. 5F, corresponding to the shutter activation), and some images taken after tap gesture 518 (e.g., image 512-5 of Fig. 5G and image 512-6 of Fig. 5H).
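The grouping described above can be pictured as a rolling pre-capture buffer combined with a short post-capture burst. The following Swift sketch is illustrative only; the type names and the window sizes are assumptions, not part of the disclosed device.

```swift
import Foundation

/// A minimal sketch of grouping frames around a shutter activation into an
/// "enhanced photo" (a grouped image sequence). Names and window sizes are
/// illustrative assumptions.
struct Frame {
    let timestamp: TimeInterval
    let pixelData: Data
}

struct EnhancedPhoto {
    let images: [Frame]            // chronological order
    let representativeIndex: Int   // index of the representative image
}

final class EnhancedCaptureSession {
    private var preCaptureBuffer: [Frame] = []   // frames kept while the live preview runs
    private let preWindow: TimeInterval = 1.5    // keep ~1.5 s before the shutter
    private let postWindow: TimeInterval = 1.5   // keep ~1.5 s after the shutter

    /// Called for every frame delivered while the enhanced media acquisition mode is on.
    func didCapture(_ frame: Frame) {
        preCaptureBuffer.append(frame)
        // Drop frames that fall outside the rolling pre-capture window.
        let cutoff = frame.timestamp - preWindow
        preCaptureBuffer.removeAll { $0.timestamp < cutoff }
    }

    /// Called when the shutter button is activated; `postFrames` are the frames
    /// collected during the post-capture window after the activation.
    func groupEnhancedPhoto(shutterTime: TimeInterval,
                            representative: Frame,
                            postFrames: [Frame]) -> EnhancedPhoto {
        let before = preCaptureBuffer.filter { $0.timestamp <= shutterTime }
        let after = postFrames.filter { $0.timestamp <= shutterTime + postWindow }
        let images = before + [representative] + after
        return EnhancedPhoto(images: images, representativeIndex: before.count)
    }
}
```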
In some embodiments, the representative image is analogous to the single image captured in still-image mode when the shutter button of a conventional digital camera is activated. In some embodiments, representative image 512-4 corresponds to an image acquired at the time shutter button 514 was activated by tap gesture 518. In some embodiments, representative image 512-4 corresponds to an image acquired shortly after the activation of shutter button 514 was detected, at a time that accounts for shutter lag (the time delay between detecting the activation of the shutter button and capturing/storing the representative image). In some embodiments, the representative image 512-4 acquired by the camera is used to represent the image sequence, for example in an image presentation mode (as shown in Fig. 6A).
As indicated above, in some embodiments the live preview displays images at a first resolution. In some embodiments, image sequence 512 includes images, at the first resolution, that were displayed in the live preview, while the representative image 512-4 acquired by the camera has a second resolution that is higher than the first resolution. For example, as shown in Fig. 5I, image sequence 512 includes (in chronological order): image 512-2, image 512-3, image 512-4, image 512-5, and image 512-6, where image 512-4 is the representative image. In some embodiments, representative image 512-4 is stored at a higher resolution than image 512-2, image 512-3, image 512-5, or image 512-6.
As shown in Figs. 5F-5H, in some embodiments, after the activation of shutter button 514, media capture user interface 500 displays an animation while it captures the remaining images that will be included in the grouped image sequence (e.g., portable multifunction device 100 displays the animation while it captures representative image 512-4 and the images acquired after representative image 512-4). In Figs. 5F-5H, media capture user interface 500 displays an animation of shutter button 514 breaking apart and flying back together (e.g., thereby providing the user with an indication that images and/or audio are still being captured). In some embodiments, the animation is a looping animation that can be seamlessly extended if shutter button 514 is held down or activated again before the camera finishes acquiring the images for the image sequence.
In some embodiments, upon completing capture of the image sequence, portable multifunction device 100 returns to the functionality described with reference to Fig. 5A, so that a second image sequence can be obtained by the user in a manner analogous to the capture of the image sequence described above.
As shown in Figs. 5I-5K, in some embodiments, portable multifunction device 100 displays a second user interface 520 for editing and/or configuring the image sequence (e.g., second user interface 520 is a user interface in an image sequence editing mode). In Fig. 5I, the images 512 included in the image sequence are those with solid boundaries: image 512-2, image 512-3, image 512-4, image 512-5, and image 512-6, where image 512-4 is the representative image. Thus, image 512-2 is the initial image in the image sequence, and there is one image (image 512-3) between initial image 512-2 and representative image 512-4 (although, in some embodiments, there is a larger integer number of images between the initial image and the representative image, such as 5, 10, or 30 images). Image 512-6 is the final image in the image sequence, and there is one image (image 512-5) between representative image 512-4 and final image 512-6 (although, in some embodiments, there is a larger integer number of images between the representative image and the final image, such as 5, 10, or 30 images, and this number need not be the same as the number of images between the initial image and the representative image). The bold boundary around image 512-4 in Fig. 5I indicates that it is the representative image.
As shown in Fig. 5J, second user interface 520 is configured to receive a request to change the representative image in the image sequence (e.g., to receive a touch gesture 522 on an image that is not the current representative image 512-4). As shown in Fig. 5K, the device responds to touch gesture 522 by changing the representative image to image 512-3 (which has a bold boundary in Fig. 5K, meaning that it is the new representative image). In some embodiments, the number of images between the initial image and the representative image and the number of images between the representative image and the final image are fixed, so that portable multifunction device 100 changes the image sequence by adding an image to the image sequence at one end and removing (e.g., deleting or excluding) an image at the other end. For example, in Fig. 5K, image 512-1 is added to the image sequence to keep the number of images between the initial image and the representative image fixed, while image 512-6 is removed from the image sequence to keep the number of images between the representative image and the final image fixed.
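One way to realize this fixed-count behavior is to treat the stored frames as a longer buffer and slide a window over it when the representative image changes. The following Swift sketch is an assumption about one possible implementation; `allFrames`, the window sizes, and the function name are illustrative only.

```swift
/// A minimal sketch, assuming the camera keeps a longer buffer of frames than
/// the grouped sequence exposes. When the representative image changes, the
/// exposed window slides so the counts on each side stay fixed.
struct SequenceWindow {
    let firstIndex: Int          // index of the initial image in the buffer
    let lastIndex: Int           // index of the final image in the buffer
    let representativeIndex: Int // index of the representative image in the buffer
}

func rewindow(allFramesCount: Int,
              newRepresentativeIndex: Int,
              imagesBeforeRepresentative: Int,
              imagesAfterRepresentative: Int) -> SequenceWindow {
    // Clamp to the available buffer so the window never runs off either end.
    let first = max(0, newRepresentativeIndex - imagesBeforeRepresentative)
    let last = min(allFramesCount - 1, newRepresentativeIndex + imagesAfterRepresentative)
    return SequenceWindow(firstIndex: first,
                          lastIndex: last,
                          representativeIndex: newRepresentativeIndex)
}
```

Under this sketch, moving the representative image one frame earlier (as in Fig. 5K) slides the window one frame earlier, which adds image 512-1 at the start and drops image 512-6 at the end.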
Figs. 6A-6FF illustrate exemplary user interfaces for displaying (or replaying) a grouped sequence of related images (sometimes referred to as an enhanced photo), in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined, as shown on portable multifunction device 100), in some embodiments the device detects inputs on a touch-sensitive surface 451 that is separate from display 450, as shown in Fig. 4B.
Fig. 6A illustrates a user interface 600. Portable multifunction device 100 displays, in user interface 600, a representative image 602-3 in a grouped image sequence 602. In some embodiments, user interface 600 is a user interface in an image presentation mode. As explained below, image sequence 602 includes representative image 602-3, one or more images acquired by the camera after acquiring the representative image (e.g., image 602-4 of Fig. 6C and image 602-5 of Fig. 6D), and one or more images acquired by the camera before acquiring the representative image (e.g., image 602-1 of Fig. 6E and image 602-2 of Fig. 6F).
In some embodiments, user interface 600 is a user interface in an image management application (e.g., "Photos" from Apple Inc. of Cupertino, California). In some such embodiments, the camera that took image sequence 602 is part of portable multifunction device 100 (e.g., the camera comprises optical sensors 164 of Fig. 1A, together with imaging module 143). In some embodiments, image sequence 602 was taken by a camera that is not part of portable multifunction device 100 (e.g., image sequence 602 was taken with a camera on another device and then transferred to portable multifunction device 100). In some embodiments, image sequence 602 was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figs. 5A-5K and method 900 and/or Figs. 22A-22D and method 2600. In some embodiments, representative image 602-3 corresponds to the representative image acquired by the camera, as described herein with reference to Figs. 5A-5K and method 900 and/or Figs. 22A-22D and method 2600.
In some embodiments, portable multifunction device 100 stores a plurality of grouped image sequences, some of which were acquired using portable multifunction device 100 and some of which were taken with a camera on a different device and then transferred to portable multifunction device 100. For example, in some circumstances, a user may obtain (e.g., take, capture) image sequences, as described with respect to methods 900/2600, on multiple devices (e.g., a tablet computer, a laptop, and/or a digital camera, all distinct from portable multifunction device 100) and synchronize or otherwise transfer those image sequences onto portable multifunction device 100.
In some embodiments, user interface 600 is a user interface in a messaging application (e.g., "Messages" from Apple Inc. of Cupertino, California). In some circumstances, a user may have obtained (e.g., taken, captured) a respective image sequence on her own portable multifunction device 100, and also received (e.g., in the messaging application) a different image sequence from a different user. Thus, in some embodiments, image sequence 602 is a respective image sequence in a plurality of image sequences stored on portable multifunction device 100, the plurality including at least one image sequence obtained using portable multifunction device 100 and at least one image sequence obtained using a camera on a distinct device that is entirely different from portable multifunction device 100.
In some embodiments, representative image 602-3 is displayed in user interface 600 when portable multifunction device 100 is in a collection-view mode.
User interface 600 optionally includes one or more toolbars. For example, as shown, user interface 600 includes an operations toolbar 604 that comprises a plurality of affordances 606 (e.g., a send affordance 606-1 that allows the user to send image sequence 602 to other users using email, messaging, or other applications; an edit affordance 606-2 that brings up a user interface for editing image sequence 602; a favorites affordance 606-3 by which the user can indicate that image sequence 602 is one of her favorites; and a delete affordance 606-4 that allows the user to delete image sequence 602). As another example, user interface 600 includes a navigation toolbar 608 that comprises another plurality of affordances (e.g., an all-photos affordance 610-1 that, when activated, navigates to the user's photos; and a "done" affordance 610-2 that, when activated, navigates to a different user interface, such as a user interface for obtaining photos).
The image sequence 602 shown in Figs. 6A-6V depicts a scene in which a cat 612 walks into the field of view, rolls its back on the ground, and then stands up and walks away. Meanwhile, a chirping bird 614 lands on a branch. While in reality such a scene may take several seconds to unfold, in some embodiments image sequence 602 is captured in a short time window. For example, in some embodiments, image sequence 602 depicts the moment around the time (e.g., within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds) at which representative image 602-3 was obtained. For example, the user's interest may have been piqued when cat 612 started rolling in the grass, prompting the user to take representative image 602-3. In some embodiments, image sequence 602 includes images from just before and just after representative image 602-3 was obtained, so that image sequence 602 comprises an enhanced photo that can "come to life" for a moment when the user performs certain operations (as described herein) with respect to representative image 602-3.
Fig. 6B illustrates a first portion 616-1 of a first input 616 that portable multifunction device 100 detects while displaying user interface 600. Specifically, while portable multifunction device 100 displays representative image 602-3 in user interface 600 (which is displayed on touch screen 112), portable multifunction device 100 detects first portion 616-1 of first input 616. In some embodiments, the operations illustrated in Figs. 6B-6O are performed in accordance with a determination that first input 616 meets predefined criteria (e.g., predefined enhanced-photo display criteria). For example, in some embodiments, the operations illustrated in Figs. 6B-6O are performed (e.g., triggered) when first input 616, or first portion 616-1, is a press-and-hold gesture (as shown in Figs. 6B-6O). In some embodiments, portable multifunction device 100 includes one or more sensors for detecting the intensity of contacts with touch screen 112, and the operations illustrated in Figs. 6B-6O are performed (e.g., triggered) when first input 616 has a characteristic intensity that meets (e.g., reaches) predefined intensity criteria (e.g., first input 616 exceeds the light press threshold ITL, as shown in intensity diagram 618; intensity diagram 618 is not part of displayed user interface 600, but is a schematic provided to aid the explanation of the figures). In some embodiments, the operations illustrated in Figs. 6B-6O are performed (e.g., triggered) when first input 616, or first portion 616-1, has a predefined path characteristic (e.g., stationary, as is the case for a press-and-hold gesture, or substantially linear, as is the case for a swipe/drag gesture) and meets predefined intensity criteria (e.g., exceeds a predefined intensity threshold). For purposes of explanation, the operations illustrated in Figs. 6B-6O are described below as being triggered by a press-and-hold gesture that exceeds the light press threshold ITL, as shown in intensity diagram 618.
Fig. 6C illustrates the response of portable multifunction device 100 to a continuation (from Fig. 6B) of first portion 616-1 of first input 616, detected while portable multifunction device 100 displays user interface 600. Specifically, as shown in Figs. 6B-6D, in response to detecting first portion 616-1 of first input 616, portable multifunction device 100 replaces the display of the representative image in user interface 600 with the display, in user interface 600, of the one or more images acquired by the camera after acquiring representative image 602-3. According to some embodiments, the one or more images acquired by the camera after acquiring representative image 602-3 are displayed in sequence while first portion 616-1 of first input 616 is detected. To that end, Fig. 6C illustrates the display of image 602-4, which is the next image acquired after representative image 602-3 in image sequence 602. In image 602-4, cat 612 has stood up after rolling its back on the ground and has started to walk away. Bird 614 remains perched in the tree. Thus, image 602-4 is an image taken after respective image 602-3.
Fig. 6D illustrates the response of portable multifunction device 100 to a continuation (from Fig. 6C) of first portion 616-1 of first input 616, detected while portable multifunction device 100 displays user interface 600. In Fig. 6D, portable multifunction device 100 replaces the display of image 602-4 in user interface 600 with the display, in user interface 600, of image 602-5, which is the last image acquired by the camera after representative image 602-3 in image sequence 602. Thus, Figs. 6A-6D illustrate an example in which there are two images in image sequence 602 that were acquired after representative image 602-3. It should be appreciated, however, that in various embodiments and/or circumstances, an image sequence may include a different (e.g., integer) number of images acquired by the camera after acquiring representative image 602-3 (e.g., 2, 5, 10, or 20 images).
In image 602-5, cat 612 has partly walked out of the field of view and bird 614 remains perched in the tree (e.g., image 602-5 is an image taken after respective image 602-4). Thus, Figs. 6B-6D illustrate an example in which, according to some embodiments, a sufficiently deep press-and-hold gesture causes the enhanced photo to start playing forward from the representative image, thereby creating the impression that the image has come to life. In some embodiments, unless first input 616 is terminated during first portion 616-1, first portion 616-1 of first input 616 lasts for the amount of time it takes to sequentially replace all of the images in image sequence 602 that were acquired by the camera after acquiring representative image 602-3 (e.g., first portion 616-1 has a duration equal to the amount of time taken to sequentially replace all of the images in image sequence 602 acquired after representative image 602-3). In such embodiments, the part of first input 616 that occurs after the amount of time taken to sequentially replace all of the images acquired after representative image 602-3 is not considered part of first portion 616-1, but is instead considered a later portion of first input 616, as described below.
In some embodiments, in response to detecting first portion 616-1 of first input 616, the one or more images acquired by the camera after acquiring representative image 602-3 are displayed sequentially at a rate that is based on the intensity of the contact in first portion 616-1 of first input 616, as shown in intensity diagram 618 (e.g., the rate of display increases as the intensity of the contact in first portion 616-1 of first input 616 increases, and the rate of display decreases as the intensity of the contact in first portion 616-1 of first input 616 decreases). In some embodiments, during the sequential display of image sequence 602 during first portion 616-1 of first input 616, portable multifunction device 100 dwells on each respective image in image sequence 602 for a duration that is proportional (or inversely proportional) to the characteristic intensity of first input 616 while the respective image is displayed. So, for example, in such embodiments, portable multifunction device 100 dwells on representative image 602-3 (Fig. 6B) and image 602-4 (Fig. 6C) for a shorter period of time than on image 602-5 (Fig. 6D), because the intensity of first input 616 is higher during the display of representative image 602-3 (Fig. 6B) and image 602-4 (Fig. 6C) than during the display of image 602-5 (Fig. 6D), as shown by the intensity diagrams 618 in the corresponding figures.
In some embodiments, after this initial dependence of the display rate on the contact intensity in first input 616, subsequent display of image sequence 602 (in response to detecting later portions of first input 616, described below) occurs at a fixed display rate, independent of the contact intensity in the later portions of first input 616. So, for example, portable multifunction device 100 dwells on image 602-1 (Fig. 6E) and image 602-2 (Fig. 6F) for equal durations, despite the difference in intensity of first input 616 shown in the intensity diagrams 618 of the corresponding figures.
In some embodiments, as described below with reference to Figs. 6E-6I, after portable multifunction device 100 displays the one or more images acquired by the camera after acquiring representative image 602-3 in response to detecting first portion 616-1 of first input 616, device 100 loops back and displays the entire image sequence 602 in response to a second portion 616-2 of first input 616 (or displays the entire image sequence 602 for as long as first input 616 and/or its intensity is maintained). In some embodiments, when image sequence 602 is looped or displayed again, a cross-fade animation is displayed from the end of image sequence 602 to the start of image sequence 602.
Fig. 6E illustrates a situation in which, after detecting first portion 616-1 of first input 616, portable multifunction device 100 detects a second portion 616-2 of first input 616 (e.g., portable multifunction device 100 continues to detect sufficient contact and/or intensity in the finger gesture). In response to detecting second portion 616-2 of first input 616 (as shown in Figs. 6E-6I), portable multifunction device 100 displays, in sequence in user interface 600, the one or more images acquired by the camera before acquiring representative image 602-3 (e.g., image 602-1 of Fig. 6E and image 602-2 of Fig. 6F), representative image 602-3 (Fig. 6G), and the one or more images acquired by the camera after acquiring the representative image (e.g., image 602-4 of Fig. 6H and image 602-5 of Fig. 6I). Thus, in some embodiments, in response to detecting second portion 616-2 of first input 616, the entire image sequence 602 is displayed from its initial image to its final image (unless, for example, first input 616 is interrupted).
In some embodiments, second portion 616-2 of first input 616 is a portion that is continuous with, and immediately follows, first portion 616-1 of first input 616. In some embodiments, unless first input 616 is terminated during second portion 616-2, second portion 616-2 of first input 616 lasts for the amount of time it takes to sequentially replace all of the images in image sequence 602 (e.g., second portion 616-2 has a duration equal to the amount of time taken to sequentially replace all of the images in image sequence 602).
In image 602-1 (Fig. 6E), cat 612 is just entering the field of view and bird 614 has not yet landed on the branch. In image 602-2 (Fig. 6F), cat 612 has fully entered the field of view and bird 614 has landed on the branch. Thus, image 602-2 is an image taken after image 602-1, and both image 602-1 and image 602-2 were taken before representative image 602-3 (Fig. 6G). (A respective image is the same in each figure in which it is displayed. For example, image 602-4 is identical in Fig. 6C and Fig. 6H. For brevity, aspects of these figures described with reference to other figures are not repeated.)
In some embodiments, one difference between the sequential display, during first portion 616-1 of first input 616, of the one or more images acquired by the camera after acquiring representative image 602-3 (as shown in Figs. 6B-6D) and the sequential display, during second portion 616-2 of first input 616, of the entire image sequence 602 (as shown in Figs. 6E-6I) is that, in response to detecting second portion 616-2 of first input 616, portable multifunction device 100 presents (e.g., via speaker 111) audio 620 corresponding to image sequence 602. This is illustrated in Figs. 6F-6I by the chirp emanating from bird 614. (In this example, the chirp does not appear in the image, but is provided in the figures to indicate the audio produced by speaker 111.) In some embodiments, in response to detecting second portion 616-2 of first input 616, the entire image sequence 602 is displayed together with the corresponding audio 620 that was recorded when image sequence 602 was acquired. In some embodiments, audio is not presented in response to detecting first portion 616-1 of first input 616. In some embodiments, if first input 616 is maintained after the first complete playback of image sequence 602 (e.g., in response to detecting second portion 616-2 of first input 616), the audio is not presented again during subsequent playbacks of the sequence performed in response to continuing to detect first input 616 (as explained with reference to Figs. 6J-6M, which illustrate a second playback of the entire image sequence 602). In some embodiments, for a given input, audio is presented only during the first complete playback of image sequence 602. In some embodiments, for a given input, audio is presented only during a different subsequent playback of image sequence 602 (e.g., a second complete playback of image sequence 602) or during a number of predefined playbacks (e.g., the first complete playback and the second complete playback of image sequence 602).
In some embodiments, in response to detecting second portion 616-2 of first input 616 (e.g., during the first complete playback), image sequence 602 is displayed in sequence at a fixed rate (e.g., at the same rate at which the images were obtained, also referred to as the "1x" rate). For example, in some embodiments, during the first complete playback the audio is presented at the 1x rate and the corresponding image sequence 602 is displayed at the 1x rate, giving the playback a natural look and sound. In some embodiments, the 1x rate means that portable multifunction device 100 dwells on a respective image for substantially the same amount of time that elapsed between obtaining the respective image and obtaining the next image.
In some embodiments, the images in image sequence 602 are displayed sequentially at a fixed rate, independent of the intensity of the contact in first input 616. For example, portable multifunction device 100 dwells on image 602-1 (Fig. 6E) and image 602-2 (Fig. 6F) for the same length of time, despite the different input intensities shown in the intensity diagrams 618 of the corresponding figures. In some embodiments, during second portion 616-2 of first input 616, the rate at which image sequence 602 is displayed sequentially depends on the intensity of the contact of first input 616. For example, the rate increases as the intensity of the contact increases.
In some embodiments, as described below with reference to Figs. 6J-6M, after portable multifunction device 100 displays image sequence 602 in response to detecting second portion 616-2 of first input 616 (e.g., after the device completes the first complete playback of image sequence 602), device 100 loops back and displays the entire image sequence 602 in response to a third portion 616-3 of first input 616 (e.g., for as long as first input 616 and/or its intensity is maintained). In some embodiments, when image sequence 602 is looped or displayed again, a cross-fade animation is displayed from the end of image sequence 602 to the start of image sequence 602.
Figs. 6J-6M illustrate a situation in which, after detecting second portion 616-2 of first input 616, portable multifunction device 100 detects a third portion 616-3 of first input 616 (e.g., portable multifunction device 100 continues to detect sufficient contact and/or intensity in the finger gesture). In response to detecting third portion 616-3 of first input 616, portable multifunction device 100 displays, in sequence in user interface 600, the one or more images acquired by the camera before acquiring representative image 602-3 (e.g., image 602-1 of Fig. 6J and image 602-2 of Fig. 6K), representative image 602-3 (Fig. 6L), and the one or more images acquired by the camera after acquiring the representative image (e.g., image 602-4 of Fig. 6M). In the illustrated example, however, first input 616 is terminated during third portion 616-3, resulting in different functionality described in more detail below. Thus, in some embodiments, in response to detecting third portion 616-3 of first input 616, the sequence is displayed from the initial image to the final image of the entire image sequence 602, unless first input 616 is interrupted (e.g., ceases) before the display of the entire image sequence 602 is complete. In some embodiments, the looping continues as long as first input 616 is maintained, although different functionality and/or operations are optionally available on (or performed during) different loops. For example, as described above, portable multifunction device 100 provides audio during the first complete playback. Similarly, in some embodiments, in response to detecting third portion 616-3 of first input 616 and displaying the second complete playback, portable multifunction device 100 displays metadata 622 corresponding to image sequence 602 (e.g., showing the date, time, location, or any other information associated with image sequence 602).
As indicated above, Figs. 6J-6O illustrate an example in which first input 616 is terminated during third portion 616-3 (e.g., by lift-off, or by the intensity dropping below a predefined threshold IT0, as shown in intensity diagram 618 of Fig. 6N). Figs. 6N-6O illustrate operations performed, in accordance with some embodiments, in response to termination (e.g., cessation or suspension) of first input 616 during third portion 616-3. In some embodiments, analogous operations are performed when first input 616 is terminated during second portion 616-2 or first portion 616-1 of first input 616. In some embodiments, when first input 616 is terminated, portable multifunction device 100 determines whether the currently displayed image occurs before or after representative image 602-3. When the currently displayed image occurs after representative image 602-3 (e.g., was taken after it), as shown in Figs. 6N-6O, portable multifunction device 100 displays image sequence 602 in reverse chronological order, from the currently displayed image (e.g., image 602-4 of Fig. 6N) to representative image 602-3 (e.g., portable multifunction device 100 traverses backward to representative image 602-3). Conversely, when the currently displayed image occurs before representative image 602-3 (e.g., was taken before it), portable multifunction device 100 displays image sequence 602 in chronological order, from the currently displayed image to representative image 602-3 (e.g., portable multifunction device 100 advances the loop forward until representative image 602-3 is reached).
In some circumstances, a grouped image sequence is asymmetric, meaning that there are unequal numbers of images before and after its representative image. In some embodiments, portable multifunction device 100 determines whether there are fewer images between the currently displayed image and the representative image in the forward chronological direction or in the reverse chronological direction. Portable multifunction device 100 then displays (e.g., traverses) the image sequence in whichever direction has fewer images between the currently displayed image and the representative image.
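Put concretely, the device compares the two gap lengths and walks the shorter one back to the representative image. The Swift sketch below illustrates one way this decision could be made; it assumes playback loops (so either direction eventually reaches the representative image), and the type and function names are illustrative assumptions.

```swift
enum TraversalDirection {
    case forwardChronological
    case reverseChronological
}

/// A minimal sketch of picking the direction with fewer intervening images,
/// assuming looping playback so that either direction eventually reaches the
/// representative image. Indices are positions within the grouped sequence.
func shorterWayBack(currentIndex: Int,
                    representativeIndex: Int,
                    sequenceLength: Int) -> TraversalDirection {
    // Number of images traversed going forward (with wrap-around) vs. backward.
    let forwardGap = (representativeIndex - currentIndex + sequenceLength) % sequenceLength
    let backwardGap = (currentIndex - representativeIndex + sequenceLength) % sequenceLength
    return forwardGap <= backwardGap ? .forwardChronological : .reverseChronological
}
```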
Figs. 6P-6V illustrate embodiments in which a user controls the display of images in a grouped image sequence by controlling the intensity of a press-and-hold gesture 636. Fig. 6P is analogous to Fig. 6A and is provided as a starting point for describing the functionality in Figs. 6Q-6V. In some embodiments, the playback functionality for image sequence 602 is triggered when press-and-hold gesture 636 meets predefined criteria. For example, when a respective press-and-hold input remains below the light press threshold ITL, portable multifunction device 100 does not replace the display of representative image 602-3 in response to the press-and-hold gesture (e.g., portable multifunction device 100 instead performs different functionality). Conversely, when press-and-hold gesture 636 exceeds the light press threshold ITL, as shown in Fig. 6Q, portable multifunction device 100 maps the intensity of press-and-hold gesture 636 (shown in intensity diagram 618) to at least some of the images in image sequence 602. For example, because the playback functionality for image sequence 602 has been triggered in Fig. 6Q, portable multifunction device 100 displays representative image 602-3 when the intensity of press-and-hold gesture 636 is within intensity range 618-3 (Fig. 6Q and Fig. 6U). Similarly, portable multifunction device 100 displays image 602-1 when the intensity of press-and-hold gesture 636 is within intensity range 618-1; displays image 602-2 when the intensity is within intensity range 618-2 (Fig. 6V); displays image 602-4 when the intensity is within intensity range 618-4 (Fig. 6T and Fig. 6R); and displays image 602-5 when the intensity is within intensity range 618-5 (Fig. 6S). Thus, Figs. 6Q-6V illustrate the user's ability to scrub backward and forward through the images in a grouped image sequence (e.g., to directly control which image in the grouped image sequence is displayed) based on the intensity of the user input, producing a smooth back-and-forth animation of the replacement of images in the grouped image sequence.
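In effect, the span of usable intensity is divided into contiguous ranges, one per displayable image, and the current intensity selects an image. The following Swift sketch illustrates that mapping under stated assumptions (a linear partition of the intensity span between two thresholds); the actual ranges 618-1 through 618-5 in the figures need not be linear.

```swift
/// A minimal sketch, assuming the intensity span between a lower and an upper
/// threshold is partitioned linearly into one range per image in the sequence.
/// Returns the index of the image to display for the given contact intensity.
func imageIndex(forIntensity intensity: Double,
                lowerThreshold: Double,     // e.g., the light press threshold
                upperThreshold: Double,     // e.g., the deep press threshold
                imageCount: Int) -> Int {
    precondition(imageCount > 0 && upperThreshold > lowerThreshold)
    // Normalize intensity into [0, 1] over the usable span, clamping at the ends.
    let t = (intensity - lowerThreshold) / (upperThreshold - lowerThreshold)
    let clamped = min(max(t, 0.0), 1.0)
    // Partition [0, 1] into imageCount equal ranges and pick the matching index.
    let index = Int(clamped * Double(imageCount))
    return min(index, imageCount - 1)
}

// Example: with 5 images, an intensity midway through the span selects index 2,
// analogous to displaying representative image 602-3 for intensity range 618-3.
```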
Fig. 6W illustrates an embodiment in which a user controls the display of images obtained after the representative image in a grouped image sequence 656 by controlling an input intensity 654. In the example shown in Fig. 6W, intensity values between the light press threshold ITL and the deep press threshold ITD are mapped to respective images obtained after the representative image in grouped image sequence 656. The intensity diagram shown in Fig. 6W depicts input intensities 654 that are mapped, as indicated by their arrows, to particular images obtained after the representative image in grouped image sequence 656. So when the input exceeds the light press threshold ITL, the user can scrub forward and backward through the images in the grouped image sequence that were obtained after the representative image by controlling the intensity of the input. In some embodiments, when the input exceeds the deep press threshold ITD, grouped image sequence 656 is replaced (e.g., advanced) at a fixed rate (e.g., the device plays back grouped image sequence 656 at the fixed rate, looping back to the beginning after the final image in grouped image sequence 656 is displayed). Fig. 6W also illustrates audio 658 and metadata 660 associated with grouped image sequence 656 (e.g., provided together with grouped image sequence 656, as described above).
Fig. 6X illustrates embodiments that are substantially similar to those described with reference to Figs. 6A-6O, except that device 100's response to the initial portion of the user input differs from those embodiments. Specifically, in the embodiments illustrated in Fig. 6X, in response to detecting a first portion of a first input (e.g., a user input analogous to those described with reference to Figs. 6A-6O), device 100 begins playback either by transitioning directly to the initial image in the image sequence (e.g., as shown in diagram 656) or by briefly playing the image sequence forward (e.g., by playing a few images forward, as shown in diagram 650) and then cross-fading to the initial image (e.g., rather than playing the image sequence forward all the way to the final image).
In Fig. 6X, playback during user input 648 is represented by one or more curves (e.g., curve 662 and/or curve 664). According to some embodiments, solid portions of the curves representing playback during user input 648 represent images that are played back, while dashed portions represent images that are not played back.
So, for example, in diagram 650, device 100 initially displays representative image 652. In response to user input 648, device 100 plays forward by three images (e.g., or one image, or ten images) to image 660, and then replaces the display of image 660 with the display of initial image 654. Device 100 then plays the image sequence forward from initial image 654 in accordance with any of the embodiments described above with reference to Figs. 6A-6O (e.g., looping through the enhanced photo, with sound, metadata, etc., on subsequent loops). Thus, device 100 transitions from displaying representative image 652 to displaying initial image 654 (or any other respective previous image) by displaying one or more of the images acquired after representative image 652. In some embodiments, device 100 cross-fades and/or blurs representative image 652 and/or one or more of the images acquired after the representative image into initial image 654.
As another example, in diagram 656, device 100 initially displays representative image 652. In response to user input 648, device 100 replaces the display of representative image 652 with the display of initial image 654 (or any other respective previous image). Device 100 then plays the image sequence forward from initial image 654 in accordance with any of the embodiments described above with reference to Figs. 6A-6O (e.g., looping through the enhanced photo, with sound, metadata, etc., on subsequent loops). Thus, device 100 transitions directly from displaying representative image 652 to displaying initial image 654. In some embodiments, device 100 cross-fades and/or blurs representative image 652 into initial image 654.
In some embodiments, as shown in diagram 656, transitioning from displaying representative image 652 to displaying initial image 654 (e.g., the respective previous image) does not include displaying any of the one or more images acquired by the camera after acquiring representative image 652 (e.g., the device transitions directly back to initial image 654).
In some embodiments, device 100 determines which transition to apply (e.g., the transition shown in diagram 650 or the transition shown in diagram 656) based on a characteristic of user input 648 (e.g., a characteristic contact intensity of the first portion of first input 648). For example, when the first portion of first input 648 exceeds the deep press threshold ITD (as shown in diagram 668-2), device 100 transitions in accordance with diagram 656. When the first portion of first input 648 does not exceed the deep press threshold ITD (as shown in diagram 668-1), device 100 transitions in accordance with diagram 650.
In some embodiments, some of the images acquired during acquisition of the image sequence are discarded or fused when the image sequence is generated. For example, blurred images are discarded from (e.g., not included in) the image sequence, and/or one or more dark images are fused to improve the quality of the images in the image sequence. In some circumstances, discarding and/or fusing images results in an image sequence that is non-uniformly spaced in time. For example, if ten images per second are acquired by the camera, but three images are fused to form a respective single image in the image sequence, then the respective single image represents a larger passage of time than the other images in the image sequence. Accordingly, in some embodiments, playback of the image sequence is retimed in accordance with the removal and/or fusion of images in the image sequence (e.g., in the example above, when playing back the image sequence at 1x, device 100 dwells on the respective single image for 0.3 seconds, or otherwise three times as long as the dwell time on the other images).
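One way to express this retiming is to assign each retained frame a display duration equal to the span of capture time it now stands for. The Swift sketch below is a minimal illustration under that assumption; the frame rate and names are hypothetical.

```swift
import Foundation

/// A minimal retiming sketch: each retained frame is shown for the span of
/// capture time it represents, so a frame fused from several source frames
/// (or following dropped frames) dwells proportionally longer at 1x playback.
struct RetainedFrame {
    let sourceFrameCount: Int   // how many captured frames this frame stands for
}

func displayDurations(for frames: [RetainedFrame],
                      captureFramesPerSecond: Double) -> [TimeInterval] {
    let baseDuration = 1.0 / captureFramesPerSecond
    // e.g., at 10 fps, a frame fused from 3 source frames dwells for 0.3 s.
    return frames.map { Double($0.sourceFrameCount) * baseDuration }
}
```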
According to some embodiments, Figs. 6Y-6BB illustrate a user interface that initially displays a first image (e.g., an enhanced photo) in an image sequence. The user interface plays the image sequence forward or backward in accordance with the intensity of a contact of a user input, as follows: a range of intensities above a threshold maps to forward rates of movement through the image sequence, and a range of intensities below the threshold maps to backward rates of movement through the image sequence. In some embodiments, the user interface does not loop through the image sequence. So, when the initial image is displayed, a contact with an intensity above the threshold plays the images forward at a rate proportional to the contact intensity and stops when the final image is reached. When the user eases off the contact so that the contact intensity drops below the threshold, the device plays the images backward at a rate based on the contact intensity and stops when the initial image is reached.
Fig. 6Y illustrates a user interface 640. In some embodiments, user interface 640 is a lock-screen user interface. For example, a user may lock device 100 so that she can put device 100 in her pocket without inadvertently performing operations on device 100 (e.g., accidentally calling someone). In some embodiments, lock-screen user interface 640 is displayed when the user wakes device 100 (e.g., by pressing any button). In some embodiments, a swipe gesture on touch screen 112 initiates a process of unlocking device 100.
Portable multifunction device 100 displays, in user interface 640, a representative image 602-1 in a grouped image sequence 602. In some embodiments, image sequence 602 is an enhanced photo that the user selected for her lock screen (e.g., selected in a settings user interface). In the example shown in Figs. 6Y-6BB, the image sequence is an enhanced photo depicting a scene in which a cat 612 walks into the field of view and rolls its back on the ground. Meanwhile, a bird 614 lands on a branch. In some embodiments, the image sequence includes one or more images acquired after the representative image was acquired (e.g., representative image 602-1 is the initial image in the image sequence).
In some embodiments, user interface 640 also includes quick-access information 642, such as time and date information.
While displaying representative image 602-1 on touch screen 112, device 100 detects an input 644 (e.g., a press-and-hold gesture) for which a characteristic intensity of the contact on touch screen 112 exceeds an intensity threshold. In this example, the intensity threshold is the light press threshold ITL. As shown in diagram 618 (Fig. 6Y), input 644 includes a contact that exceeds the light press threshold ITL.
In response to detecting the increase in the characteristic intensity of the contact, the device advances through the one or more images acquired after representative image 602-1 was acquired, in chronological order, at a rate determined at least in part by the characteristic intensity of the contact of input 644. So, for example, the display of representative image 602-1 (Fig. 6Y) is replaced with the display of image 602-2 (Fig. 6Z) at a rate, indicated in rate diagram 646 (Fig. 6Y), that is based on the contact intensity shown in intensity diagram 618 (Fig. 6Y). Image 602-2 is an image in image sequence 602 that was acquired after representative image 602-1. The display of image 602-2 (Fig. 6Z) is then replaced with the display of image 602-3 (Fig. 6AA) at the faster rate, indicated in rate diagram 646 (Fig. 6Z), that is based on the contact intensity shown in intensity diagram 618 (Fig. 6Z). Image 602-3 is an image in image sequence 602 that was acquired after image 602-2.
In Fig. 6AA, the intensity of the contact of input 644 drops below ITL, which in this example is the threshold for playing backward or forward through image sequence 602. As a result, image 602-3 (Fig. 6AA) is replaced with the previous image 602-2 (Fig. 6BB) at a backward rate based on the current contact intensity of input 644.
In some embodiments, the rate indicated in rate diagrams 646 (Figs. 6Y-6AA) is proportional to the absolute value of the difference between the current contact intensity of input 644 and ITL, as shown in intensity diagrams 618 (Figs. 6Y-6AA). The direction of movement is based on whether the current contact intensity is above (e.g., forward movement) or below (e.g., backward movement) ITL (or any other appropriate threshold).
In some embodiments, the forward rate or backward rate is determined in real time or near real time, so that the user can speed up or slow down movement through the images (in the forward or backward direction) by changing the characteristic intensity of the contact. Thus, in some embodiments, the user can scrub forward and backward through image sequence 602 (e.g., between the initial image and the final image in the image sequence) by increasing and decreasing the contact intensity of user input 644.
According to some embodiments, Figs. 6CC-6DD are graphs illustrating how the rate of movement V relates to the current contact intensity I of input 644.
As shown in Fig. 6CC, in this example the threshold for forward/backward movement is the light press threshold ITL. When the current contact intensity of input 644 is equal to the light press threshold ITL, device 100 does not advance through the image sequence in either chronological or reverse-chronological order. Thus, device 100 maintains the currently displayed image from image sequence 602 (e.g., the rate of movement equals 0x, where 1x is the rate at which the images in image sequence 602 were acquired). When the current contact intensity of input 644 is just above the light press threshold ITL, device 100 advances through the image sequence in chronological order at a first rate (e.g., 0.2x). When the current contact intensity of input 644 is the same amount below the light press threshold ITL, device 100 advances through the image sequence in reverse chronological order at the first rate (e.g., -0.2x, where the minus sign denotes reverse chronological order, or backward playback).
In this example, device 100 has a maximum rate Vmax (e.g., plus or minus 2x) that is reached when the current contact intensity of input 644 reaches the deep press threshold ITD (or any other upper threshold) or the hint threshold ITH (or any other appropriate lower threshold). The rate of movement through the image sequence is constrained by a maximum reverse rate while the contact is detected on the touch-sensitive surface.
Fig. 6DD shows an exemplary response curve in which the rate of movement increases exponentially from 0x to Vmax between the light press threshold ITL and the deep press threshold ITD. Above the deep press threshold ITD, the rate of movement is constant.
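The relationship in Figs. 6CC-6DD can be written as a simple piecewise function of the contact intensity. The Swift sketch below is one possible formulation consistent with the description (0x at ITL, capped at plus or minus Vmax at the deep press and hint thresholds, rising along an exponential-shaped curve in between); the curve constant and names are assumptions.

```swift
import Foundation

/// A minimal sketch of the rate-versus-intensity mapping described for
/// Figs. 6CC-6DD: 0x at the light press threshold, capped at +/-Vmax at the
/// deep press / hint thresholds, rising along an exponential-shaped curve in
/// between. The curve constant k is an illustrative assumption.
func playbackRate(intensity: Double,
                  lightPress: Double,   // IT_L: rate is 0x here
                  deepPress: Double,    // IT_D: rate is +Vmax here
                  hint: Double,         // IT_H: rate is -Vmax here
                  maxRate: Double = 2.0,
                  k: Double = 3.0) -> Double {
    // Exponential ramp mapping t in [0, 1] to [0, 1].
    func ramp(_ t: Double) -> Double {
        let clamped = min(max(t, 0.0), 1.0)
        return (exp(k * clamped) - 1.0) / (exp(k) - 1.0)
    }
    if intensity >= lightPress {
        // Forward playback; constant +Vmax above the deep press threshold.
        return maxRate * ramp((intensity - lightPress) / (deepPress - lightPress))
    } else {
        // Backward playback; constant -Vmax below the hint threshold.
        return -maxRate * ramp((lightPress - intensity) / (lightPress - hint))
    }
}
```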
According to some embodiments, certain conditions optionally cause device 100 to deviate from a rate of movement based solely on the current contact intensity of input 644. For example, as device 100 approaches the final image while advancing forward through image sequence 602, device 100 slows the rate of movement relative to what the rate would be if it were based solely on the current contact intensity of input 644 (e.g., device 100 "brakes" slightly as it reaches the end of the image sequence). Similarly, in some embodiments, as device 100 approaches the initial image while moving backward through image sequence 602, device 100 slows the rate of movement relative to what the rate would be if it were based solely on the current contact intensity of input 644 (e.g., device 100 "brakes" slightly as it reaches the beginning of the image sequence while moving backward).
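This end-of-sequence braking can be modeled as a multiplicative slowdown that takes effect within a few frames of whichever end the playback is approaching. The Swift sketch below illustrates the idea under assumed values; the window size and the linear scaling are not from the disclosure.

```swift
/// A minimal "braking" sketch: the intensity-based rate is scaled down as the
/// currently displayed image gets within a few frames of whichever end of the
/// sequence the playback is moving toward. Window size and scaling are
/// illustrative assumptions.
func brakedRate(baseRate: Double,
                currentIndex: Int,
                imageCount: Int,
                brakingWindow: Int = 5) -> Double {
    // Distance (in images) to the end being approached.
    let distanceToEnd = baseRate >= 0
        ? (imageCount - 1 - currentIndex)
        : currentIndex
    guard distanceToEnd < brakingWindow else { return baseRate }
    // Scale the rate linearly toward zero as the end is reached.
    let factor = Double(distanceToEnd) / Double(brakingWindow)
    return baseRate * factor
}
```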
Figs. 6EE-6FF illustrate embodiments in which image sequence 602 is displayed and/or replayed in a user interface 680 for a messaging application (e.g., "Messages" from Apple Inc. of Cupertino, California). In some embodiments, image sequence 602 is displayed in a scrollable region 682 of a message conversation in the messaging application (e.g., the user can scroll up or down to view earlier or later messages in region 682). In some embodiments, representative image 602-3 is initially displayed in messaging application 680. In some embodiments, image sequence 602 is displayed (e.g., played back) in response to a swipe/drag gesture. In some embodiments, the display of the images in image sequence 602 is controlled by the position of the drag gesture (e.g., the user can scrub forward or backward through image sequence 602 by moving the drag gesture to the right or to the left, respectively). For example, in Figs. 6EE-6FF, a contact 686 moves from position 686-1 (Fig. 6EE) to position 686-2 (Fig. 6FF), which advances image sequence 602 from representative image 602-3 (Fig. 6EE) to image 602-4 (Fig. 6FF).
In some embodiments, a swipe gesture triggers playback of image sequence 602 upon termination (e.g., lift-off) of the swipe gesture. In some embodiments, image sequence 602 is not played back during the drag gesture, but is instead played back only upon termination (e.g., lift-off) of the drag gesture. In some embodiments, image sequence 602 is played back in response to a press-and-hold gesture (e.g., image sequence 602 in messaging application 680 is played back in accordance with any of the embodiments described with reference to Figs. 6A-6DD). In some embodiments, image sequence 602 in messaging application 680 is played back in accordance with any of the embodiments described with reference to Figs. 7A-7CC.
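A drag-based scrub of this kind can be pictured as mapping the horizontal displacement of the contact to an offset within the image sequence. The Swift sketch below is an illustrative assumption (the points-per-image value and names are hypothetical), not the disclosed implementation.

```swift
/// A minimal scrubbing sketch: horizontal drag displacement is mapped to an
/// offset from the image shown when the drag began. The points-per-image
/// value is an illustrative assumption.
func scrubbedImageIndex(dragStartX: Double,
                        currentX: Double,
                        startIndex: Int,
                        imageCount: Int,
                        pointsPerImage: Double = 20.0) -> Int {
    // Moving right advances the sequence; moving left moves back.
    let offset = Int(((currentX - dragStartX) / pointsPerImage).rounded())
    return min(max(startIndex + offset, 0), imageCount - 1)
}
```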
In some embodiments, image sequence 602 is displayed (e.g., played back) as the scrollable region of the messaging application is scrolled, and in some circumstances the images are interspersed among text messages 684 or other messages (e.g., in conversation bubbles) sent and received via the messaging application. In some circumstances, a user may have obtained (e.g., taken, captured) a respective image sequence on her own portable multifunction device 100, and also received (e.g., in the messaging application) a different image sequence from a different user. Thus, in some circumstances, the plurality of image sequences stored on portable multifunction device 100 includes at least one image sequence obtained using portable multifunction device 100 and at least one image sequence obtained using a camera on a distinct device that is entirely different from portable multifunction device 100.
Figs. 7A-7CC illustrate exemplary user interfaces for navigating through sequences of related images (sometimes referred to as enhanced photos), in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined, as shown on portable multifunction device 100), in some embodiments the device detects inputs on a touch-sensitive surface 451 that is separate from display 450, as illustrated in Fig. 4B.
Portable multifunction device 100 displays a user interface 700. User interface 700 optionally includes one or more toolbars. For example, as shown, user interface 700 includes an operations toolbar 704 that comprises a plurality of affordances 706 (e.g., a send affordance 706-1 that allows the user to send first image sequence 702 to other users using email, messaging, or other applications; an edit affordance 706-2 that brings up a user interface for editing first image sequence 702; a favorites affordance 706-3 by which the user can indicate that first image sequence 702 is one of her favorites; and a delete affordance 706-4 that allows the user to delete image sequence 702). As another example, user interface 700 includes a navigation toolbar 706 that comprises another plurality of affordances (e.g., an all-photos affordance 710-1 that navigates to the user's photos; and a "done" affordance 710-2 that navigates to a different user interface, such as a user interface for obtaining photos).
Figs. 7A-7CC illustrate an example in which portable multifunction device 100 stores a plurality of image sequences (e.g., a first image sequence 702 of Figs. 7A-7CC, a second image sequence 724, a third image sequence 726, and a fourth grouped image sequence 760). The first grouped image sequence 702 includes a first representative image 702-3 (Fig. 7A) taken by a camera, one or more images acquired by the camera after acquiring first representative image 702-3 (e.g., image 702-4 of Fig. 7C and image 702-5 of Fig. 7D), and one or more images acquired by the camera before acquiring first representative image 702-3 (e.g., image 702-2 of Fig. 7H and image 702-1 of Fig. 7I). Thus, the chronological order of first image sequence 702 (e.g., the order in which the images were taken by the camera) is: image 702-1, image 702-2, image 702-3, image 702-4, and image 702-5.
First image sequence 702 depicts a scene in which a cat 712 walks into the field of view, rolls its back on the ground, and then stands up and walks away. Meanwhile, a bird 714 lands on a branch. While in reality such a scene may take several seconds to unfold, in some embodiments first image sequence 702 is captured in a short time window. For example, in some embodiments, any of the image sequences described herein may depict the moment around the time (e.g., within half a second or one second) at which its respective representative image was obtained. For example, the user's interest may have been piqued when cat 712 started rolling in the grass, prompting the user to take first representative image 702-3. In some embodiments, first image sequence 702 includes images from just before and just after first representative image 702-3 was obtained, so that first image sequence 702 comprises an enhanced photo that can "come to life" for a moment when the user performs certain operations (as described herein) with respect to representative image 702-3.
The second grouped image sequence 724 includes a second representative image 724-3 (Fig. 7F) and at least one or more images acquired by the camera before acquiring second representative image 724-3 (e.g., image 724-1 of Fig. 7C and image 724-2 of Fig. 7D). Second image sequence 724 includes one or more images acquired by the camera after acquiring second representative image 724-3. Thus, the chronological order of second image sequence 724 (e.g., the order in which the images were taken by the camera) is: image 724-1, image 724-2, and image 724-3. Second image sequence 724 depicts a scene in which a seagull 728 flies in the distance (image 724-1 of Fig. 7C), flies toward the foreground (image 724-2 of Fig. 7D), and begins to fly away again (image 724-3 of Fig. 7F).
3rd grouped image sequence 726 includes third representative image 726-1 and by camera in collection third generation table
Property image 726-1 after collection at least one or more image (for example, image 724- of the image 726-3 and Fig. 7 I of Fig. 7 H
2).3rd image sequence 726 is included by one or more figures of camera collection before collection third representative image 726-1
Picture.Therefore, the time sequencing (for example, by the order of image shot by camera) of the 3rd image sequence 726 is:Image 726-1, image
726-2 and image 726-3.3rd image sequence 726 depicts following scene:Wherein whale 730 is primered the (figure of Fig. 7 K
As 726-1), or even the visual field went swimming (the image 726-2 of Fig. 7 I) in ship 732, and disappear from visual field, slip into
In ocean (the image 726-3 of Fig. 7 H).
4th grouped image sequence 760 includes forth generation table image 760-3 and by camera in collection forth generation table
Property image 760-1 before collection at least one or more image (for example, image 760- of the image 760-1 and Fig. 7 W of Fig. 7 V
2).Therefore, the time sequencing (for example, by the order of image shot by camera) of the 4th image sequence 760 is:Image 760-1, image
760-2 and image 760-3.4th image sequence 760 is described as follows scene:Wherein fireworks tube 762 launches the (image of Fig. 7 V
760-1), fly (the image 760-2 of Fig. 7 W) in the air and explode (the image 760-3 of Fig. 7 X).
In some embodiments, first image sequence 702 was acquired by the camera before second image sequence 724, and first image sequence 702 was acquired by the camera after third image sequence 726.
In some embodiments, user interface 700 is a user interface in an image management application (for example, "Photos" from Apple Inc. of Cupertino, California). To that end, in some embodiments, the camera that took first image sequence 702 (and/or second image sequence 724, third image sequence 726, and so on) is part of portable multifunction device 100 (for example, the camera comprises optical sensors 164 of Fig. 1A together with imaging module 143). In some embodiments, first image sequence 702 was taken by a camera that is not part of portable multifunction device 100 (for example, first image sequence 702 was taken with a camera on another device and then transferred to portable multifunction device 100). In some embodiments, first image sequence 702 was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figs. 5A-5K and method 900 and/or Figs. 22A-22D and method 2600. In some embodiments, first representative image 702-3 corresponds to the representative image acquired by the camera, as described herein with reference to Figs. 5A-5K and method 900 and/or Figs. 22A-22D and method 2600.
In some embodiments, some of the plurality of image sequences were acquired using portable multifunction device 100, and some were taken with a camera on a different device and then transferred to portable multifunction device 100. For example, in some circumstances, the user may obtain (for example, take, capture) image sequences, as described with respect to methods 900/2600, on multiple devices (for example, a tablet computer, a laptop, and/or a digital camera, all devices other than portable multifunction device 100) and synchronize or otherwise transfer those image sequences onto portable multifunction device 100.
In some embodiments, user interface 700 is a user interface in a messaging application (for example, "Messages" from Apple Inc. of Cupertino, California). In some embodiments, first image sequence 702 is a message displayed in a message conversation in a scrollable area of the messaging application, and first image sequence 702 is displayed as the scrollable area of the messaging application is scrolled, with images in some cases interspersed (for example, in conversation bubbles) among text messages or other messages sent and received via the messaging application. In some circumstances, the user may obtain (for example, take, capture) a respective image sequence on her own portable multifunction device 100 and also receive (for example, in the messaging application) a different image sequence from a different user. Thus, in some circumstances, the plurality of image sequences stored on portable multifunction device 100 includes at least one image sequence obtained using portable multifunction device 100 and at least one image sequence obtained using a camera on a different device, distinct from portable multifunction device 100.
In some embodiments, representative image 702-3 is displayed in user interface 700 (for example, displayed when the user scrolls through her images in the image management application or through her messages in the messaging application).
Fig. 7A illustrates user interface 700. Portable multifunction device 100 displays first representative image 702-3 in a movable first area 734 on touch screen 112 in user interface 700. It should be understood that the boundary of movable first area 734 is not always displayed on touch screen 112; it is shown to aid the description of the figures.
Fig. 7B illustrates portable multifunction device 100 detecting a drag gesture 736 (starting at position 736-1) on touch screen 112. In some embodiments, the operations illustrated in Figs. 7B-7F are performed in accordance with a determination that drag gesture 736 meets predefined criteria (for example, predefined next-photo navigation criteria, or next-photo display criteria). For example, in some embodiments, the operations illustrated in Figs. 7B-7F are performed (for example, triggered) when drag gesture 736 has a predefined path characteristic (for example, drag gesture 736 is characterized by lateral (or vertical) velocity; that is, it is more side-to-side (or up-and-down) than up-and-down (or side-to-side) in the orientation shown in Figs. 7A-7CC). In some embodiments, portable multifunction device 100 includes one or more sensors for detecting intensity of contacts with touch screen 112, and the operations illustrated in Figs. 7B-7F are performed (for example, triggered) when drag gesture 736 has a characteristic intensity that meets (for example, reaches) predefined intensity criteria (for example, exceeds a light press intensity threshold ITL, as described elsewhere in this document). In some embodiments, the operations illustrated in Figs. 7B-7F are performed (for example, triggered) when drag gesture 736 has a predefined path characteristic (for example, is characterized by lateral velocity) and meets predefined intensity criteria (for example, exceeds a predefined intensity threshold).
User interface 700 as shown in Fig. 7A displays first representative image 702-3 in an image presentation mode. In some embodiments, as shown in Fig. 7A, movable first area 734 is an area that displays images in the first image sequence and does not display images from image sequences other than the first image sequence.
In Fig. 7B, drag gesture 736 is to the left. Accordingly, portable multifunction device 100 moves first area 734 to the left, as shown in Figs. 7C-7D. In addition, in movable first area 734 of user interface 700, portable multifunction device 100 replaces display of first representative image 702-3 with a chronological display of at least some of the one or more images of first image sequence 702 acquired by the camera after acquiring first representative image 702-3 (namely, image 702-4 of Fig. 7C and image 702-5 of Fig. 7D). That is, portable multifunction device 100 displays an animation of first image sequence 702 in the first area.
In some embodiments, the chronological display in first area 734 of at least some of the one or more images of first image sequence 702 acquired by the camera after acquiring the first representative image occurs in accordance with movement of the contact in drag gesture 736. Thus, if the leftward movement of drag gesture 736 speeds up, the temporal progression through the images in first area 734 speeds up. If the leftward movement of drag gesture 736 slows down, the temporal progression through the images in first area 734 slows down. If the leftward movement of drag gesture 736 pauses, the temporal progression through the images in first area 734 pauses. And if the movement of drag gesture 736 reverses direction (for example, from a leftward drag to a rightward drag), the progression through the images of first image sequence 702 in first area 734 is reversed, and the images are shown in reverse chronological order in accordance with the movement of drag gesture 736 in the reversed direction. More generally, in some embodiments, for a respective image sequence, the displayed progression through the images in the respective image sequence occurs in accordance with movement of the contact in the drag gesture.
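By way of illustration only (this is not recited in the claims), the correspondence between contact movement and image progression can be sketched as a direct mapping from accumulated drag distance to a frame index, so that faster, slower, paused, or reversed contact movement yields correspondingly faster, slower, paused, or reversed progression. In the following Swift-style sketch, the type name, the pointsPerFrame tuning value, and the sign convention are assumptions made for the example:

    import CoreGraphics

    // Minimal sketch (not part of the claimed device): map horizontal drag movement
    // onto frames of a grouped image sequence, so that faster, slower, paused, or
    // reversed contact movement produces correspondingly faster, slower, paused,
    // or reversed progression through the images.
    struct SequenceScrubber {
        let frameCount: Int          // number of images in the grouped sequence
        let pointsPerFrame: CGFloat  // drag distance that advances one frame (assumed tuning value)
        private(set) var frameIndex: CGFloat = 0

        // deltaX is the incremental horizontal movement of the contact since the last update.
        mutating func update(deltaX: CGFloat) -> Int {
            // A leftward drag (negative deltaX) advances chronologically; reversing
            // the drag direction rewinds back through the same frames.
            frameIndex -= deltaX / pointsPerFrame
            frameIndex = min(max(frameIndex, 0), CGFloat(frameCount - 1))
            return Int(frameIndex.rounded())
        }
    }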
In some embodiments, the user can trigger the operations shown in Figs. 6A-6FF by changing one or more characteristics of drag gesture 736 in a predefined manner. For example, in some embodiments, when the user pauses drag gesture 736 and presses more deeply on touch screen 112, portable multifunction device 100 plays back first image sequence 702, as described with reference to Figs. 6A-6FF, even if only a portion of first area 734 is on the display. In some embodiments, portable multifunction device 100 is configured to detect such changes or modifications to one or more characteristics of drag gesture 736.
Figs. 7C-7D also illustrate that, in some embodiments, in accordance with leftward drag gesture 736, portable multifunction device 100 moves a second area 738 to the left. In some embodiments, moving second area 738 to the left includes moving at least part of second area 738 onto touch screen 112. In some embodiments, movable second area 738 is an area that displays images in second image sequence 724 and does not display images from image sequences other than second image sequence 724 (for example, first image sequence 702 and third image sequence 726 are not displayed in movable second area 738). In some embodiments, as shown in Fig. 7C, movable second area 738 is adjacent to movable first area 734 (for example, to the right of movable first area 734). In some embodiments, as second area 738 moves to the left, portable multifunction device 100 displays in second area 738, in chronological order, at least some of the one or more images of second image sequence 724 acquired by the camera before acquiring second representative image 724-3.
In some embodiments, the movement of first area 734 corresponds to the movement of drag gesture 736. For example, in some embodiments, the movement of first area 734 between Fig. 7B and Fig. 7C is proportional to the distance between positions 736-1 (Fig. 7B) and 736-2 (Fig. 7C). Similarly, in some embodiments, the movement of first area 734 between Fig. 7C and Fig. 7D is proportional to the distance between positions 736-2 (Fig. 7C) and 736-3 (Fig. 7D), thereby giving the user the impression of dragging movable first area 734. In some embodiments, as shown in Figs. 7C-7D, moving first area 734 to the left includes moving at least part of first area 734 off of touch screen 112 to the left.
In some embodiments, as shown in Figs. 7B-7D, first area 734 and second area 738 move across touch screen 112 at the same rate (for example, the distance moved by first area 734 and the distance moved by second area 738 both correspond to the distance moved by drag gesture 736). In some embodiments, as shown in Figs. 7L-7P, first area 734 and second area 738 move at different rates. For example, in Figs. 7L-7P, the movement of second area 738 in response to drag gesture 752 is less than the movement of first area 734 in response to drag gesture 752 (for example, the distance moved by first area 734 matches the distance moved by drag gesture 752, while the distance moved by second area 738 is a fraction of the distance moved by drag gesture 752, such as 50%).
In some embodiments, the chronological display in second area 738 of at least some of the one or more images of second image sequence 724 acquired by the camera before acquiring second representative image 724-3 occurs in accordance with movement of the contact in drag gesture 736 (for example, in a manner analogous to that described above for first image sequence 702). For example, during drag gesture 736, the images in first area 734 and the images in second area 738 advance simultaneously at the same rate, where the rate is based on the movement of drag gesture 736. In some embodiments, for example as described below with reference to Figs. 7L-7P, during drag gesture 752 the images in first area 734 and the images in second area 738 advance at different rates. For example, in Figs. 7L-7P, the rate at which the images in second area 738 advance in response to drag gesture 752 is less than the rate at which the images in first area 734 advance in response to drag gesture 752 (for example, 50% of the rate at which the images in first area 734 advance in response to drag gesture 752).
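Purely as an illustrative aside (not the claimed implementation), the two-rate behavior can be sketched as follows, where the 0.5 factor mirrors the "50%" example above and is an assumed tuning value; the type and property names are likewise assumptions:

    import CoreGraphics

    // Sketch of the two-rate behavior described above, under the assumption that the
    // incoming area moves (and its images advance) at half the rate of the outgoing
    // area, whose offset tracks the drag distance one-to-one.
    struct ParallaxTransition {
        private(set) var firstRegionOffset: CGFloat = 0    // area showing the current enhanced photo
        private(set) var secondRegionOffset: CGFloat = 0   // area showing the adjacent enhanced photo
        let secondRegionRate: CGFloat = 0.5

        mutating func apply(dragDeltaX: CGFloat) {
            firstRegionOffset += dragDeltaX                      // matches the drag distance
            secondRegionOffset += dragDeltaX * secondRegionRate  // lags at a fraction of the drag distance
        }
    }

In such a sketch, the frame shown in each area can then be derived from that area's own offset, which yields the different image-advance rates described above.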
In some embodiments, as an alternative to the example shown in Figs. 7B-7D, as second area 738 moves to the left, second area 738 displays only second representative image 724-3 of the second image sequence, without displaying the other images in second image sequence 724.
In some embodiments, user interface 700 includes a next icon 750-1 (for example, Fig. 7A) and a previous icon 750-2. In some embodiments, similarly to detecting a drag gesture in the first direction, detecting activation of next icon 750-1 also results in an animated display of images from first image sequence 702 in first area 734 and an animated display of images from second image sequence 724 in second area 738. In some embodiments, detecting activation of next icon 750-1 results in the display of first representative image 702-3 being replaced with display of second representative image 724-3, without an animated display of images from first image sequence 702 in first area 734 and without an animated display of images from second image sequence 724 in second area 738. In some embodiments, detecting activation of next icon 750-1 results in the display of first representative image 702-3 being replaced with display of second representative image 724-3, without displaying other images in the first sequence or the second sequence.
While the operations of Figs. 7B-7D are described with reference to leftward/rightward movement of drag gesture 736, analogous operations with reference to upward/downward movement of a drag gesture are contemplated and are intended to fall within the scope of the appended claims, unless expressly stated otherwise. For example, in some embodiments, as an alternative to the example shown in Figs. 7B-7D, instead of moving second area 738 to the left onto touch screen 112, when drag gesture 736 is a leftward or downward gesture, second area 738 is below first area 734 in z-order (front-to-back order), and second area 738 is revealed (for example, uncovered) as first area 734 moves off of touch screen 112 to the left (or toward the bottom).
As shown in Fig. 7F, in some embodiments, after second area 738 has moved to the left as described with reference to Figs. 7B-7D, portable multifunction device 100 displays second representative image 724-3 in user interface 700. In some embodiments, the display of second representative image 724-3 comes about according to the operations described below with reference to Figs. 7E-7F.
As shown in Fig. 7E, in some embodiments, portable multifunction device 100 detects termination (for example, lift-off) of drag gesture 736 while moving first area 734 and second area 738. In response, portable multifunction device 100 determines whether drag gesture 736 meets next-sequence navigation criteria. For example, in some embodiments, the next-sequence navigation criteria are met when the movement of first area 734 has taken first area 734 more than halfway off of touch screen 112 (for example, the midpoint of first area 734 has moved off of touch screen 112). In some embodiments, as shown in Fig. 7E, the next-sequence navigation criteria are met when the movement of first area 734 and second area 738 has carried boundary 740 between first area 734 and second area 738 past midpoint 742 of touch screen 112 (or past the midpoint of user interface 700, if user interface 700 is not centered on touch screen 112, or past any other suitable predefined point, such as one-third or one-quarter of the way across user interface 700). In some embodiments, the next-sequence navigation criteria are met when the speed of drag gesture 736 meets predefined speed criteria (for example, when the average speed or the instantaneous speed of drag gesture 736 exceeds a speed threshold). In some embodiments, the next-sequence navigation criteria are met when the speed of drag gesture 736 indicates a "flick" gesture.
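As an informal illustration of how such criteria could be combined at lift-off (the function name and the velocity threshold below are assumptions, not values recited in the claims), a next-sequence test might look like this:

    import CoreGraphics

    // Sketch of a next-sequence navigation test applied at lift-off: the criteria are
    // treated as met if the boundary between the two areas has crossed the midpoint of
    // the display, or if the gesture ended fast enough to count as a flick.
    func meetsNextSequenceNavigationCriteria(boundaryX: CGFloat,
                                             displayWidth: CGFloat,
                                             liftOffVelocityX: CGFloat,
                                             flickVelocityThreshold: CGFloat = 300) -> Bool {
        let crossedMidpoint = boundaryX <= displayWidth / 2        // first area is more than half off screen
        let isFlick = -liftOffVelocityX >= flickVelocityThreshold  // fast leftward movement toward the next photo
        return crossedMidpoint || isFlick
    }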
As shown in Fig. 7F, when the next-sequence navigation criteria are met, portable multifunction device 100 moves first area 734 completely off of touch screen 112 (for example, by moving first area 734 further to the left until first area 734 is completely off of touch screen 112) and moves second area 738 completely onto touch screen 112 (for example, by moving second area 738 further to the left until second area 738 is completely on touch screen 112). As a result, portable multifunction device 100 displays second representative image 724-3 (Fig. 7F). Thus, in some embodiments, termination of drag gesture 736 when the next-sequence navigation criteria are met gives the user the impression that second representative image 724-3 snaps into place in user interface 700.
Conversely, in some embodiments, when the next-sequence navigation criteria are not met, portable multifunction device 100 moves second area 738 completely off of touch screen 112 (for example, by moving second area 738 to the right until second area 738 is completely off of touch screen 112) and moves first area 734 completely onto touch screen 112 (for example, by moving first area 734 back to the right until first area 734 is completely on touch screen 112). As a result, portable multifunction device 100 again displays first representative image 702-3 (for example, returning to the view shown in Fig. 7A). Thus, in some embodiments, termination of drag gesture 736 when the next-sequence navigation criteria are not met gives the user the impression that first representative image 702-3 snaps back into place in user interface 700. In some embodiments, as portable multifunction device 100 moves first area 734 completely onto touch screen 112 and moves second area 738 completely off of touch screen 112, first image sequence 702 and second image sequence 724 are displayed backward (for example, in reverse chronological order).
Figs. 7G-7K illustrate features similar to those of Figs. 7B-7F, except that, whereas Figs. 7B-7F illustrate next-sequence navigation in accordance with some embodiments, Figs. 7G-7K illustrate previous-sequence navigation in accordance with some embodiments.
Fig. 7G illustrates portable multifunction device 100 detecting a drag gesture 744 (starting at position 744-1) on touch screen 112. In some embodiments, the operations illustrated in Figs. 7G-7K are performed in accordance with a determination that drag gesture 744 meets predefined criteria (for example, predefined previous-photo navigation criteria, or previous-photo display criteria). In some embodiments, the predefined criteria for navigating toward the previous photo (for example, the previous grouped image sequence) are similar to the predefined criteria, described with reference to Figs. 7B-7F, for navigating toward the next photo (for example, the next grouped image sequence), except that the two respective drag gestures are generally (or at least mostly) in opposite directions.
Fig. 7G is similar to Fig. 7B, except that drag gesture 744, detected while portable multifunction device 100 displays first representative image 702-3, is in the opposite direction from drag gesture 736 (Fig. 7B). That is, in Fig. 7G, drag gesture 744 is to the right. Accordingly, portable multifunction device 100 moves first area 734 to the right, as shown in Figs. 7H-7I. In addition, in movable first area 734 of user interface 700, portable multifunction device 100 replaces display of first representative image 702-3 with a display, in reverse chronological order, of at least some of the one or more images of first image sequence 702 acquired by the camera before acquiring first representative image 702-3 (namely, image 702-2 of Fig. 7H and image 702-1 of Fig. 7I). In some embodiments, as described above with reference to Figs. 7B-7F, for a respective image sequence, the displayed progression through the images in the respective image sequence in the respective area occurs in accordance with movement of the contact in the drag gesture (for example, the movement from position 744-1 of Fig. 7G to position 744-2 of Fig. 7H to position 744-3 of Fig. 7I).
Figs. 7H-7I also illustrate that, in some embodiments, in accordance with rightward drag gesture 744, portable multifunction device 100 moves a third area 746 to the right. In some embodiments, moving third area 746 to the right includes moving at least part of third area 746 rightward onto touch screen 112. In some embodiments, movable third area 746 is an area that displays images in third image sequence 726 and does not display images from image sequences other than third image sequence 726 (for example, first image sequence 702 and second image sequence 724 are not displayed in movable third area 746). In some embodiments, as shown in Fig. 7H, movable third area 746 is adjacent to movable first area 734 (for example, to the left of movable first area 734, opposite movable second area 738). In some embodiments, as third area 746 moves to the right, portable multifunction device 100 displays in third area 746, in reverse chronological order, at least some of the one or more images of third image sequence 726 acquired by the camera after acquiring third representative image 726-1.
In some embodiments, the display in reverse chronological order in third area 746 of at least some of the one or more images of third image sequence 726 acquired by the camera after acquiring third representative image 726-1 occurs in accordance with movement of the contact in drag gesture 744 (for example, in a manner analogous to that described above for first image sequence 702). For example, during drag gesture 744, the images in first area 734 and the images in third area 746 rewind simultaneously at the same rate, where the rate is based on the movement of drag gesture 744.
In some embodiments, similarly to detecting drag gesture 744 in the second direction, detecting activation of previous icon 750-2 (for example, Fig. 7A) also results in an animated display of images from first sequence 702 in first area 734 and an animated display of images from third sequence 726 in third area 746. In some embodiments, detecting activation of previous icon 750-2 results in the display of first representative image 702-3 being replaced with display of third representative image 726-1, without an animated display of images from first sequence 702 in first area 734 and without an animated display of images from third image sequence 726 in third area 746. In some embodiments, detecting activation of previous icon 750-2 results in the display of first representative image 702-3 being replaced with display of third representative image 726-1, without displaying other images in first sequence 702 or third sequence 726.
In some embodiments, as an alternative to the example shown in Figs. 7G-7I, when third area 746 moves to the right in response to drag gesture 744, third area 746 displays only third representative image 726-1 of third image sequence 726, without displaying the other images in third image sequence 726.
While the operations of Figs. 7G-7I are described with reference to leftward/rightward movement of drag gesture 744, analogous operations with reference to upward/downward movement of a drag gesture are contemplated and are intended to fall within the scope of the appended claims, unless expressly stated otherwise. For example, in some embodiments, as an alternative to the example shown in Figs. 7G-7I, instead of moving third area 746 onto touch screen 112 from the left, when drag gesture 744 is a rightward or upward gesture, third area 746 is below first area 734 in z-order (front-to-back order), and third area 746 is revealed (for example, uncovered) as first area 734 moves off of touch screen 112 to the right (or toward the top).
As shown in Fig. 7K, in some embodiments, after third area 746 has moved to the right as described with reference to Figs. 7G-7I, portable multifunction device 100 displays third representative image 726-1 in user interface 700. In some embodiments, the display of third representative image 726-1 comes about according to the operations described below with reference to Figs. 7J-7K.
As shown in Fig. 7J, in some embodiments, portable multifunction device 100 detects termination (for example, lift-off) of drag gesture 744 while moving first area 734 and third area 746. In response, portable multifunction device 100 determines whether drag gesture 744 meets previous-sequence navigation criteria. For example, in some embodiments, the previous-sequence navigation criteria are met when the movement of first area 734 has taken first area 734 more than halfway off of touch screen 112 (for example, the midpoint of first area 734 has moved off of touch screen 112). In some embodiments, as shown in Fig. 7J, the previous-sequence navigation criteria are met when the movement of first area 734 and third area 746 has carried boundary 748 between first area 734 and third area 746 past midpoint 742 of touch screen 112 (or past the midpoint of user interface 700, if user interface 700 is not centered on touch screen 112, or past any other suitable predefined point).
As shown in Fig. 7K, when the previous-sequence navigation criteria are met, portable multifunction device 100 moves first area 734 completely off of touch screen 112 (for example, by moving first area 734 further to the right until first area 734 is completely off of touch screen 112) and moves third area 746 completely onto touch screen 112 (for example, by moving third area 746 further to the right until third area 746 is completely on touch screen 112). As a result, portable multifunction device 100 displays third representative image 726-1 (Fig. 7K). Thus, in some embodiments, termination of drag gesture 744 when the previous-sequence navigation criteria are met gives the user the impression that third representative image 726-1 snaps into place in user interface 700.
Conversely, in some embodiments, when the previous-sequence navigation criteria are not met, portable multifunction device 100 moves third area 746 completely off of touch screen 112 (for example, by moving third area 746 to the left until third area 746 is completely off of touch screen 112) and moves first area 734 completely onto touch screen 112 (for example, by moving first area 734 back to the left until first area 734 is completely on touch screen 112). As a result, portable multifunction device 100 again displays first representative image 702-3 (for example, returning to the view shown in Fig. 7A). Thus, in some embodiments, termination of drag gesture 744 when the previous-sequence navigation criteria are not met gives the user the impression that first representative image 702-3 snaps back into place in user interface 700. In some embodiments, as portable multifunction device 100 moves first area 734 completely onto touch screen 112 and moves third area 746 completely off of touch screen 112, first image sequence 702 and third image sequence 726 are displayed forward (for example, in chronological order).
Figs. 7L-7P illustrate embodiments in which, in response to drag gesture 752, first area 734 and second area 738 move across touch screen 112 at different rates. Fig. 7L is similar to Fig. 7A and is provided as a starting point for the functionality shown in Figs. 7M-7P. As shown in Figs. 7M-7O, drag gesture 752 moves from position 752-1 (Fig. 7M) to position 752-2 (Fig. 7N) and then to position 752-3 (Fig. 7O). In some embodiments, the movement of first area 734 corresponds to the movement of drag gesture 752. For example, when drag gesture 752 moves one centimeter (cm), first area 734 moves 1 cm. In some embodiments, second area 738 moves based on the movement of drag gesture 752, but the distance moved by second area 738 is less than the distance moved by drag gesture 752. For example, when drag gesture 752 moves 1 cm, second area 738 moves 0.5 cm.
In this example, first area 734 is above second area 738 (for example, in z-order), so that second area 738 is gradually revealed as first area 734 moves off of touch screen 112 in response to drag gesture 752. At the start of drag gesture 752, second area 738 is partially, but not completely, on touch screen 112 (for example, halfway or three-quarters of the way on). As the user slides first area 734 off of touch screen 112, second area 738 slides the remaining distance onto touch screen 112, so that first area 734 sliding completely off of touch screen 112 coincides with second area 738 sliding completely onto touch screen 112. Moving first area 734 off of touch screen 112 at a rate different from the rate at which second area 738 moves onto touch screen 112 provides the user with an intuitive visual cue regarding the direction in which the user is navigating within the hierarchy (for example, the z-order) of enhanced photos.
In some embodiments, during drag gesture 752, the images in first area 734 and the images in second area 738 advance at different rates. In some embodiments, the respective rates at which the images in first area 734 and the images in second area 738 advance are both based on the movement of drag gesture 752. In some embodiments, the images in first area 734 advance at a higher rate than the images in second area 738. In Figs. 7M-7P, the images in first area 734 advance, in response to drag gesture 752, at twice the rate of the images in second area 738. For example, as shown in Figs. 7N-7O, over the same period of time first area 734 advances through two images of first image sequence 702 (702-4 of Fig. 7N and 702-5 of Fig. 7O), while second area 738 maintains display of a single image of the second image sequence (724-2 of Figs. 7N and 7O). Fig. 7P illustrates that second area 738 has advanced to display representative image 724-3.
Figs. 7Q-7CC illustrate embodiments in which device 100 slides an enhanced photo onto the display. As the enhanced photo slides onto the display, device 100 plays, in chronological order, the image sequence comprising the enhanced photo (for example, from the initial image to the representative image). In some embodiments, the new enhanced photo sliding on replaces the currently displayed enhanced photo, as described above. For example, the enhanced photo sliding onto the display pushes the currently displayed enhanced photo off of the display. As another example, the enhanced photo sliding onto the display covers the currently displayed enhanced photo (for example, in the z-direction). In some embodiments, the currently displayed enhanced photo is not played back while it is being replaced (for example, the device maintains display of the representative image of the currently displayed enhanced photo). In some embodiments, the enhanced photo is played forward regardless of whether it is the next enhanced photo or the previous enhanced photo (for example, in the camera roll).
To that end, Fig. 7Q illustrates device 100 displaying first representative image 702-3 of first image sequence 702 in movable first area 734 on touch screen 112.
As shown in Fig. 7Q, the device detects a gesture 740 (for example, a swipe gesture) that includes leftward movement of a contact on touch screen 112. As shown in Fig. 7R, in response to gesture 740, device 100 moves first area 734 to the left and off of the display, and moves movable second area 738 to the left and onto the display (for example, second area 738 is a movable area adjacent to first area 734). The arrow in Fig. 7R indicates the inertia from gesture 740 that continues moving second area 738 onto the display. In this example, because gesture 740 is to the left, the image sequence sliding onto the display (for example, second image sequence 724) is the next image sequence (for example, in the camera roll).
As shown in Figs. 7R-7T, in accordance with a determination that sequence-display criteria are met, as second area 738 moves to the left, device 100 plays forward at least some of the one or more images of second image sequence 724 acquired by the camera before acquiring second representative image 724-3. For example, device 100 starts by displaying initial image 724-1 of second image sequence 724 (Fig. 7R). Device 100 plays (for example, in chronological order) from initial image 724-1 to representative image 724-3 (Figs. 7S-7T). In some embodiments, the playback of second image sequence 724 is timed so that representative image 724-3 appears just as moving area 738 finishes moving onto touch screen 112 (Fig. 7T).
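One informal way to express this timing (the function and parameter names below are illustrative assumptions, not part of the claimed device) is to derive the frame to display from the slide-in progress of the incoming area, so that the representative image is reached exactly when the area finishes moving onto the display:

    // Sketch of pacing the playback to the slide-in animation: given how far the
    // incoming area has travelled onto the display, pick the frame so that the
    // representative image is reached exactly when the area finishes moving on.
    func frameForSlideProgress(progress: Double,            // 0.0 = just entering, 1.0 = fully on screen
                               framesUpToRepresentative: Int // index of the representative image, counting from the initial image
    ) -> Int {
        let clamped = min(max(progress, 0), 1)
        return Int((clamped * Double(framesUpToRepresentative)).rounded())
    }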
In some embodiments, the sequence-display criteria include navigation criteria (for example, criteria indicating that device 100 should transition to the next photo or the previous photo even without further user input). For example, device 100 plays through second image sequence 724 only if the user has flicked fast enough and/or dragged far enough to transition to the next image sequence (for example, second image sequence 724). The navigation criteria are described in more detail above with reference to Fig. 7E.
In some embodiments, while first area 734 slides off of the display and at least a portion of second image sequence 724 is being played back, device 100 maintains display of first representative image 702-3 (for example, statically, without replacing display of first representative image 702-3); that is, first image sequence 702 is not played back while second image sequence 724 is played back. Thus, representative image 702-3 is displayed in first area 734 in each of Figs. 7Q-7S.
In some embodiments, a rightward flick/drag gesture produces analogous functionality, except that the previous enhanced photo (for example, in the camera roll), rather than the next enhanced photo (for example, in the camera roll), slides onto the display. For example, in Fig. 7U, device 100 detects a swipe gesture 742 similar to swipe gesture 740, except that swipe gesture 742 is to the right. In Figs. 7V-7X, fourth image sequence 760 (which, in this example, is the previous enhanced photo in the camera roll) plays forward from initial image 760-1 to representative image 760-3. That is, in some embodiments, device 100 plays the enhanced photo forward regardless of whether the flick/drag gesture is a previous-photo navigation gesture or a next-photo navigation gesture (for example, rather than playing the previous photo in the camera roll in reverse, as described above with reference to Figs. 7G-7K). Figs. 7U-7X are otherwise similar to Figs. 7Q-7T.
As shown in Figs. 7Y-7CC, in some embodiments, the sequence-display criteria include a criterion that is met when device 100 detects lift-off of the gesture (for example, device 100 begins playing the new enhanced photo only once the user lifts her finger off of touch screen 112). For example, in Fig. 7Y, device 100 detects drag gesture 764 beginning at position 764-1. In Fig. 7Z, the user has moved drag gesture 764 to position 764-2, and device 100 has correspondingly moved initial image 760-1 of fourth image sequence 760 partially onto the display. In Fig. 7AA, the user has moved drag gesture 764 further, to position 764-3, and the device has correspondingly moved initial image 760-1 of fourth image sequence 760 further onto the display. However, device 100 does not begin playback of fourth image sequence 760 until the user lifts off drag gesture 764 (Fig. 7BB). This criterion avoids over-stimulating the user as the user drags an enhanced photo onto and off of the display. As shown in Fig. 7CC, in some embodiments, the playback and/or movement of the movable area after lift-off of the gesture is timed so that the representative image of the new enhanced photo (for example, fourth representative image 760-3) appears during playback just as the new enhanced photo finishes sliding onto the display.
Figs. 8A-8L illustrate exemplary user interfaces that, in accordance with some embodiments, perform operations on related image sequences that are distinct from the operations performed on individual images. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figs. 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined, as on portable multifunction device 100), in some embodiments the device detects inputs on a touch-sensitive surface 451 that is separate from display 450, as shown in Fig. 4B.
Figs. 8A-8L illustrate an example in which portable multifunction device 100 stores a plurality of image sequences, one of which is grouped image sequence 802 (displayed in user interface 800). Some features of user interface 800 are similar to user interface 600 (Figs. 6A-6W) and user interface 700 (Figs. 7A-7CC) and, for brevity, are not repeated here. Image sequence 802 includes a representative image 802-3 taken by the camera (Fig. 8A), one or more images acquired by the camera after acquiring representative image 802-3 (for example, image 802-4 of Fig. 8C and image 802-5 of Fig. 8D), and one or more images acquired by the camera before acquiring representative image 802-3 (for example, image 802-1 of Fig. 8E and image 802-2 of Fig. 8F). Thus, the chronological order of image sequence 802 (that is, the order in which the images were taken by the camera) is: image 802-1, image 802-2, image 802-3, image 802-4, and image 802-5.
Image sequence 802 depicts a scene in which a cat 812 walks into the field of view, rolls onto its back on the ground, and then stands up and walks away. Meanwhile, a bird 814 alights on a branch. In reality, such a scene may take several seconds to unfold, but in some embodiments image sequence 802 is captured in a short time window. For example, in some embodiments, any of the image sequences described herein may depict a moment around the time when its respective representative image was obtained (for example, within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds of that time). For example, the user's interest may have been piqued when cat 812 started rolling in the grass, prompting the user to take representative image 802-3. In some embodiments, image sequence 802 includes images from just before and just after representative image 802-3 was obtained, so that image sequence 802 comprises an enhanced photo that can "come to life" when the user performs certain operations (as described herein) with respect to representative image 802-3.
In the example shown in Figs. 8A-8L, portable multifunction device 100 also stores a plurality of images that are distinct from the images in the plurality of grouped image sequences. For example, portable multifunction device 100 stores image 824 (Fig. 8I), which is not part of any image sequence in the plurality of image sequences (for example, image 824 is a still image).
In some embodiments, user interface 800 is a user interface in an image management application (for example, "Photos" from Apple Inc. of Cupertino, California). To that end, in some embodiments, the camera that took image sequence 802 is part of portable multifunction device 100 (for example, the camera comprises optical sensors 164 of Fig. 1A together with imaging module 143). In some embodiments, image sequence 802 was taken by a camera that is not part of portable multifunction device 100 (for example, image sequence 802 was taken with a camera on another device and then transferred to portable multifunction device 100). In some embodiments, image sequence 802 was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figs. 5A-5K and method 900 and/or Figs. 22A-22D and method 2600. In some embodiments, representative image 802-3 corresponds to the representative image acquired by the camera, as described herein with reference to Figs. 5A-5K and method 900 and/or Figs. 22A-22D and method 2600.
In some embodiments, some of the still images and/or the plurality of image sequences were acquired using portable multifunction device 100, and some were taken with a camera on a different device and then transferred to portable multifunction device 100. For example, in some circumstances, the user may obtain (for example, take, capture) image sequences, as described with respect to methods 900/2600, on multiple devices (for example, a tablet computer, a laptop, and/or a digital camera, all devices other than portable multifunction device 100) and synchronize or otherwise transfer those image sequences onto portable multifunction device 100, which stores additional still images.
In some embodiments, user interface 800 is a user interface in a messaging application (for example, "Messages" from Apple Inc. of Cupertino, California). In some embodiments, image sequence 802 and/or still image 824 are messages displayed in a message conversation in a scrollable area of the messaging application, and image sequence 802 is displayed as the scrollable area of the messaging application is scrolled. In some circumstances, the user may obtain (for example, take, capture) a respective image sequence on her own portable multifunction device 100 and also receive (for example, in the messaging application) a different image sequence or a different still image from a different user. Thus, in some embodiments, the plurality of image sequences stored on portable multifunction device 100 includes at least one image sequence obtained using portable multifunction device 100 and at least one image sequence or still image obtained using a camera on a different device, distinct from portable multifunction device 100.
As shown in Figs. 8A-8L, portable multifunction device 100 detects two similar inputs: a first input 816 (Figs. 8B-8F) and a second input 836 (Figs. 8K-8L). First input 816 and second input 836 are similar in that they share a common set of characteristics (for example, they meet a common set of predefined criteria), such as intensity characteristics (as shown in intensity diagram 818) and path characteristics (for example, first input 816 and second input 836 are both press-and-hold gestures). First input 816 and second input 836 are the same except that first input 816 is detected over an image that is part of an image sequence (for example, representative image 802-3), whereas second input 836 is detected over an image that is not part of an image sequence (for example, still image 824).
As a result, portable multifunction device 100 performs a first operation when first input 816 is detected while representative image 802-3 is displayed, and performs a different, second operation when second input 836 is detected while still image 824 is displayed. In the example shown in Figs. 8B-8F, the first operation includes displaying at least a portion of image sequence 802 in the manner described with reference to Figs. 6A-6FF and methods 1000/10000/10050. That is: during a first part 816-1 of first input 816, portable multifunction device 100 plays back the images obtained by the camera after obtaining image 802-3 (for example, displaying image 802-4 of Fig. 8C and then image 802-5 of Fig. 8D); during a second part 816-2 of first input 816, portable multifunction device 100 plays back the images obtained by the camera before obtaining image 802-3 (for example, displaying image 802-1 of Fig. 8E and then image 802-2 of Fig. 8F). In the example shown in Figs. 8K-8L, the second operation includes displaying an animation of different portions of still image 824. For example, as shown in Fig. 8L, the second operation includes an animation that zooms in on a portion of still image 824 (for example, a portion under or near second input 836). In addition to, or instead of, zooming, in some embodiments the second operation includes displaying information about still image 824 (for example, metadata 822).
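Informally, and only as an illustration (the type and function names below are assumptions, not the claimed implementation), the behavior just described amounts to dispatching the same press-and-hold input to different operations based on whether the displayed image belongs to a grouped image sequence:

    // Sketch of the input dispatch described above: the same press-and-hold input
    // performs a different operation depending on whether the displayed image is part
    // of a grouped image sequence (an enhanced photo) or a stand-alone still image.
    enum DisplayedMedia {
        case enhancedPhoto(imageIDs: [String], representativeIndex: Int)
        case stillImage(imageID: String)
    }

    func playSequence(_ imageIDs: [String], around representativeIndex: Int) {
        print("First operation: play \(imageIDs.count) images around index \(representativeIndex)")
    }

    func zoomAndShowMetadata(for imageID: String) {
        print("Second operation: zoom into still image \(imageID) and show its metadata")
    }

    func handlePressAndHold(on media: DisplayedMedia) {
        switch media {
        case .enhancedPhoto(let imageIDs, let representativeIndex):
            playSequence(imageIDs, around: representativeIndex)
        case .stillImage(let imageID):
            zoomAndShowMetadata(for: imageID)
        }
    }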
Figs. 8G-8I illustrate a navigation gesture 844 (for example, a drag gesture). The navigation gesture is a leftward gesture that starts at position 844-1, moves to position 844-2, and then moves to position 844-3. Accordingly, in some embodiments, portable multifunction device 100 transitions from displaying image sequence 802 (for example, by displaying the sequence as described with reference to Figs. 7A-7CC) to displaying still image 824 (for example, image 824 slides across touch screen 112 without animating through an image sequence, because it is a still picture rather than an enhanced photo). In some embodiments, when an input similar to navigation gesture 844 is detected over a still image, portable multifunction device 100 transitions to a different image without displaying images associated with the still image (for example, because there are none).
Figs. 9A-9G illustrate a flowchart of a method 900 of capturing a grouped set of related images, in accordance with some embodiments. Method 900 is performed at an electronic device (for example, device 300 of Fig. 3, or portable multifunction device 100 of Fig. 1A) with a display and a camera. In accordance with some embodiments, the device includes one or more sensors for detecting intensity of contacts with a touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on, or integrated with, the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 900 are optionally combined and/or the order of some operations is optionally changed.
While in a first media acquisition mode for the camera, the device displays (902) a live preview on the display (for example, as shown in Fig. 5A, the device displays images in real time or near-real time as they are obtained from the camera). For example, the first media acquisition mode is a mode labeled as an enhanced photo mode, a moments mode, or the like.
In some embodiments, the first media acquisition mode is configured to be enabled or disabled (904) by a user of the device (for example, via a settings interface for the camera). In some embodiments, the device includes at least three media acquisition modes: (1) the first media acquisition mode (which might be considered an "enhanced still image" acquisition mode), which, in response to detecting activation of the shutter button, groups a sequence of images that includes images acquired before and after the activation of the shutter button and stores them as a group of images; (2) a second media acquisition mode (for example, a conventional still image acquisition mode), which stores a single image in response to detecting activation of the shutter button, similar to the still image mode of a conventional digital camera; and (3) a third media acquisition mode (for example, a video capture mode), which stores video acquired after detecting activation of the shutter button and keeps recording video until the shutter button is activated again. In some embodiments, the user selects which media acquisition mode is enabled via a settings interface for the camera, a mode selection button, a mode selection dial, or the like.
In some embodiments, the live preview is displayed (906) as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode (for example, affordance 506 of Figs. 5A-5H). While the first media acquisition mode is enabled, the affordance is animated (for example, to indicate that image and/or audio data are being captured while the media capture user interface is displayed), and while the first media acquisition mode is disabled, the affordance is not animated. In some embodiments, in response to detecting selection of the affordance (for example, a tap gesture on the affordance) while the first media acquisition mode is disabled, the device enables the first media acquisition mode, starts capturing media (for example, images and/or audio), and starts animating the affordance. In some embodiments, capturing media includes recording images and/or audio. In some embodiments, capturing media includes storing images and/or audio (for example, in persistent memory).
While displaying the live preview, the device detects (908) activation of a shutter button at a first time (for example, the device detects a press of a physical button at the first time, or detects, at the first time, a gesture on a virtual shutter button on a touch-sensitive display, such as a tap gesture on a shutter release icon (as illustrated in Fig. 5F) or a tap gesture on the live preview, where the live preview acts as a virtual shutter button). In some embodiments, the detected activation is a single activation of the shutter button (for example, analogous to the single activation used in a conventional digital camera to capture a single image in the conventional digital camera's still image mode).
In response to detecting activation of the shutter button at the first time, the device groups (910) a plurality of images acquired by the camera in temporal proximity to the activation of the shutter button at the first time into a first image sequence (for example, as shown in Figs. 5I-5K). The first image sequence includes: a plurality of images acquired by the camera before detecting activation of the shutter button at the first time; a representative image, which represents the first image sequence and was acquired by the camera after one or more of the other images in the first image sequence; and a plurality of images acquired by the camera after acquiring the representative image.
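Read informally (and only as an illustrative sketch under assumed names, not the claimed implementation), the grouping operation collects three sets of frames around the shutter activation and stores them together as one unit:

    import Foundation

    // Sketch of grouping frames around a shutter activation: frames buffered before
    // the activation, the representative frame, and frames captured afterwards are
    // stored together as a single grouped image sequence.
    struct GroupedImageSequence {
        let imagesBeforeActivation: [Data]
        let representativeImage: Data
        let imagesAfterActivation: [Data]

        // Chronological order of the sequence, as taken by the camera.
        var chronological: [Data] {
            imagesBeforeActivation + [representativeImage] + imagesAfterActivation
        }
    }

    func groupSequence(preShutterBuffer: [Data],
                       representative: Data,
                       postShutterFrames: [Data]) -> GroupedImageSequence {
        GroupedImageSequence(imagesBeforeActivation: preShutterBuffer,
                             representativeImage: representative,
                             imagesAfterActivation: postShutterFrames)
    }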
In some embodiments, the representative image is acquired by the camera at the first time and is analogous to the single image captured in the still-image mode of a conventional digital camera when its shutter button is activated. In some embodiments, the representative image acquired by the camera corresponds to an image acquired at the first time. In some embodiments, the representative image acquired by the camera corresponds to an image acquired shortly after the activation of the shutter button at the first time was detected, to account for shutter lag (the time delay between detecting the activation of the shutter button and capturing/storing the representative image). In some embodiments, the representative image acquired by the camera is used to represent the image sequence, for example in an image presentation mode.
In some embodiments, the first image sequence includes a predefined number of images acquired after acquiring the representative image, for example 5, 10, 15, 20, 25, or 30 images. In some embodiments, the images acquired after acquiring the representative image are images within a predefined time after acquiring the representative image (for example, within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds after acquiring the representative image). In some embodiments, the first image sequence includes a predefined number of images acquired after the activation of the shutter button at the first time was detected, for example 5, 10, 15, 20, 25, or 30 images. In some embodiments, the images acquired after the activation of the shutter button at the first time was detected are images within a predefined time after the first time (for example, within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds after the first time). In some embodiments, the plurality of images in the first image sequence acquired after acquiring the representative image meet predefined grouping criteria. In some embodiments, the predefined grouping criteria include selecting a predefined number of images after the representative image. In some embodiments, the predefined grouping criteria include selecting images in a predefined time range immediately after the activation of the shutter button was detected. In some embodiments, the predefined grouping criteria include selecting images in a predefined time range after the time at which the representative image was acquired.
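A predefined grouping criterion of the kind described above can be expressed either as a fixed count of images or as a time range. The short Swift sketch below shows both variants applied to a list of frame timestamps; the specific values (30 frames, 1.5 seconds) are placeholders chosen for illustration, not values required by the claims.

// Two illustrative grouping criteria for frames that follow a reference time
// (for example, the time of the representative image or the shutter activation).
func selectByCount(_ timestamps: [Double], after t: Double, count: Int = 30) -> [Double] {
    return Array(timestamps.filter { $0 > t }.prefix(count))
}

func selectByTimeWindow(_ timestamps: [Double], after t: Double, window: Double = 1.5) -> [Double] {
    return timestamps.filter { $0 > t && $0 <= t + window }
}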
In some embodiments, the first image sequence is stored (912) in memory as a first distinct set of images (for example, stored together in a data structure in non-volatile memory). In some embodiments, the representative image acquired by the camera is used to represent the first distinct set of images, for example in an image presentation mode (see Figures 6A-6FF, 7A-7CC, and 8A-8L).
In some embodiments, the live preview displays (914) images at a first resolution, and the first image sequence includes images at the first resolution that were displayed in the live preview (for example, the first resolution is lower than the upper limit of the camera's resolution). In some embodiments, the representative image acquired by the camera has (916) a second resolution that is higher than the first resolution. In some embodiments, the representative image acquired by the camera has a higher resolution than the other images in the first image sequence. For example, the representative image acquired by the camera may be a 12-megapixel, 18-megapixel, or 24-megapixel image, while the other images in the first image sequence have a lower resolution that corresponds to the resolution displayed in the live preview (for example, the first resolution). In some embodiments, the representative image acquired by the camera has the same resolution as the other images in the first image sequence.
In some embodiments, parameters of the respective image sequence that is grouped in response to detecting a respective activation of the shutter button are configurable (918) by a user of the device. For example, via a settings interface for the camera, the user can select the number of images in a respective sequence, which image is used as the representative image of the sequence (for example, as shown in Figures 5I-5K), and/or other acquisition or display parameters for the image sequence (for example, the resolution of the representative image, the resolution of the other images, the frame rate, filter effects, etc.).
In some embodiments, the plurality of images acquired by the camera before the activation of the shutter button at the first time was detected are stored (920) in memory in a first form (for example, in program memory, volatile memory, a circular buffer, etc.) before the activation of the shutter button at the first time is detected, and are stored in memory in a second form (for example, in non-volatile memory/storage) in response to detecting the activation of the shutter button at the first time.
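Holding pre-activation frames "in a first form" and committing them "in a second form" on activation is commonly realized with a fixed-size circular buffer that is flushed to persistent storage when the shutter fires. The sketch below is one plausible arrangement under that assumption; the 45-frame capacity and the persist callback are invented for illustration and are not part of the claims.

// Illustrative circular buffer: the camera keeps only the most recent frames
// in volatile memory; on shutter activation the buffered frames are handed to
// a persistence routine (the "second form").
final class FrameRingBuffer {
    private var timestamps: [Double] = []
    private let capacity: Int

    init(capacity: Int = 45) { self.capacity = capacity }

    func append(_ timestamp: Double) {
        timestamps.append(timestamp)
        if timestamps.count > capacity {
            timestamps.removeFirst(timestamps.count - capacity)   // discard oldest
        }
    }

    // Called when the shutter activation is detected.
    func commit(persist: ([Double]) -> Void) {
        persist(timestamps)   // e.g. write to non-volatile storage
        timestamps.removeAll()
    }
}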
In some embodiments, the plurality of images acquired before the activation of the shutter button at the first time was detected is (922) a predefined number of images (for example, 5, 10, 15, 20, 25, or 30 images).
In some embodiments, the plurality of images acquired before the activation of the shutter button at the first time was detected are (924) images within a predefined time before the first time (for example, within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds before the first time).
In some embodiments, the plurality of images acquired before the activation of the shutter button at the first time was detected are (926) images within a predefined time before the time at which the representative image was acquired (for example, within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds before the time at which the representative image was acquired).
In some embodiments, the plurality of images acquired before the activation of the shutter button at the first time was detected are (928) from a time range between the first time and a second time that precedes the first time, and the plurality of images acquired before the activation of the shutter button at the first time was detected are acquired independently of detecting any interaction with the shutter button in temporal proximity to the second time (other than detecting the activation of the shutter button at the first time). For example, the plurality of images acquired before the activation of the shutter button at the first time was detected are not acquired in response to detecting an interaction with the shutter button in temporal proximity to the second time (other than the interaction with the shutter button detected at the first time). For example, the plurality of images acquired before the activation of the shutter button at the first time was detected are not acquired in response to detecting a partial (or complete) activation at or in temporal proximity to the second time.
In some embodiments, the plurality of images in the first image sequence acquired before the activation of the shutter button at the first time was detected meet (930) one or more predefined grouping criteria. In some embodiments, the predefined grouping criteria include selecting (932) a predefined number of images before the activation of the shutter button was detected. In some embodiments, the predefined grouping criteria include selecting (934) a predefined number of images before the representative image. In some embodiments, the predefined grouping criteria include selecting (936) images in a predefined time range immediately before the activation of the shutter button was detected. In some embodiments, the predefined grouping criteria include selecting (938) images in a predefined time range immediately before the time at which the representative image was acquired.
In some embodiments, the live preview is displayed (940) as part of a media capture user interface that includes an affordance for enabling a first media acquisition mode, and the shutter button is a software button displayed in the media capture user interface (for example, shutter button 514 of Figures 5A-5H). In response to detecting the activation of the shutter button (for example, tap gesture 518 of Figure 5F), the device displays (942) an animation associated with the shutter button (for example, an animation in which part of the shutter button breaks apart and flies back together, as shown in Figures 5F-5H). The animation lasts for an amount of time that corresponds to the amount of time during which the camera acquires images for the first image sequence after the activation of the shutter button (for example, thereby providing the user with an indication that media are still being captured). In some embodiments, the animation is a looping animation that can be extended seamlessly if the shutter button is kept pressed or is activated again before the camera finishes acquiring the images for the first image sequence.
In some embodiments, after entering the first media acquisition mode, the device begins (944) acquiring and storing images (independently of detecting activations of the shutter button). The device deletes (946) (or marks for deletion) images that, while the device is in the first media acquisition mode, are not grouped into a respective plurality of images in temporal proximity to an activation of the shutter button at a respective time.
In some embodiments, after displaying the live preview, the device begins (948) acquiring and storing images (independently of detecting activations of the shutter button). The device deletes (950) (or marks for deletion) images that, while the device is in the first media acquisition mode, are not grouped into a respective plurality of images in temporal proximity to an activation of the shutter button at a respective time.
In some embodiments, the device acquires and stores images (952) while displaying the live preview, independently of detecting activations of the shutter button. The device deletes (954) (or marks for deletion) acquired and stored images that, while the device is in the first media acquisition mode, are not grouped into a respective plurality of images in temporal proximity to an activation of the shutter button at a respective time.
In some embodiments, the user can select the length of time that an image is retained before it is discarded if it is not grouped into an image sequence. For example, the user can set the device to retain images displayed in the live preview mode for 5, 10, or 20 seconds. Assuming, for example, that the user selects a length of 5 seconds, an image displayed in the live preview is retained for 5 seconds after it is displayed in the live preview and is discarded afterwards (for example, deleted or marked for deletion) if it is not grouped into an image sequence in response to an activation of the shutter button.
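The retention behavior described above amounts to discarding any buffered frame that ages past the user-selected limit without being grouped. The following sketch shows that pruning step under the assumption that grouping marks frames as kept; the 5-second default and the grouped flag are illustrative conveniences, not claimed details.

// Illustrative pruning of ungrouped frames older than a user-selected
// retention interval (for example 5, 10, or 20 seconds).
struct BufferedFrame {
    let timestamp: Double
    var grouped: Bool = false      // set when the frame joins an image sequence
}

func prune(_ buffer: [BufferedFrame], now: Double, retention: Double = 5.0) -> [BufferedFrame] {
    return buffer.filter { $0.grouped || now - $0.timestamp <= retention }
}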
In some embodiments, in response to detecting the activation of the shutter button at the first time, the device associates (956) audio that corresponds to the first image sequence (for example, audio recorded before the activation of the shutter button was detected and audio recorded after the activation of the shutter button was detected) with the first image sequence. In some embodiments, the device includes a microphone (or is in communication with a microphone), and the audio detected while the image sequence is acquired is stored in memory and linked to (or otherwise associated with) the stored first image sequence. For example, Figures 6E-6I illustrate playback of an image sequence with corresponding audio.
In some embodiments, in response to detecting the activation of the shutter button at the first time, the device associates (958) metadata that corresponds to the first image sequence (for example, Figures 6J-6M illustrate playback of an image sequence with corresponding metadata) with the first image sequence. In some embodiments, metadata for the image sequence is stored in memory and linked to (or otherwise associated with) the stored image sequence, the metadata being, for example, the time, the date, the location (for example, via GPS), the weather, the music that was playing when the image sequence was acquired (for example, music identified using music recognition software on the device, such as Shazam, SoundHound, or Midomi), local event information (for example, a sports game being played at the location where and when the first image sequence was acquired), post-event information (for example, the final score), and so on.
In some embodiments, the device automatically excludes (960) blurred images from the first image sequence (or deletes them, or forgoes displaying them as part of the sequence).
In some embodiments, after detecting the activation of the shutter button at the first time, the device detects (962) a next activation of the shutter button at a second time (without detecting any activation of the shutter button between the first time and the second time). In response to detecting the next activation of the shutter button at the second time, the device groups (964) a plurality of images acquired by the camera in temporal proximity to the activation of the shutter button at the second time into a second image sequence. The second image sequence includes: a plurality of images acquired by the camera before the activation of the shutter button at the second time was detected; and a representative image that represents the second image sequence and was acquired by the camera after one or more of the other images in the second image sequence. In some embodiments, capturing an image sequence is performed in a manner analogous to capturing a single image with a conventional digital camera, which makes capturing such image sequences simple and intuitive even for novice users. With a conventional digital camera, a single image is captured each time the shutter button is activated. Here, an image sequence is captured each time the shutter button is activated. This manner of capturing image sequences also differs from the manner of capturing video with a conventional digital camera: to capture video with a conventional digital camera, a first activation of the shutter button starts recording the video, and the next activation of the shutter button stops recording the video.
In some embodiments, the first frame and/or the last frame in the sequence change in accordance with a change of the representative image (for example, as shown in Figures 5I-5K). To that end, in some embodiments, the first image sequence includes (966) an initial image of the first image sequence, a first number of images acquired between the initial image and the representative image, and a second number of images acquired between the representative image and a final image. The device detects (968) an input that corresponds to a request to change the representative image in the first image sequence. In some embodiments, while in an image sequence editing mode, the device detects a gesture (for example, a drag gesture or a tap gesture, such as touch gesture 522) that moves a representative image selection indicator from the current representative image to another image in the first image sequence. In some embodiments, while in the image sequence editing mode, the device detects a gesture (for example, a drag gesture or a tap gesture) that moves the current representative image out of a representative image selection area and moves another image in the first image sequence into the representative image selection area. In response to detecting the input that corresponds to the request to change the representative image in the first image sequence: the device changes (970) the representative image to a revised representative image in accordance with the detected input; and changes the grouped plurality of images in the first image sequence by adding images at one end of the first image sequence and deleting images at the other end of the first image sequence in accordance with the detected input, so that the first image sequence has a revised initial image and a revised final image.
In some embodiments, the number of images between the initial image and the representative image is the same as the number of images between the revised initial image and the revised representative image. In some embodiments, the number of images between the representative image and the final image is the same as the number of images between the revised representative image and the revised final image. In some embodiments, the added images are in temporal proximity to one end of the first image sequence. For example, if the revised representative image is three images earlier in the first image sequence, then three images (acquired immediately before the initial image) are added to the beginning of the first sequence (with the earliest of those three images becoming the revised initial image), and three images are deleted from the end of the first sequence.
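Changing the representative image while keeping the sequence the same length, as described above, can be modeled as sliding a fixed-size window over the full stream of captured frames. The sketch below assumes the device still has access to frames just outside the current sequence and works purely on indices; the window arithmetic is illustrative, not the claimed editing behavior.

// Illustrative re-windowing: a positive `shift` means the representative image
// moved earlier by that many frames, so the same number of frames is added at
// the start of the sequence and removed from its end (and vice versa).
func rewindow(totalFrames: Int,
              currentStart: Int,
              currentEnd: Int,
              shift: Int) -> (newStart: Int, newEnd: Int) {
    let newStart = max(0, currentStart - shift)
    let newEnd = min(totalFrames - 1, currentEnd - shift)
    return (newStart, newEnd)
}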
In some embodiments, the display is (972) a touch-sensitive display. The device receives (974) a request to display the representative image from the first image sequence. In response to receiving the request to display the representative image, the device displays (976) the representative image on the touch-sensitive display. While displaying the representative image, the device receives (978) a touch input on the representative image on the touch-sensitive display, the touch input including a characteristic that changes over time. For example, the intensity of the touch input changes over time, or the position of the contact of the touch input changes over time (for example, due to lateral movement of the contact across the touch-sensitive display). In response to receiving the touch input on the representative image on the touch-sensitive display, the device displays (980) the images in the first image sequence (for example, sequentially) at a rate that is determined based on the change over time of the characteristic of the touch input (for example, as described with regard to the press-and-hold gesture of Figures 6A-6FF or the navigation drag gesture of Figures 7A-7CC).
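Operation 980 ties the playback rate to a characteristic of the touch input that changes over time. One simple way to picture this, under the assumption that the characteristic is a contact intensity normalized to the range 0...1, is a linear mapping from that value to a frames-per-second rate; the 10-60 fps range below is an invented example, not a claimed value.

// Illustrative mapping from a time-varying input characteristic
// (for example, normalized contact intensity in 0...1) to a playback rate.
func playbackRate(forNormalizedIntensity intensity: Double,
                  minRate: Double = 10.0,
                  maxRate: Double = 60.0) -> Double {
    let clamped = min(max(intensity, 0.0), 1.0)
    return minRate + clamped * (maxRate - minRate)   // frames per second
}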
It should be understood that the particular order in which the operations in Figures 9A-9G have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2500, 2600, and 2700) are also applicable in an analogous manner to method 900 described above with respect to Figures 9A-9G. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with respect to method 900 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with respect to the other methods described herein (for example, methods 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here. It should further be noted that details of other processes described in Appendix A are also applicable in an analogous manner to method 900 described above with respect to Figures 9A-9G. For example, the acquisition, grouping, and storage operations described above with respect to method 900 optionally have one or more of the characteristics of the capture operations, editing operations, storage operations, or retrieval operations for enhanced photos described in Appendix A.
Figures 10A-10E illustrate a flow diagram of a method 1000 of displaying (playing back) a grouped sequence of related images, in accordance with some embodiments. Method 1000 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. In accordance with some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1000 are optionally combined and/or the order of some operations is optionally changed.
The device displays (1002) a representative image on the display (for example, while the device is in an image presentation mode, see Figure 6A). The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images also includes one or more images acquired by the camera before acquiring the representative image. In some embodiments, the camera that took the sequence of images is part of the electronic device. In some embodiments, the sequence of images was taken by a camera that is not part of the electronic device (for example, the sequence of images was transferred to the electronic device after being taken with a camera on another device). In some embodiments, the sequence of images was obtained in response to detecting activation of a shutter button at a first time, as described herein with respect to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In some embodiments, the representative image corresponds to the representative image acquired by the camera, as described herein with respect to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600.
While displaying the representative image on the display, the device detects (1004) a first part of a first input (for example, an input on the touch-sensitive surface; see first part 616-1 of first input 616 of Figures 6B-6D). In some embodiments, the first input is (1006) a press-and-hold gesture (for example, a press-and-hold gesture on the representative image on a touch-sensitive display, or a press-and-hold gesture on a trackpad while a cursor or other focus selector is over the representative image on the display). In some embodiments, the first input is a click-and-hold input with a mouse while a cursor or other focus selector is over the representative image on the display. In some embodiments, the device includes (1008) one or more sensors for detecting the intensity of contacts with the touch-sensitive surface, and the first input includes a finger contact that meets first contact intensity criteria (for example, a finger gesture on the representative image on a touch-sensitive display, or a finger gesture on a trackpad while a cursor or other focus selector is over the representative image on the display, where the contact in the finger gesture exceeds a light press (or deep press) intensity threshold for at least part of the input). For example, as shown in Figures 6B-6D, the first input 616 is a press-and-hold gesture that exceeds the light press threshold ITL.
In response to detecting the first part of the first input, the device replaces (1010) the display of the representative image with a sequential display of the one or more images acquired by the camera after acquiring the representative image (for example, as shown in Figures 6B-6D). Thus, in some embodiments, in response to detecting the first part of the first input, the one or more images acquired by the camera after acquiring the representative image are displayed in sequence. In some embodiments, in response to detecting the first part of the first input, the one or more images acquired by the camera after acquiring the representative image are displayed (1012) in sequence at a rate that is based on the intensity of the contact in the first part of the input (for example, the display rate increases as the intensity of the contact in the first part of the first input increases, and the display rate decreases as the intensity of the contact in the first part of the first input decreases). In some embodiments, (in response to detecting the first part of the first input) the rate at which the images are displayed in sequence, from the representative image to the final image, varies in accordance with the intensity of the contact in the first part of the first input. In some embodiments, after this initial dependence of the display rate on the contact intensity in the first input, subsequent display of the image sequence (in response to detecting a later part of the first input) proceeds at a fixed display rate, independent of the contact intensity in the later part of the first input (for example, as shown by Figures 6E-6M). In some embodiments, in response to detecting the first part of the first input, the one or more images acquired by the camera after acquiring the representative image are displayed in sequence at a fixed rate. In some embodiments, the position of the progress through the sequence of images is based on the intensity of the contact (for example, respective intensities of the contact are mapped to respective amounts of progress through the sequence of images, as shown in Figures 6P-6V and Figure 6W). In some embodiments, this mapping between intensity and animation progress is applied as follows: when the intensity of the contact is between ITL and ITD, and when the intensity of the contact is above ITD, the animation between the images in the sequence advances at a predefined rate (for example, 1x real time) or at a rate determined based on the intensity of the contact (for example, faster for a contact with higher intensity, and slower for a contact with lower intensity). In some embodiments, replacing the display of the representative image with the sequential display of the one or more images acquired by the camera after acquiring the representative image includes displaying an animation that dynamically displays the images in the sequence of images based on changes over time of the intensity of the first contact.
In some embodiments, replacing the display of the representative image with the sequential display of the one or more images acquired by the camera after acquiring the representative image includes updating (for example, replacing) the displayed image multiple times per second (for example, 10, 20, 30, or 60 times per second), optionally without regard to whether the first part of the first input meets one or more predetermined intensity criteria. In some embodiments, the animation is a smooth animation that is updated as the intensity of the first part of the first input changes, thereby providing the user with feedback about the amount of intensity detected by the device (for example, feedback about the amount of force applied by the user). In some embodiments, the animation is updated smoothly and quickly so as to create the appearance, for the user, of a user interface that responds in real time to changes in the force applied to the touch-sensitive surface (for example, the animation is perceptually instantaneous for the user, thereby providing the user with immediate feedback and enabling the user to better modulate the force they apply to the touch-sensitive surface in order to interact effectively with user interface objects that respond to contacts of different or changing intensity).
In some embodiments, the animation in which the sequence of images replaces the representative image is displayed in a manner that responds dynamically to small changes in the intensity of the first contact (for example, as shown in Figure 6W).
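The mapping between contact intensity and animation progress described above (for example, Figures 6P-6V and 6W) can be sketched as interpolating an index into the sequence from where the intensity sits between a lower and an upper intensity threshold. The threshold parameters and the linear interpolation below are assumptions made for illustration only.

// Illustrative mapping from contact intensity to an image index: intensities
// between the light-press and deep-press thresholds are interpolated across
// the images that follow the representative image.
func imageIndex(forIntensity intensity: Double,
                lightPress: Double,
                deepPress: Double,
                imageCount: Int) -> Int {
    guard imageCount > 0, deepPress > lightPress else { return 0 }
    let progress = min(max((intensity - lightPress) / (deepPress - lightPress), 0.0), 1.0)
    return Int(progress * Double(imageCount - 1))   // index into the post-representative images
}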
In some embodiments, after detecting the first part of the first input, the device detects (1014) a second part of the first input (for example, continuing to detect sufficient contact and/or intensity in the finger gesture). In some embodiments, the second part is a continuation of the first input with the same characteristics as the first part of the first input (for example, there is no change, other than the passage of time, between the first part and the second part of the first input). In some embodiments, unless interrupted or terminated by the user, the first part of the first input lasts as long as it takes to perform operation 1010, and what follows is the second part, or a later part, of the first input.
In some embodiments, in response to detecting the second part of the first input, the device displays (1016), in sequence, the one or more images acquired by the camera before acquiring the representative image, the representative image, and the one or more images acquired by the camera after acquiring the representative image (for example, as shown in Figures 6E-6I). Thus, in some embodiments, in response to detecting the second part of the first input, the sequence is displayed from the initial image to the final image of the entire sequence of images.
In some embodiments, instead of responding to detection of the first part of the first input by replacing the display of the representative image with a sequential display of the one or more images acquired by the camera after acquiring the representative image, the device responds to detection of the first part of the first input by replacing the display of the representative image with a sequential display that starts at the initial image of the sequence and proceeds through the remainder of the sequence.
In some embodiments, in response to detecting the second part of the first input, the sequence of images is displayed (1018) in sequence at a fixed rate. In some embodiments, the images in the sequence are displayed in sequence at a fixed rate that is independent of the intensity of the contact in the first input (for example, during the second part of the first input). For example, the sequence of images is displayed in sequence at a 1x video playback rate (for example, the rate at which the images were obtained) during the second part of the first input. In some embodiments, the rate at which the images in the sequence are displayed in sequence during the second part of the first input depends on the intensity of the contact in the first input. For example, the rate increases as the intensity of the contact increases. In some embodiments, in response to detecting the second part of the first input, the sequence of images is displayed in sequence at a rate based on the intensity of the contact in the first part of the input.
In some embodiments, the device cross-fades (1020) from sequentially displaying the one or more images acquired by the camera after acquiring the respective representative image to sequentially displaying the one or more images acquired by the camera before acquiring the representative image. In some embodiments, a cross-fade animation is displayed from the end of the sequence of images (for example, as shown in Figure 6D) to the beginning of the sequence of images (for example, as shown in Figure 6E) when the sequence of images is looped or displayed again.
In some embodiments, in response to detecting the second part of the first input, the device presents (1022) audio that corresponds to the sequence of images. In some embodiments, in response to detecting the second part of the first input, the entire sequence of images is displayed together with the corresponding audio that was recorded when the sequence of images was acquired. In some embodiments, no audio is presented in response to detecting the first part of the first input. In some embodiments, audio is presented during the first complete playback of the sequence of images (for example, in response to detecting the second part of the first input). In some embodiments, if the first input is maintained after the first complete playback of the sequence of images (for example, in response to detecting the second part of the first input), audio is not presented again during subsequent playbacks of the sequence while the first input continues to be detected. In some embodiments, for a given input, audio is presented only during the first complete playback of the sequence of images. In some embodiments, for a given input, audio is presented only during the second complete playback of the sequence of images.
In some embodiments, after detecting the second part of the first input, the device detects (1024) a third part of the first input (for example, continuing to detect sufficient contact and/or intensity in the finger gesture, as shown in Figures 6J-6M). In some embodiments, the third part of the first input is a continuation of the second part of the first input without any change in the characteristics of the first input. In response to detecting the third part of the first input, the device displays (1026), in sequence, the one or more images acquired by the camera before acquiring the representative image, the representative image, and the one or more images acquired by the camera after acquiring the representative image (for example, the device loops back and displays the sequence again). In some embodiments, if the pressure and/or contact in the first input is maintained, the sequence of images is displayed again. In some embodiments, looping and playback continue as long as the first input is maintained.
In some embodiments, in response to detecting the third part of the first input, the device displays (1028) metadata that corresponds to the sequence of images. In some embodiments, if the pressure and/or contact in the first input is maintained, the sequence of images is displayed again with concurrent display of metadata for the sequence of images that is stored in memory and linked to (or otherwise associated with) the stored sequence of images, the metadata being, for example, the time, the date, the location (for example, via GPS), the weather, the music that was playing when the sequence of images was acquired (for example, music identified using music recognition software on the device, such as Shazam, SoundHound, or Midomi), local event information (for example, a sports game being played at the location where and when the first image sequence was acquired), and/or post-event information (for example, the final score). For example, Figures 6J-6M illustrate concurrent display of location and time information corresponding to the images in the sequence.
In some embodiments, the device detects (1030) termination of the first input (for example, detecting liftoff of the contact in the first input, or detecting that the intensity of the contact in the first input has dropped below a predetermined threshold intensity value such as ITL, as shown in Figure 6N). In response to detecting termination of the first input, the device displays (1032) the representative image (for example, the device displays an animation that ends with display of only the representative image in the sequence of images).
In some embodiments, while displaying a first image in the sequence of images (for example, image 602-4 of Figure 6N), the device detects (1034) termination of the first input (for example, detecting liftoff of the contact in the first input, or detecting that the intensity of the contact in the first input has dropped below a predetermined threshold intensity value such as ITL, as shown in Figure 6N). In response (1036) to detecting termination of the first input while displaying the first image in the sequence of images: in accordance with a determination that the first image occurs before the representative image in the sequence of images (for example, the first image was taken before the representative image), the device displays, in sequence in chronological order, the images from the first image to the representative image (for example, the device displays the sequence of images forward until it reaches the representative image); and, in accordance with a determination that the first image occurs after the representative image in the sequence of images (for example, the first image was taken after the representative image), the device displays, in sequence in reverse chronological order, the images from the first image to the representative image (for example, the device displays the sequence of images backward until it reaches the representative image). In some embodiments, displaying, in sequence in chronological order, the images from the first image to the representative image includes gradually slowing the rate at which the images are displayed, so that playback of the sequence of images slowly comes to rest at the representative image. In some embodiments, displaying, in sequence in reverse chronological order, the images from the first image to the representative image includes gradually slowing the rate at which the images are displayed, so that reverse playback of the sequence of images slowly comes to rest at the representative image.
In some embodiments, the sequence of images is configured (1038) to be displayed in sequence in a loop, in either the forward direction or the reverse direction. While displaying a first image in the sequence of images, the device detects (1040) termination of the first input (for example, the device detects liftoff of the contact in the first input, or detects that the intensity of the contact in the first input has dropped below a predetermined threshold intensity value such as ITL, as shown in Figure 6N). In response (1042) to detecting termination of the first input while displaying the first image in the sequence of images: in accordance with a determination that there are fewer images between the first image and the representative image when the loop is traversed in the forward direction, the device displays, in sequence in the forward direction, the images from the first image to the representative image; and, in accordance with a determination that there are fewer images between the first image and the representative image when the loop is traversed in the reverse direction, the device displays, in sequence in the reverse direction, the images from the first image to the representative image.
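The direction chosen at operation 1042 simply minimizes the number of images that must be traversed around the loop to reach the representative image. A minimal Swift sketch of that decision, working on array indices and assuming the sequence wraps around as a ring, is shown below; it is illustrative rather than the claimed behavior.

// Illustrative choice of playback direction on a looping sequence:
// go whichever way reaches the representative image in fewer steps.
enum PlaybackDirection { case forward, reverse }

func directionToRepresentative(currentIndex: Int,
                               representativeIndex: Int,
                               count: Int) -> PlaybackDirection {
    let forwardSteps = (representativeIndex - currentIndex + count) % count
    let reverseSteps = (currentIndex - representativeIndex + count) % count
    return forwardSteps <= reverseSteps ? .forward : .reverse
}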
In some embodiments, the one or more images acquired by the camera after acquiring the representative image are displayed (1044) in sequence in accordance with respective intensity levels applied by the first input. For example, as shown in Figures 6P-6V and 6W, respective images are mapped to respective intensities, and the user can swipe forward and backward through the one or more images obtained after the representative image by changing the intensity applied by the first input (for example, a touch input with an intensity corresponding to intensity range 618-4 causes the user interface illustrated in Figure 6R to be displayed, and subsequently increasing the intensity of the touch input so that the intensity corresponds to intensity range 618-5 causes the user interface illustrated in Figure 6S to replace the user interface illustrated in Figure 6R).
In some embodiments, the first part of the first input includes (1046) a change in the intensity of a contact detected on the touch-sensitive surface (for example, as shown in Figures 6P-6V). While the representative image is displayed and the contact has a first intensity, the device detects an increase in the intensity of the contact to a second intensity by a respective amount. In response to detecting the increase in the intensity of the contact by the respective amount, the device replaces the display of the representative image with display of a first subsequent image, the first subsequent image being a respective number of images after the representative image in the sequence of images. For example, in Figures 6Q-6R, the intensity 636 of the contact increases from an intensity in intensity range 618-3 to an intensity in intensity range 618-4, and the display of image 602-3 is replaced with display of image 602-4. While the first subsequent image is displayed and the contact has the second intensity, the device detects an increase in the intensity of the contact to a third intensity by the respective amount. In response to detecting the increase in the intensity of the contact from the second intensity to the third intensity by the respective amount, the device replaces the display of the first subsequent image with display of a second subsequent image, the second subsequent image being the respective number of images after the first subsequent image in the sequence of images. For example, in Figures 6R-6S, the intensity 636 of the contact increases from an intensity in intensity range 618-4 to an intensity in range 618-5, and the display of image 602-4 is replaced with display of image 602-5.
In some embodiments, the respective number of images is based (1048) on the magnitude of the change in the intensity of the contact. For example, in Figures 6Q-6S, when the intensity of contact 636 increases from an intensity in intensity range 618-3 to an intensity in intensity range 618-4, the respective number of images is one, and when the intensity of contact 636 increases from an intensity in intensity range 618-3 to an intensity in intensity range 618-5, the respective number of images is two.
In some embodiments, when the change in the intensity of the contact has a first magnitude, the first subsequent image immediately follows (1050) the respective image in the sequence of images, and the second subsequent image immediately follows the first subsequent image in the sequence of images. In some embodiments, when the respective change in the intensity of the contact has a second magnitude greater than the first magnitude, the first subsequent image is separated from the respective image by a respective number of images in the sequence of images, and the second subsequent image is separated from the first subsequent image by the respective number of images in the sequence of images, where the respective number of images is one or more images. For example, in Figures 6Q-6S, when the intensity of contact 636 increases from an intensity in intensity range 618-3 to an intensity in intensity range 618-4, the display of image 602-3 is replaced with display of image 602-4, and when the intensity of contact 636 increases from an intensity in intensity range 618-3 to an intensity in intensity range 618-5, the display of image 602-3 is replaced with display of image 602-5.
In some embodiments, the first part of the first input includes (1052) a change in the intensity of a contact detected on the touch-sensitive surface (for example, as shown in Figures 6P-6V). While the representative image is displayed and the contact has a first intensity, the device detects an increase in the intensity of the contact to a second intensity by a respective amount. In response to detecting the change in the intensity of the contact by the (same) respective amount, the device replaces the display of the representative image with display of a first subsequent image, the first subsequent image being a respective number of images after the representative image in the sequence of images. For example, in Figures 6Q-6R, the intensity 636 of the contact increases from an intensity in intensity range 618-3 to an intensity in intensity range 618-4, and the display of image 602-3 is replaced with display of image 602-4. While the first subsequent image is displayed and the contact has the second intensity, the device detects a change in the intensity of the contact by the (same) respective amount. In response to detecting the change in the intensity of the contact by the (same) respective amount: in accordance with a determination that the change in the intensity of the contact by the (same) respective amount includes an increase in the intensity of the contact from the second intensity to a third intensity, the device replaces the display of the first subsequent image with display of a second subsequent image, the second subsequent image being the respective number of images after the first subsequent image in the sequence of images; and, in accordance with a determination that the change in the intensity of the contact by the (same) respective amount includes a decrease in the intensity of the contact from the second intensity to the first intensity, the device replaces the display of the first subsequent image with display of the representative image. For example, when the intensity of contact 636 increases from an intensity in intensity range 618-4 to an intensity in range 618-5 as shown in Figures 6R-6S, the display of image 602-4 is replaced with display of image 602-5, and when the intensity of contact 636 decreases from an intensity in intensity range 618-4 to an intensity in range 618-3 as shown in Figures 6T-6U, the display of image 602-4 is replaced with display of image 602-3.
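Operations 1046-1052 describe stepping forward and backward through the sequence as the contact intensity crosses successive intensity ranges. A compact way to picture this, under the assumption of evenly sized ranges above the light-press threshold, is to quantize the intensity into a range index and use that index to pick the displayed image; the range width and clamping are illustrative assumptions, not claimed values.

// Illustrative quantization of contact intensity into evenly sized ranges;
// each range above the light-press threshold maps to the next image after the
// representative image, so rising intensity scrubs forward and falling
// intensity scrubs back.
func displayedImageOffset(intensity: Double,
                          lightPress: Double,
                          rangeWidth: Double,
                          maxOffset: Int) -> Int {
    guard intensity > lightPress, rangeWidth > 0 else { return 0 }   // 0 = representative image
    let steps = Int((intensity - lightPress) / rangeWidth) + 1
    return min(steps, maxOffset)
}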
It should be understood that the particular order in which the operations in Figures 10A-10E have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. In some embodiments, one or more of the operations described herein may be omitted. For example, in some embodiments, operations 1014 and 1016 are omitted. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 900, 10000, 10050, 1100, 11000, 1200, 2400, 2500, 2600, and 2700) are also applicable in an analogous manner to method 1000 described above with respect to Figures 10A-10E. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with respect to method 1000 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with respect to the other methods described herein (for example, methods 900, 10000, 10050, 1100, 11000, 1200, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here.
Figures 10F-10I illustrate a flow diagram of a method 10000 of displaying (playing back) a grouped sequence of related images, in accordance with some embodiments. Method 10000 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. In accordance with some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 10000 are optionally combined and/or the order of some operations is optionally changed.
The device displays (10002) a representative image on the display. The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images also includes one or more images acquired by the camera before acquiring the representative image. In some embodiments, the sequence of images is similar to the sequence of images described with respect to operation 1002 of method 1000.
While displaying the representative image on the display, the device detects (10004) a first part of a first input. In some embodiments, the first input is (10006) a press-and-hold gesture. In some embodiments, the first input is similar to the first input described with respect to operations 1004-1008 of method 1000.
In response to detecting the first part of the first input: the device transitions (10008) from displaying the representative image to displaying a respective prior image in the sequence of images, where the respective prior image was acquired by the camera before acquiring the representative image; and, after transitioning from displaying the representative image to displaying the respective prior image, the device displays, in sequence starting with the respective prior image, at least some of the one or more images acquired by the camera before acquiring the representative image and at least some of the one or more images acquired by the camera after acquiring the representative image. In some embodiments, after transitioning from displaying the representative image to displaying the respective prior image, the device displays, in sequence starting with the respective prior image, at least some of the one or more images acquired by the camera before acquiring the representative image, the representative image, and at least some of the one or more images acquired by the camera after acquiring the representative image.
In some embodiments, transitioning from displaying the representative image to displaying the respective prior image includes displaying (10010), in sequence, at least some of the one or more images acquired by the camera after acquiring the representative image, and then replacing the display of a respective subsequent image acquired after acquiring the representative image with the respective prior image (for example, the device cross-fades and/or blurs to switch from displaying the respective subsequent image to displaying the respective prior image, as described with reference to diagram 650 of Figure 6X).
In some embodiments, transitioning from displaying the representative image to displaying the respective prior image includes replacing (10012) the display of the representative image with the respective prior image (for example, the device cross-fades and/or blurs to switch from displaying the representative image to displaying the respective prior image, without displaying, before the switch, any of the one or more images acquired by the camera after acquiring the representative image, as described with reference to diagram 656 of Figure 6X).
In some embodiments, transitioning from displaying the representative image to displaying the respective prior image includes: in accordance with a determination that the first part of the first input meets first playback criteria (for example, detecting a slow increase in the intensity of the contact to a playback intensity threshold, or detecting an increase in the intensity of the contact to a slow-playback intensity threshold that is lower than a fast-playback intensity threshold), displaying (10014), in sequence, the one or more images acquired by the camera after acquiring the representative image and then replacing the display of a respective subsequent image acquired after acquiring the representative image with the respective prior image (for example, cross-fading and/or blurring to switch from displaying the respective subsequent image to displaying the respective prior image); and, in accordance with a determination that the first part of the first input meets second playback criteria that are different from the first playback criteria (for example, detecting a fast increase in the intensity of the contact to the playback intensity threshold, or detecting an increase in the intensity of the contact to the fast-playback intensity threshold), replacing the display of the representative image with the respective prior image (for example, cross-fading and/or blurring to switch from displaying the representative image to displaying the respective prior image, without displaying, before the switch, any of the one or more images acquired by the camera after acquiring the representative image).
In accordance with some embodiments, the device includes (10016) one or more sensor units for detecting the intensity of contacts with the touch-sensitive surface. The first input includes a contact on the touch-sensitive surface. The first playback criteria include a criterion that is met when the contact has a characteristic intensity above a first intensity threshold (for example, the light press threshold ITL of Figure 6X). The second playback criteria include a criterion that is met when the contact has a characteristic intensity above a second intensity threshold that is greater than the first intensity threshold (for example, the deep press threshold ITD of Figure 6X).
In some embodiments, the one or more images acquired by the camera before acquiring the representative image and the one or more images acquired by the camera after acquiring the representative image are displayed (10018), in sequence starting with the respective prior image, at a fixed rate (for example, in a manner similar to the fixed-rate display of images described with respect to operation 1018 of method 1000).
In some embodiments, the device presents (10020) audio that corresponds to the sequence of images (for example, similar to the presentation of audio described with respect to operation 1022 of method 1000).
In some embodiments, after detecting the first part of the first input, the device detects (10022) a second part of the first input. In response to detecting the second part of the first input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image and at least some of the one or more images acquired by the camera after acquiring the representative image (for example, in a manner similar to operations 1024-1028 of method 1000). In some embodiments, in response to detecting the second part of the first input, the device displays (10024) metadata that corresponds to the sequence of images.
In some embodiments, the device detects (10026) termination of the first input (for example, liftoff). In response to detecting termination of the first input, the device displays the representative image (for example, in a manner similar to operations 1030-1032 of method 1000).
In some embodiments, while displaying a first image in the sequence of images, the device detects (10028) termination of the first input (for example, liftoff). In response to detecting termination of the first input while displaying the first image in the sequence of images: in accordance with a determination that the first image occurs before the representative image in the sequence of images, the device displays, in sequence in chronological order, the images from the first image to the representative image; and, in accordance with a determination that the first image occurs after the representative image in the sequence of images, the device displays, in sequence in reverse chronological order, the images from the first image to the representative image (for example, in a manner similar to operations 1034-1036 of method 1000).
In some embodiments, the sequence of images is configured (10030) to be displayed in sequence in a loop, in either the forward direction or the reverse direction. While displaying a first image in the sequence of images, the device detects termination of the first input (for example, liftoff). In response to detecting termination of the first input while displaying the first image in the sequence of images: in accordance with a determination that there are fewer images between the first image and the representative image when the loop is traversed in the forward direction, the device displays, in sequence in the forward direction, the images from the first image to the representative image; and, in accordance with a determination that there are fewer images between the first image and the representative image when the loop is traversed in the reverse direction, the device displays, in sequence in the reverse direction, the images from the first image to the representative image (for example, in a manner similar to operations 1038-1042 of method 1000).
It should be understood that the particular order in which the operations in Figures 10F-10I have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. In some embodiments, one or more of the operations described herein may be omitted. For example, in some embodiments, operations 10014 and 10016 are omitted. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 900, 1000, 10050, 1100, 11000, 1200, 2400, 2500, 2600, and 2700) are also applicable in an analogous manner to method 10000 described above with respect to Figures 10F-10I. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with respect to method 10000 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with respect to the other methods described herein (for example, methods 900, 1000, 10050, 1100, 11000, 1200, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here.
Figures 10J-10M illustrate a flow diagram of a method 10050 of displaying (playing back) a grouped sequence of related images, in accordance with some embodiments. Method 10050 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. In accordance with some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 10050 are optionally combined and/or the order of some operations is optionally changed.
At an electronic device with a display, a touch-sensitive surface, and one or more sensors for detecting the intensity of contacts with the touch-sensitive surface: the device displays (10052) a representative image on the display (for example, representative image 602-1 of Figure 6Y).
The representative image is one image in a sequence of images taken by a camera. The image sequence includes one or more images acquired by the camera after acquiring the representative image. In certain embodiments, the representative image is the initial image in the image sequence. In certain embodiments, the image sequence includes (10054) one or more images acquired by the camera before acquiring the representative image (for example, the image sequence is similar to the image sequence described with respect to operation 1002 of method 1000).
While displaying the representative image on the display, the device detects (10056) a first input, which includes detecting an increase in a characteristic intensity of a contact on the touch-sensitive surface to a first intensity that is above a first intensity threshold (for example, the light press threshold ITL of Figures 6Y-6AA).
In certain embodiments, the first input is (10058) a press-and-hold gesture. In certain embodiments, the first input is similar to the first input described with respect to operations 1004-1008 of method 1000.
In response to detecting the increase in the characteristic intensity of the contact, the device advances (10060), in a first direction (for example, in chronological order), through the one or more images acquired by the camera after acquiring the representative image, at a rate determined based at least in part on the first intensity (for example, as described with reference to Figures 6Y-6AA).
After advancing, at the rate determined based at least in part on the first intensity, through the one or more images acquired by the camera after acquiring the representative image, the device detects (10062) a decrease in the intensity of the contact to a second intensity that is less than the first intensity.
In response to detecting the decrease in the characteristic intensity of the contact to the second intensity: in accordance with a determination that the second intensity is above the first intensity threshold, the device continues (10064) to advance, in the first direction at a second rate, through the one or more images acquired by the camera after acquiring the representative image, wherein the second rate is determined based at least in part on the second intensity and the second rate is slower than the first rate; and, in accordance with a determination that the second intensity is below the first intensity threshold, the device moves, in a second direction that is opposite to the first direction (for example, in reverse chronological order), through the one or more images acquired by the camera after acquiring the representative image, at a rate determined based at least in part on the second intensity (for example, device 100 moves backward through image sequence 602 from Figure 6AA to Figure 6BB because input 644 has a contact intensity below the light press threshold ITL).
In certain embodiments, the images are displayed sequentially (10066) at a rate that increases as the characteristic intensity of the contact increases (for example, the first rate and/or the second rate are proportional to the characteristic intensity of the contact). In certain embodiments, in accordance with a determination that the characteristic intensity of the contact is above the first intensity threshold, the display of the representative image is replaced with a sequential display of at least some of the one or more images acquired by the camera after acquiring the representative image, at a rate that increases as the characteristic intensity of the contact increases.
In certain embodiments, in accordance with a determination that the characteristic intensity of the contact is below the first intensity threshold, the images in the sequence are displayed in reverse chronological order, at a backward rate that increases as the characteristic intensity of the contact decreases (for example, as shown in the rate diagrams of Figures 6Y-6AA and in the graphs of Figures 6CC-6DD).
In certain embodiments, the forward rate or backward rate is determined in real time or near real time, so that the user can accelerate or decelerate the progression through the images (in the forward or backward direction) by changing the characteristic intensity of the contact.
In certain embodiments, the images are displayed sequentially (10068) at a rate that is proportional to the difference between the characteristic intensity of the contact and the first intensity threshold (for example, the first rate and/or the second rate are proportional to the difference between the characteristic intensity of the contact and the first intensity threshold). In certain embodiments, in accordance with a determination that the characteristic intensity of the contact is above the first intensity threshold, the display of the representative image is replaced with a sequential display of at least some of the one or more images acquired by the camera after acquiring the representative image, at a rate that is proportional to the difference between the characteristic intensity of the contact and the first intensity threshold.
In certain embodiments, in accordance with a determination that the characteristic intensity of the contact is below the first intensity threshold, the images in the sequence are displayed in reverse chronological order, at a backward rate that is proportional to the difference between the characteristic intensity of the contact and the first intensity threshold.
In certain embodiments, the forward rate or backward rate is determined in real time or near real time, so that the user can accelerate or decelerate the progression through the images (in the forward or backward direction) by changing the characteristic intensity of the contact.
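Taken together, operations 10060-10068 describe a signed scrubbing rate: intensities above the first intensity threshold advance the sequence, intensities below it move backward, and in both cases the magnitude grows with the distance from the threshold. The following is a minimal sketch of one such mapping, assuming a simple linear relationship; the names and constants (lightPressThreshold, rateScale) are illustrative assumptions rather than values specified by this disclosure.

```swift
import Foundation

/// A minimal sketch, assuming a linear relationship between contact
/// intensity and playback rate. `lightPressThreshold` and `rateScale`
/// are hypothetical tuning parameters, not values from this disclosure.
struct IntensityScrubber {
    let lightPressThreshold: Double   // the "first intensity threshold" (e.g., ITL)
    let rateScale: Double             // playback-rate units per unit of intensity

    /// Returns a signed rate: positive values advance in chronological
    /// order, negative values move in reverse chronological order.
    func playbackRate(forIntensity intensity: Double) -> Double {
        // Rate is proportional to the difference between the contact's
        // characteristic intensity and the first intensity threshold.
        return rateScale * (intensity - lightPressThreshold)
    }
}

let scrubber = IntensityScrubber(lightPressThreshold: 0.5, rateScale: 4.0)
scrubber.playbackRate(forIntensity: 0.9)   //  1.6 -> advance forward
scrubber.playbackRate(forIntensity: 0.3)   // -0.8 -> move backward
```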
In certain embodiments, the device decreases (10070) the rate at which images in the image sequence are displayed as the currently displayed image nears an end of the image sequence (for example, independent of the characteristic intensity of the contact).
For example, in certain embodiments, the first rate is determined (10072) based at least in part on the proximity of the currently displayed image to the end of the image sequence (for example, as playback approaches the end of the sequence, the rate of advancement slows down, so that playback of the image sequence comes slowly to a stop at the end of the image sequence). Thus, the device "brakes" slightly as it reaches the end of the image sequence.
As another example, in certain embodiments, the second rate is determined (10074) based at least in part on the proximity of the currently displayed image to the beginning of the image sequence (for example, as reverse playback approaches the beginning of the sequence, the backward rate slows down, so that reverse playback of the image sequence comes slowly to a stop at the beginning of the image sequence). Thus, the device "brakes" slightly as it reaches the beginning of the image sequence while moving in reverse chronological order.
In certain embodiments, while the contact is detected on the touch-sensitive surface, the rate at which the device advances through the image sequence is constrained (10076) by a maximum rate (for example, a maximum rate of 2x, where x is the standard playback rate for the content, that is, the rate at which 1 second of playback corresponds to 1 second of time elapsed during acquisition of the images in the sequence).
In certain embodiments, intensity values of the characteristic intensity of the contact near the first intensity threshold are associated (10078) with rate values that are at least a predetermined amount away from a rate of zero images per second (for example, 0.2x for values just above the first intensity threshold and -0.2x for values just below the first intensity threshold). Keeping the playback rate of the image sequence away from zero prevents the images from being played so slowly that inconsistencies between individual images become apparent, which would break the illusion of smooth playback through the image sequence.
In certain embodiments, while the contact is detected on the touch-sensitive surface, the rate at which the device moves backward through the image sequence is constrained (10080) by a maximum reverse rate (for example, a maximum reverse rate of -2x).
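Operations 10070-10080 constrain that raw rate: it is clamped to maximum forward and reverse rates, kept at least a predetermined amount away from zero so playback never appears to stall between frames, and attenuated near either end of the sequence so the device "brakes" gently. A possible combination of these constraints is sketched below; the 2x, -2x, and 0.2x values come from the examples above, while the braking window is an assumed constant.

```swift
import Foundation

/// Illustrative constraint stage for the scrubbing rate, assuming x == 1.0
/// is real-time playback. The constants mirror the examples in the text
/// (2x maximum, -2x maximum reverse, 0.2x minimum magnitude) but are
/// otherwise arbitrary choices.
func constrainedRate(rawRate: Double,
                     currentIndex: Int,
                     sequenceCount: Int,
                     brakingWindow: Int = 5) -> Double {
    guard sequenceCount > 1 else { return 0 }

    // Keep the rate at least 0.2x away from zero so adjacent frames are
    // never shown so slowly that their differences become visible.
    var rate = rawRate >= 0 ? max(rawRate, 0.2) : min(rawRate, -0.2)

    // Clamp to the maximum forward (2x) and reverse (-2x) rates.
    rate = min(max(rate, -2.0), 2.0)

    // "Brake" near the end being approached: scale the rate by the
    // remaining distance once inside the braking window.
    let remaining = rate > 0 ? (sequenceCount - 1 - currentIndex) : currentIndex
    if remaining < brakingWindow {
        rate *= Double(remaining) / Double(brakingWindow)
    }
    return rate
}

constrainedRate(rawRate: 3.5, currentIndex: 10, sequenceCount: 40)   //  2.0 (clamped)
constrainedRate(rawRate: 0.05, currentIndex: 10, sequenceCount: 40)  //  0.2 (kept away from zero)
constrainedRate(rawRate: -1.0, currentIndex: 2, sequenceCount: 40)   // -0.4 (braking near the start)
```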
In certain embodiments, the representative image is displayed (10082) as a background image on a lock screen of the device, and one or more foreground elements (for example, the date, the time, one or more notifications, network status information, battery status information, device unlock instructions, and/or other status information) do not change as the device advances through the one or more images captured after the respective image.
In certain embodiments, the device displays (10084) metadata corresponding to the image sequence. For example, the device displays metadata such as: the time, the date, the location (for example, via GPS), the weather, music that was playing when the image sequence was acquired (for example, music identified by music-recognition software on the device, such as Shazam, SoundHound, or Midomi), local event information (such as a local sports game that was being played when and where the first image sequence was acquired), and/or post-event information (such as a final score).
In certain embodiments, the device detects (10086) lift-off of the contact from the touch-sensitive surface. In response to detecting lift-off of the contact, the device moves through the images in the second direction at a rate that is greater than the maximum reverse rate (for example, a rate of -4x).
In certain embodiments, the device detects (10088) termination of the first input (for example, lift-off). In response to detecting termination of the first input, the device displays the representative image (for example, in a manner similar to operations 1030-1032 of method 1000).
In certain embodiments, while displaying the first image in the image sequence, the device detects (10090) termination of the first input (for example, lift-off). In response to detecting termination of the first input while displaying the first image in the image sequence: in accordance with a determination that the first image occurs before the representative image in the image sequence, the device sequentially displays, in chronological order, the images from the first image to the representative image; and in accordance with a determination that the first image occurs after the representative image in the image sequence, the device sequentially displays, in reverse chronological order, the images from the first image to the representative image (for example, in a manner similar to operations 1034-1036 of method 1000).
It should be appreciated that the particular order in which the operations in Figures 10J-10M have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. In some embodiments, one or more of the operations described herein may be omitted. For example, in certain embodiments, operations 10064 and 10066 are omitted. Additionally, it should be noted that the details of the other processes described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 1100, 11000, 1200, 2400, 2500, 2600, and 2700) are also applicable, in an analogous manner, to method 10050 described above with reference to Figures 10J-10M. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with reference to method 10050 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with reference to the other methods described herein (for example, methods 900, 1000, 10000, 1100, 11000, 1200, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here.
Figures 11A-11E illustrate a flow chart of a method 1100 of navigating through sequences of related images, in accordance with some embodiments. Method 1100 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. According to some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In certain embodiments, the display is a touch-screen display and the touch-sensitive surface is on, or integrated with, the display. In certain embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1100 are, optionally, combined and/or the order of some operations is, optionally, changed.
The device stores (1102) a plurality of image sequences (for example, in non-volatile memory and/or program memory). A respective image sequence includes a respective representative image taken by a camera, one or more images acquired by the camera after acquiring the respective representative image, and one or more images acquired by the camera before acquiring the respective representative image. In some embodiments, the camera that took the respective image sequence is part of the electronic device. In certain embodiments, the respective image sequence was taken by a camera that is not part of the electronic device (for example, the respective image sequence was transferred to the electronic device after being taken with a camera on another device). In certain embodiments, the respective image sequence was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In certain embodiments, the respective representative image corresponds to the representative image acquired by the camera, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In certain embodiments, the representative image has a higher resolution than other images in the respective image sequence. In certain embodiments, the representative image has the same resolution as other images in the respective image sequence.
The device displays (1104), in a movable first area on the display, a first representative image for a first image sequence (for example, while the device is in an image presentation mode, as shown in Figure 7A). In certain embodiments, the movable first area is an area that displays images in the first image sequence without displaying images from image sequences other than the first image sequence.
In certain embodiments, the first image sequence is (1106) a message in a message conversation displayed in a scrollable area of a messaging application, and the first image sequence is displayed as the scrollable area of the messaging application is scrolled (for example, as the message moves across the display).
The device detects (1108) a drag gesture on the touch-sensitive surface (for example, drag gesture 736 of Figures 7B-7D). In certain embodiments, the drag gesture begins in the movable first area on the display. In certain embodiments, the drag gesture ends in the movable first area on the display. In certain embodiments, the drag gesture both begins and ends in the movable first area on the display (for example, because the drag gesture drags the movable first area with it).
In accordance with a determination (1112) that the drag gesture is in a first direction on the touch-sensitive surface (for example, leftward or upward): in the movable first area, the device replaces (1114) display of the first representative image for the first image sequence with a chronological display of at least some of the one or more images for the first image sequence acquired by the camera after acquiring the first representative image for the first image sequence. The device also moves (1116) the first area in the first direction (for example, dragging the first area with the drag gesture). For example, in response to detecting a leftward drag gesture, the device drags first area 734 to the left, as shown in Figures 7B-7D.
In certain embodiments, the movement of the first area in the first direction on the display corresponds to the movement of the contact in the drag gesture in the first direction on the touch-sensitive surface (for example, the movement of the contact appears to directly manipulate the movement of the first area). More generally, in certain embodiments, the movement of a respective area that displays images in a respective image sequence corresponds to the movement of the contact in the drag gesture on the touch-sensitive surface.
In certain embodiments, the chronological display in the first area of at least some of the one or more images for the first image sequence acquired by the camera after acquiring the first representative image occurs in accordance with the movement of the contact in the drag gesture. Thus, if the movement of the contact in the first direction accelerates, the chronological progression of images in the first area speeds up. If the movement of the contact in the first direction slows down, the chronological progression of images in the first area slows down. If the movement of the contact pauses, the chronological progression of images in the first area pauses. And if the movement of the contact reverses direction (for example, from a leftward drag gesture to a rightward drag gesture), the progression of images in the first image sequence in the first area is reversed, and the images are displayed in reverse chronological order in accordance with the movement of the contact in the reverse direction. More generally, in certain embodiments, for a respective image sequence, the display of the progression of images in the respective image sequence occurs in accordance with the movement of the contact in the drag gesture.
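Because the progression through the images tracks the movement of the contact rather than a clock, the frame to display can be derived directly from how far the contact has traveled. The sketch below assumes a fixed amount of horizontal travel per frame; under that assumption, accelerating, pausing, and reversing the drag automatically speed up, pause, and reverse the progression, as described above. The names and the constant are illustrative only.

```swift
import Foundation

/// Illustrative scrubbing model: the frame shown in the movable area is a
/// pure function of the contact's horizontal travel. `pointsPerImage` is an
/// assumed tuning constant, not a value specified by this disclosure.
struct DragScrubber {
    let imageCount: Int
    let startIndex: Int          // e.g., the index of the representative image
    let pointsPerImage: Double   // horizontal travel needed to advance one frame

    /// `translationX` is the contact's total horizontal movement since the
    /// drag began (negative for a leftward drag).
    func imageIndex(forTranslationX translationX: Double) -> Int {
        // A leftward drag (negative translation) advances chronologically,
        // so invert the sign before converting travel into frames.
        let offset = Int((-translationX / pointsPerImage).rounded())
        return min(max(startIndex + offset, 0), imageCount - 1)
    }
}

let scrub = DragScrubber(imageCount: 30, startIndex: 15, pointsPerImage: 8)
scrub.imageIndex(forTranslationX: -40)  // 20: dragging left advances forward
scrub.imageIndex(forTranslationX: 24)   // 12: reversing the drag plays backward
```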
In certain embodiments, moving the first area in the first direction includes (1118) moving at least part of the first area off of the display in the first direction (for example, leftward or upward). In certain embodiments, as a result of moving the first area in the first direction, only a portion of the first area remains displayed on the display. For example, as shown in Figures 7B-7D, part of first area 734 is dragged off of the screen.
In certain embodiments, in accordance with a determination that the drag gesture is in the first direction (for example, leftward or upward) on the touch-sensitive surface: the device moves (1120) a movable second area in the first direction. In certain embodiments, the movable second area is an area that displays images in a second image sequence without displaying images from image sequences other than the second image sequence (for example, movable second area 738 of Figures 7B-7D). In certain embodiments, the movable second area is adjacent to the movable first area (for example, to the right of the movable first area). In certain embodiments, the movable second area is the area for the next image sequence. While moving the second area in the first direction, the device also displays (1122), in chronological order in the second area, at least some of the one or more images for the second image sequence (for example, the next image sequence) acquired by the camera before acquiring the second representative image for the second image sequence.
In certain embodiments, the chronological display in the second area of at least some of the one or more images for the second image sequence acquired by the camera before acquiring the second representative image occurs in accordance with the movement of the contact in the drag gesture (for example, in a manner similar to that described above for the first image sequence). For example, during the drag gesture, the images in the first area and the images in the second area advance simultaneously at the same rate, where the rate is based on the movement of the contact.
In certain embodiments, while moving the second area in the first direction, the second area displays only the second representative image for the second image sequence, without displaying other images in the second image sequence.
In certain embodiments, instead of moving in the first direction, the second area is beneath the first area in a z-layer (front-to-back) order, and the second area is revealed as the first area moves off of the display in the first direction.
In certain embodiments, analogous to detecting the drag gesture in the first direction, detecting activation of a next icon or button (for example, next icon 750-1 of Figure 7A) also results in an animated display of the images from the first sequence in the first area and an animated display of the images from the second sequence in the second area. In certain embodiments, detecting activation of the next icon or button results in display of the second representative image replacing display of the first representative image, without an animated display of the images from the first sequence in the first area and without an animated display of the images from the second sequence in the second area. In certain embodiments, detecting activation of the next icon or button results in display of the second representative image replacing display of the first representative image, without displaying other images in the first sequence or the second sequence. In certain embodiments, the response to different types of input (for example, a leftward drag gesture versus activation of a next icon or button) is user-configurable, for example, via a settings interface.
In certain embodiments, moving the second area in the first direction includes (1124) moving at least part of the second area onto the display in the first direction (for example, leftward or upward). In certain embodiments, as a result of moving the second area in the first direction, only a portion of the second area is displayed on the display, with more of the second area being revealed as the second area is dragged further in the first direction. For example, as shown in Figures 7B-7D, part of second area 738 is dragged onto the screen.
In certain embodiments, in accordance with a determination that the drag gesture is in the first direction (for example, leftward or upward) on the touch-sensitive surface: after moving the second area in the first direction, the device displays (1126) the second representative image for the second image sequence in the second area. For example, Figure 7F illustrates second representative image 724-3 being displayed as the end result of drag gesture 736 (Figures 7B-7D), even when optional intermediate operations are performed, as described below.
In certain embodiments, while the second representative image for the second image sequence is displayed in the second area, the device detects (1128) a change in intensity of an input corresponding to the second representative image. In response to detecting the change in intensity of the input, the device advances (1130) through the second image sequence in the second area without moving the second area on the display (for example, starting with the images that follow the second representative image in chronological order and looping back around to the images that precede the second representative image in chronological order). For example, in certain embodiments, the user can pause the drag gesture, thereby converting the drag gesture into a press-and-hold gesture that triggers playback of the second image sequence, as described in greater detail with reference to method 1000/10000/10050 and Figures 10A-10M.
In certain embodiments, while moving the first area and the second area, the device detects (1132) termination of the drag gesture (for example, lift-off). In response to detecting termination of the drag gesture while moving the first area and the second area (1134): in accordance with a determination that the drag gesture meets next-sequence navigation criteria (for example, more than half of the first area has moved off of the display (as shown in Figure 7E), or more than another predefined portion (for example, 0.2, 0.3, or 0.4) of the first area has moved off of the display, or the drag gesture is a flick gesture with a lift-off velocity above a predefined threshold velocity): the device moves (1136) the first area completely off of the display in the first direction; moves the second area completely onto the display; and displays the second representative image for the second image sequence in the second area. In certain embodiments, as the first area moves off of the display (for example, even after the input has terminated), the chronological display in the first area of at least some of the one or more images for the first image sequence acquired by the camera after acquiring the first representative image continues. In certain embodiments, as the second area moves onto the display until the second representative image is displayed, the chronological display in the second area of at least some of the one or more images for the second image sequence acquired by the camera before acquiring the second representative image continues.
In certain embodiments, in response to detecting termination of the drag gesture while moving the first area and the second area: in accordance with a determination that the drag gesture does not meet the next-sequence navigation criteria: the device moves (1138) the second area completely off of the display in a second direction opposite to the first direction; moves the first area completely onto the display; and displays the first representative image for the first image sequence in the first area. In certain embodiments, as the first area moves completely onto the display until the first representative image is displayed, the chronological display in the first area of at least some of the one or more images for the first image sequence acquired by the camera after acquiring the first representative image is reversed. In certain embodiments, as the second area moves completely off of the display, the chronological display in the second area of at least some of the one or more images for the second image sequence acquired by the camera before acquiring the second representative image is reversed.
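The lift-off decision in operations 1136 and 1138, whether to commit to the next sequence or snap back to the current one, depends on how far the first area has traveled and whether the gesture ended as a flick. One possible reading of the next-sequence navigation criteria is sketched below; the 0.5 portion and the flick-velocity threshold are assumed values chosen to be consistent with the examples above.

```swift
/// Illustrative lift-off decision for operations 1136/1138. The fraction and
/// velocity thresholds are assumptions; the text mentions 0.5 (or 0.2-0.4)
/// of the area and an unspecified flick velocity.
func meetsNextSequenceNavigationCriteria(offscreenWidth: Double,
                                         areaWidth: Double,
                                         liftOffVelocityX: Double,
                                         portionThreshold: Double = 0.5,
                                         flickVelocityThreshold: Double = 300) -> Bool {
    // Criterion 1: enough of the first area has already moved off of the display.
    let movedEnough = offscreenWidth / areaWidth > portionThreshold
    // Criterion 2: the gesture ended as a flick (fast leftward lift-off).
    let isFlick = -liftOffVelocityX > flickVelocityThreshold
    return movedEnough || isFlick
}

meetsNextSequenceNavigationCriteria(offscreenWidth: 220, areaWidth: 375,
                                    liftOffVelocityX: -50)   // true: mostly off-screen
meetsNextSequenceNavigationCriteria(offscreenWidth: 60, areaWidth: 375,
                                    liftOffVelocityX: -800)  // true: flick
meetsNextSequenceNavigationCriteria(offscreenWidth: 60, areaWidth: 375,
                                    liftOffVelocityX: -50)   // false: snap back
```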
In accordance with a determination (1140) that the drag gesture is in a second direction (for example, rightward or downward) on the touch-sensitive surface: in the movable first area, the device replaces (1142) display of the first representative image for the first image sequence with a display, in reverse chronological order, of at least some of the one or more images for the first image sequence acquired by the camera before acquiring the first representative image for the first image sequence. The device also moves (1144) the first area in the second direction (for example, rightward or downward).
In certain embodiments, the movement of the first area in the second direction on the display corresponds to the movement of the contact in the drag gesture in the second direction on the touch-sensitive surface (for example, the movement of the contact appears to directly manipulate the movement of the first area). More generally, in certain embodiments, the movement of a respective area that displays images in a respective image sequence corresponds to the movement of the contact in the drag gesture on the touch-sensitive surface.
In certain embodiments, the display in the first area, in reverse chronological order, of at least some of the one or more images for the first image sequence acquired by the camera before acquiring the first representative image occurs in accordance with the movement of the contact in the drag gesture. Thus, if the movement of the contact in the second direction accelerates, the reverse-chronological progression of images in the first area speeds up. If the movement of the contact in the second direction slows down, the reverse-chronological progression of images in the first area slows down. If the movement of the contact pauses, the reverse-chronological progression of images in the first area pauses. And if the movement of the contact reverses direction (for example, from a rightward drag gesture to a leftward drag gesture), the progression of images in the first image sequence in the first area is reversed, and the images are displayed in chronological order in accordance with the movement of the contact in the reverse direction. More generally, in certain embodiments, for a respective image sequence, the display of the progression of images in the respective image sequence occurs in accordance with the movement of the contact in the drag gesture.
In certain embodiments, moving the first area in the second direction includes (1146) moving at least part of the first area off of the display in the second direction (for example, rightward or downward). For example, in response to detecting rightward drag gesture 744 of Figures 7G-7I, the device moves first area 734 rightward off of the display while displaying the first image sequence 702 in reverse chronological order.
In certain embodiments, in accordance with a determination that the drag gesture is in the second direction (for example, rightward or downward) on the touch-sensitive surface: the device moves (1148) a third area in the second direction. In certain embodiments, the movable third area is an area that displays images in a third image sequence without displaying images from image sequences other than the third image sequence. In certain embodiments, the movable third area is adjacent to the movable first area (for example, to the left of the movable first area). While moving the third area in the second direction, the device also displays (1150), in reverse chronological order in the third area, at least some of the one or more images for the third image sequence acquired by the camera after acquiring the third representative image for the third image sequence.
In certain embodiments, the display in the third area, in reverse chronological order, of at least some of the one or more images for the third image sequence acquired by the camera after acquiring the third representative image occurs in accordance with the movement of the contact in the drag gesture (for example, in a manner similar to that described above for the first image sequence). For example, during the drag gesture, the images in the first area and the images in the third area move backward simultaneously at the same rate, where the rate is based on the movement of the contact.
In certain embodiments, while moving the third area in the second direction, the third area displays only the third representative image for the third image sequence, without displaying other images in the third image sequence.
In certain embodiments, instead of moving in the second direction, the first area is beneath the third area in a z-layer (front-to-back) order, and the first area is covered as the third area moves onto the display in the second direction.
In certain embodiments, analogous to detecting the drag gesture in the second direction, detecting activation of a previous icon (for example, previous icon 750-1 of Figure 7A) or button also results in an animated display of the images from the first sequence in the first area and an animated display of the images from the third sequence in the third area. In certain embodiments, detecting activation of the previous icon or button results in display of the third representative image replacing display of the first representative image, without an animated display of the images from the first sequence in the first area and without an animated display of the images from the third sequence in the third area. In certain embodiments, detecting activation of the previous icon or button results in display of the third representative image replacing display of the first representative image, without displaying other images in the first sequence or the third sequence. In certain embodiments, the response to different types of input (for example, a rightward drag gesture versus activation of a previous icon or button) is user-configurable, for example, via a settings interface.
In certain embodiments, moving the third area in the second direction includes (1152) moving at least part of the third area onto the display in the second direction (for example, rightward or downward). For example, in response to detecting rightward drag gesture 744 of Figures 7G-7I, the device moves third area 746 rightward onto the display while displaying the third image sequence 726 in reverse chronological order.
In certain embodiments, the first image sequence was acquired (1154) by the camera before the second image sequence, and the first image sequence was acquired by the camera after the third image sequence. For example, the image sequences are ordered chronologically from left to right.
In certain embodiments, in accordance with a determination that the drag gesture is in the second direction (for example, rightward or downward) on the touch-sensitive surface: after moving the third area in the second direction, the device displays (1156) the third representative image for the third image sequence in the third area (for example, as shown in Figure 7K).
In certain embodiments, while moving the first area and the third area (for example, as shown in Figure 7J), the device detects (1158) termination of the drag gesture (for example, lift-off). In response to detecting (1160) termination of the drag gesture while moving the first area and the third area: in accordance with a determination that the drag gesture meets (1162) previous-sequence navigation criteria (for example, at least half of the first area has moved off of the display (as shown in Figure 7J), or more than another predefined portion (for example, 0.2, 0.3, or 0.4) of the first area has moved off of the display, or the drag gesture is a flick gesture with a lift-off velocity above a predefined threshold velocity): the device moves the first area completely off of the display in the second direction (for example, rightward or downward); moves the third area completely onto the display; and displays the third representative image for the third image sequence in the third area. In certain embodiments, as the first area moves off of the display, the display in reverse chronological order of at least some of the one or more images for the first image sequence acquired by the camera before acquiring the first representative image continues. In certain embodiments, as the third area moves onto the display until the third representative image is displayed, the display in the third area, in reverse chronological order, of at least some of the one or more images for the third image sequence acquired by the camera after acquiring the third representative image continues.
In certain embodiments, in accordance with a determination that the drag gesture does not meet (1164) the previous-sequence navigation criteria: the device moves the third area completely off of the display in the first direction (for example, leftward or upward); moves the first area completely onto the display; and displays the first representative image for the first image sequence in the first area. In certain embodiments, as the first area moves completely onto the display until the first representative image is displayed, the display in the first area, in reverse chronological order, of at least some of the one or more images for the first image sequence acquired by the camera before acquiring the first representative image is reversed. In certain embodiments, as the third area moves completely off of the display, the display in the third area, in reverse chronological order, of at least some of the one or more images for the third image sequence acquired by the camera after acquiring the third representative image is reversed.
It should be appreciated that the particular order in which the operations in Figures 11A-11E have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. Additionally, it should be noted that the details of the other processes described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 11000, 1200, 2400, 2500, 2600, and 2700) are also applicable, in an analogous manner, to method 1100 described above with reference to Figures 11A-11E. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with reference to method 1100 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with reference to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 11000, 1200, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here.
Figures 11F-11I illustrate a flow chart of a method 11000 of navigating through sequences of related images, in accordance with some embodiments. Method 11000 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. According to some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In certain embodiments, the display is a touch-screen display and the touch-sensitive surface is on, or integrated with, the display. In certain embodiments, the display is separate from the touch-sensitive surface. Some operations in method 11000 are, optionally, combined and/or the order of some operations is, optionally, changed.
The device stores (11002) a plurality of image sequences. A respective image sequence includes a respective representative image taken by a camera and one or more images acquired by the camera before acquiring the respective representative image. In certain embodiments, the camera that took the respective image sequence is part of the electronic device. In certain embodiments, the respective image sequence includes (11004) one or more images acquired by the camera after acquiring the respective representative image.
In certain embodiments, the respective image sequence was taken by a camera that is not part of the electronic device (for example, the respective image sequence was transferred to the electronic device after being taken with a camera on another device). In certain embodiments, the respective image sequence was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figures 9A-9G and method 900 and/or Figures 26A-26D and method 2600. In certain embodiments, the respective representative image corresponds to the representative image acquired by the camera, as described herein with reference to Figures 9A-9G and method 900 and/or Figures 26A-26D and method 2600. In certain embodiments, the representative image has a higher resolution than other images in the respective image sequence. In certain embodiments, the representative image has the same resolution as other images in the respective image sequence.
The device displays (11006), in a movable first area on the display, a first representative image for a first image sequence (for example, while the device is in an image presentation mode). In certain embodiments, the movable first area is an area that displays images in the first image sequence (for example, representative image 702-3 of Figure 7Q) without displaying images from image sequences other than the first image sequence.
In certain embodiments, the first image sequence is (11008) a message in a message conversation displayed in a scrollable area of a messaging application, and the first image sequence is displayed as the scrollable area of the messaging application is scrolled (for example, as described with respect to operation 1106 of method 1100).
The device detects (11010) a gesture on the touch-sensitive surface, the gesture including movement by a contact that corresponds to movement in a first direction on the display (for example, flick/swipe gesture 740 of Figures 7Q-7S and/or drag gesture 764 of Figures 7Y-7AA).
In response to detecting the gesture on the touch-sensitive surface, the device: moves (11012) the first area in the first direction on the display (for example, together with the first representative image); moves a movable second area in the first direction on the display; and, in accordance with a determination that sequence-display criteria are met, while moving the second area in the first direction, displays, in chronological order in the second area, at least some of the one or more images for a second image sequence acquired by the camera before acquiring the second representative image for the second image sequence (for example, displaying image sequence 724 of Figures 7R-7T). In certain embodiments, the movement of the first area in the first direction on the display corresponds to the movement of the contact in the gesture in the first direction on the touch-sensitive surface (for example, the movement of the contact appears to directly manipulate the movement of the first area). In certain embodiments, the movable second area is an area that displays images in the second image sequence without displaying images from image sequences other than the second image sequence. In certain embodiments, the movable second area is adjacent to the movable first area (for example, to the right of the movable first area).
In certain embodiments, in response to detecting the gesture on the touch-sensitive surface: in accordance with a determination that the sequence-display criteria are not met, while moving the second area in the first direction, the device displays (11014) the second representative image for the second image sequence in the movable second area on the display (without displaying other images in the second image sequence in the movable second area). In certain embodiments, in accordance with a determination that the sequence-display criteria are not met, while moving the second area in the first direction, the device displays, in the movable second area on the display, an initial image for the second image sequence (rather than the second representative image), or another image that was acquired before the second representative image for the second image sequence.
In certain embodiments, the sequence-display criteria include (11016) a criterion that the contact lifts off before displaying, in chronological order in the second area, at least some of the one or more images for the second image sequence acquired by the camera before acquiring the second representative image. In certain embodiments, if the contact continues to be detected while the second area is moving in the first direction, only the representative image (or only the initial image) for the second image sequence is displayed while the second area moves in the first direction. For example, the contact is part of a drag gesture that moves slowly across the touch-sensitive surface and slowly drags the second area leftward (or rightward). Conversely, if the contact is no longer detected while the second area is moving in the first direction, an animated sequence of images taken before the second representative image is displayed in the second area as the second area continues to move in the first direction. For example, the contact is part of a flick gesture that moves quickly across the touch-sensitive surface and then lifts off while the second area is still moving leftward (or rightward). As another example, the contact is part of a drag gesture that moves leftward (or rightward) across the touch-sensitive surface (while only the second representative image or the initial image for the second image sequence is displayed in the second area) and then lifts off after at least a predetermined amount (for example, 25%, 30%, 40%, or 50%) of the second area has moved onto the display. After lift-off, the remainder of the second area moves onto the display, and at least some of the images for the second image sequence acquired by the camera before acquiring the second representative image are displayed in the second area.
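In effect, the sequence-display criteria of operation 11016 ask, at the moment the animation would begin, whether the contact is still down: a slow drag with the contact held shows only the representative (or initial) image, while a gesture that has already lifted off plays the accumulated images as the area settles. The sketch below is one speculative reading that combines the lift-off criterion with the predetermined-portion example above; the 25% threshold is only one of the example values mentioned.

```swift
/// Illustrative check of the sequence-display criteria in operation 11016.
/// `onscreenFraction` is how much of the second area is already visible;
/// the 0.25 threshold is one of the example values (25%-50%) from the text.
func shouldAnimateSecondSequence(contactStillDown: Bool,
                                 onscreenFraction: Double,
                                 minimumOnscreenFraction: Double = 0.25) -> Bool {
    if contactStillDown {
        // Contact continues to be detected: show only the representative
        // (or initial) image while the area is dragged.
        return false
    }
    // Contact has lifted off: animate once enough of the area is on-screen.
    return onscreenFraction >= minimumOnscreenFraction
}

shouldAnimateSecondSequence(contactStillDown: true, onscreenFraction: 0.6)   // false: slow drag
shouldAnimateSecondSequence(contactStillDown: false, onscreenFraction: 0.4)  // true: flick / lifted off
```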
In certain embodiments, moving the movable second area in the first direction on the display includes (11018) displaying, in the second area, a respective prior image that was acquired before acquiring the second representative image (for example, the device initially displays the initial image in the image sequence rather than the representative image).
In certain embodiments, the sequence-display criteria include (11020) detecting lift-off of the contact (for example, as described with reference to Figures 7Y-7AA). In response to detecting lift-off of the contact, the device continues to move the movable second area in the first direction and continues to move the movable first area in the first direction. The images from the second image sequence are displayed at a rate such that the second representative image is displayed in the movable second area when the movable second area stops moving in the first direction (for example, the rate of movement of the movable second area is selected to match the rate of advancement through the image sequence, or the rate of advancement through the image sequence is selected to match the rate of movement of the movable second area, or a combination of the two). In certain embodiments, while the movable first area is moving, the device advances through the image sequence that corresponds to the movable first area. In certain embodiments, while the movable first area is moving, the device does not advance through the image sequence that corresponds to the movable first area.
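The rate matching described in operation 11020 can be stated simply: given how many images remain before the representative image and how long the slide will take to settle, play the remaining images over exactly that duration. A minimal sketch under that assumption follows; all names and the constant-duration settle animation are hypothetical.

```swift
import Foundation

/// Illustrative rate matching for operation 11020: choose a frame interval so
/// that the remaining images finish exactly when the movable area stops.
func frameInterval(remainingImages: Int, settleDuration: TimeInterval) -> TimeInterval {
    guard remainingImages > 0 else { return settleDuration }
    // Spread the remaining frames evenly over the remaining slide time, so the
    // representative image is the frame on screen when the area comes to rest.
    return settleDuration / Double(remainingImages)
}

// If 12 images remain and the slide animation settles in 0.3 s,
// show a new frame every 25 ms.
frameInterval(remainingImages: 12, settleDuration: 0.3)  // 0.025
```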
In certain embodiments, while moving the movable first area, the device displays (11022) a simulated parallax effect for the image within the movable first area, such that the image shifts relative to the frame of the movable first area (for example, so that the frame of the movable first area appears to be separated from the image within the movable first area in a simulated z direction).
In certain embodiments, while moving the movable second area while the contact is detected on the touch-sensitive surface (for example, before displaying images from the second image sequence), the device displays (11024) a simulated parallax effect for the image within the movable second area, such that the image shifts relative to the frame of the movable second area (for example, so that the frame of the movable second area appears to be separated from the image within the movable second area in a simulated z direction).
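The simulated parallax effect of operations 11022 and 11024 amounts to translating the image inside its frame by a fraction of the frame's own displacement, so the frame and the image appear separated along a simulated z axis. A sketch under that assumption follows; the single parallax factor is an arbitrary illustrative constant.

```swift
import CoreGraphics

/// Illustrative parallax offset for operations 11022/11024: the image inside
/// a movable area shifts by a fraction of the area's displacement, making the
/// frame and the image appear separated along a simulated z axis.
/// `parallaxFactor` is an assumed constant.
func parallaxOffset(areaTranslation: CGVector, parallaxFactor: CGFloat = 0.1) -> CGVector {
    CGVector(dx: areaTranslation.dx * parallaxFactor,
             dy: areaTranslation.dy * parallaxFactor)
}

// Moving the area 80 pt to the left shifts its image 8 pt relative to the frame.
parallaxOffset(areaTranslation: CGVector(dx: -80, dy: 0))  // (dx: -8.0, dy: 0.0)
```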
In certain embodiments, moving the first area in the first direction includes (11026) moving at least part of the first area off of the display in the first direction (for example, sliding the first area off of the display, Figures 7A-7CC).
In certain embodiments, moving the second area in the first direction includes (11028) moving at least part of the second area onto the display in the first direction (for example, sliding the second area onto the display while sliding the first area off of the display, Figures 7A-7CC).
In certain embodiments, after moving the second area in the first direction, the device displays (11030) the second representative image for the second image sequence in the second area (for example, as shown in Figure 7CC and elsewhere).
In certain embodiments, while the second representative image for the second image sequence is displayed in the second area, the device detects (11032) a change in intensity of an input corresponding to the second representative image. In response to detecting the change in intensity of the input, the device advances through the second image sequence in the second area without moving the second area on the display (for example, the device performs any of the operations shown in Figures 6A-6FF).
In certain embodiments, while moving the first area and the second area, the device detects (11034) termination of the drag gesture. In response to detecting termination of the drag gesture while moving the first area and the second area: in accordance with a determination that the drag gesture meets next-sequence navigation criteria: the device moves the first area completely off of the display in the first direction; moves the second area completely onto the display; and displays the second representative image for the second image sequence in the second area (for example, as described with respect to operation 1136 of method 1100).
In certain embodiments, in response to detecting termination of the drag gesture while moving the first area and the second area: in accordance with a determination that the drag gesture does not meet the next-sequence navigation criteria: the device moves (11036) the second area completely off of the display in a second direction opposite to the first direction; moves the first area completely onto the display; and displays the first representative image for the first image sequence in the first area (for example, as described with respect to operation 1138 of method 1100).
It should be appreciated that the particular order in which the operations in Figures 11F-11I have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. Additionally, it should be noted that the details of the other processes described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 1200, 2400, 2500, 2600, and 2700) are also applicable, in an analogous manner, to method 11000 described above with reference to Figures 11F-11I. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with reference to method 11000 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with reference to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 1200, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here.
Figures 12A-12B illustrate a flow chart of a method 1200 of performing, on sequences of related images, operations that are distinct from those performed on individual images, in accordance with some embodiments. Method 1200 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. According to some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In certain embodiments, the display is a touch-screen display and the touch-sensitive surface is on, or integrated with, the display. In certain embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1200 are, optionally, combined and/or the order of some operations is, optionally, changed.
The device stores (1202) a plurality of image sequences. A respective image sequence includes a respective representative image taken by a camera, one or more images acquired by the camera after acquiring the respective representative image, and one or more images acquired by the camera before acquiring the respective representative image. In certain embodiments, the camera that took the respective image sequence is part of the electronic device. In certain embodiments, the respective image sequence was taken by a camera that is not part of the electronic device (for example, the respective image sequence was transferred to the electronic device after being taken with a camera on another device). In certain embodiments, the respective image sequence was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In certain embodiments, the respective image sequence was obtained in a continuous mode. In certain embodiments, the respective representative image corresponds to the representative image acquired by the camera, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In certain embodiments, the representative image has a higher resolution than other images in the respective image sequence. In certain embodiments, the representative image has the same resolution as other images in the respective image sequence.
The device stores (1204) a plurality of images that are distinct from the images in the plurality of image sequences. A respective image in the plurality of images is not part of an image sequence in the plurality of image sequences.
The device displays (1206) a first image on the display (for example, as shown in Figure 8A). While displaying the first image on the display, the device detects (1208) a first input (for example, press-and-hold input 816 of Figure 8B). In response to detecting (1210) the first input: in accordance with a determination that the first image is an image in a first image sequence (for example, the first image is the representative image for the first image sequence), the device performs (1212) a first operation that includes displaying at least some of the images in the first image sequence other than the first image (for example, sequentially displaying at least some of the images in the first image sequence other than the first image) (for example, as shown in Figures 8C-8F). In accordance with a determination that the first image is an image in the plurality of images that are distinct from the images in the plurality of image sequences, the device performs (1214) a second operation, distinct from the first operation, that involves the first image (for example, as shown in Figures 8K-8L). That is, the device responds differently to the same type of input (for example, inputs that share one or more common path or intensity characteristics) depending on whether the image is part of an enhanced photo or a still image. According to various embodiments, the first operation is any of the operations described herein with respect to image sequences. Specific examples are provided below.
In certain embodiments, the first input is (1216) a press-and-hold gesture, the first operation displays at least part of the first image sequence (for example, as described herein with reference to Figures 10A-10M and method 1000/10000/10050), and the second operation displays information about the first image together with the first image (for example, the time, the date, the location (for example, via GPS), and/or other metadata about the first image is superimposed over part of the first image, as shown in Figures 8K-8L). In certain embodiments, the press-and-hold gesture is a press-and-hold gesture on the first image on a touch-sensitive display, or a press-and-hold gesture on a trackpad while a cursor or other focus selector is over the first image on the display. In certain embodiments, the first input is a click-and-hold input with a mouse while a cursor or other focus selector is over the first image on the display.
In some embodiments, the first input is (1218) a press-and-hold gesture, the first operation displays at least part of the first image sequence (for example, as described herein with reference to Figures 10A-10M and methods 1000/10000/10050), and the second operation displays an animation that shows different parts of the first image. In some embodiments, the press-and-hold gesture is a press-and-hold gesture on the first image on a touch-sensitive display, or a press-and-hold gesture on a trackpad while a cursor or other focus selector is over the first image on the display. In some embodiments, the first input is a click-and-hold input with a mouse while a cursor or other focus selector is over the first image on the display. For example, the second operation is an animation that zooms and/or pans the first image (for example, a Ken Burns effect) and/or an animation that applies a filter to the first image. In some embodiments, the second operation includes zooming out from the image so as to give the impression that the first image is being pushed back into the display.
In some embodiments, the device includes (1220) one or more sensors for detecting the intensity of contacts with the touch-sensitive surface, and the first input includes a finger contact that satisfies a first contact-intensity criterion (for example, a finger gesture on the first image on a touch-sensitive display, or a finger gesture on a trackpad while a cursor or other focus selector is over the first image on the display, where the contact in the finger gesture exceeds a light-press (or deep-press) intensity threshold for at least part of the input). The first operation displays at least part of the first image sequence (for example, as described herein with reference to Figures 10A-10M and methods 1000/10000/10050), and the second operation displays information about the first image together with the first image (for example, the time, the date, the location (for example, via GPS), and/or other metadata about the first image superimposed on part of the first image).
In some embodiments, the device includes (1222) one or more sensors for detecting the intensity of contacts with the touch-sensitive surface, and the first input includes a finger contact that satisfies a first contact-intensity criterion (for example, a finger gesture on the first image on a touch-sensitive display, or a finger gesture on a trackpad while a cursor or other focus selector is over the first image on the display, where the contact in the finger gesture exceeds a deep-press intensity threshold for at least part of the input). The first operation displays at least part of the first image sequence (for example, as described herein with reference to Figures 10A-10M and methods 1000/10000/10050), and the second operation displays an animation that shows different parts of the first image. For example, the second operation is an animation that zooms and/or pans the first image (for example, a Ken Burns effect) and/or an animation that applies a filter to the first image. In some embodiments, the second operation includes zooming out from the image so as to give the impression that the first image is being pushed back into the display.
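The intensity-gated variants above can be sketched as a simple check against a threshold. The numeric thresholds, the normalized intensity scale, and the response names below are illustrative assumptions only.

```swift
// Sketch of an input gated by a contact-intensity criterion.
let lightPressThreshold = 0.4   // assumed normalized intensity values
let deepPressThreshold = 0.8

enum IntensityResponse { case none, playSequencePart, animateStill }

func respond(toContactIntensity intensity: Double,
             isSequenceImage: Bool,
             requiredThreshold: Double = lightPressThreshold) -> IntensityResponse {
    guard intensity >= requiredThreshold else { return .none }
    // First operation for sequence images, second operation for plain stills.
    return isSequenceImage ? .playSequencePart : .animateStill
}

// Example: the deep-press variant of the criterion applied to a still image.
let response = respond(toContactIntensity: 0.85, isSequenceImage: false,
                       requiredThreshold: deepPressThreshold)
```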
In some embodiments, the first input is (1224) a drag gesture, the first operation displays at least part of the images in the first image sequence while transitioning from displaying the first image to displaying a second image (the second image not being an image in the first image sequence) (for example, as described herein with reference to Figures 10A-10M and methods 1000/10000/10050), and the second operation transitions from displaying the first image to displaying a third image (the third image not being an image in the first image sequence).
In some embodiments, when the first image is an image in the first image sequence, the method also includes detecting a navigation input and navigating to a second image, where the second image is an image in the plurality of images that are distinct from the images in the plurality of image sequences. The method also includes detecting a second input that shares one or more characteristics (for example, intensity and/or input path) with the first input. In some embodiments, the first input and the second input need not share a location. The method also includes, in response to detecting the second input, performing the second operation involving the second image.
It should be understood that the particular order in which the operations in Figures 12A-12B have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 2400, 2500, 2600, and 2700) are also applicable in an analogous manner to method 1200 described above with reference to Figures 12A-12B. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with reference to method 1200 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with reference to other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 2400, 2500, 2600, and 2700). For brevity, these details are not repeated here.
In accordance with some embodiments, Figure 13 shows a functional block diagram of an electronic device 1300 configured in accordance with the principles of the various described embodiments. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in Figure 13 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 13, an electronic device 1300 includes: a camera unit 1304 configured to acquire images; a display unit 1302 configured to display a live preview (of, for example, images obtained from the camera unit); one or more optional sensor units 1306 configured to detect activation of a shutter button; and a processing unit 1308 coupled with the display unit 1302, the camera unit 1304, and the one or more optional sensor units 1306. In some embodiments, the processing unit 1308 includes a display enabling unit 1310, a grouping unit 1312, an associating unit 1314, and an animating unit 1316.
The processing unit 1308 is configured to: while in a first media acquisition mode for the camera unit 1304, display (for example, with the display enabling unit 1310) a live preview (of, for example, images obtained from the camera unit 1304) on the display unit 1302, and, while displaying the live preview, detect (for example, with the sensor units 1306) activation of a shutter button at a first time. In response to detecting the activation of the shutter button at the first time, the processing unit 1308 is configured to group (for example, with the grouping unit 1312) a plurality of images acquired by the camera unit 1304 in temporal proximity to the activation of the shutter button at the first time into a first image sequence. The first image sequence includes: a plurality of images acquired by the camera unit 1304 prior to detecting the activation of the shutter button at the first time; a representative image that represents the first image sequence and was acquired by the camera unit 1304 after one or more of the other images in the first image sequence; and a plurality of images acquired by the camera unit 1304 after acquiring the representative image.
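The grouping of frames around the shutter activation can be illustrated with a short sketch. This is an assumed outline only: the buffer of timestamped frames, the time window, and the choice of the frame nearest the activation as the representative image are illustrative stand-ins for whatever the device actually uses.

```swift
import Foundation

// Sketch of grouping frames captured near the shutter activation into one sequence.
struct Frame { let timestamp: TimeInterval }

struct CapturedSequence {
    let before: [Frame]          // frames acquired before the shutter activation
    let representative: Frame    // frame acquired near the activation time
    let after: [Frame]           // frames acquired after the activation
}

func groupFrames(buffer: [Frame], shutterTime: TimeInterval,
                 window: TimeInterval = 1.5) -> CapturedSequence? {
    let nearby = buffer.filter { abs($0.timestamp - shutterTime) <= window }
    guard let representative = nearby.min(by: {
        abs($0.timestamp - shutterTime) < abs($1.timestamp - shutterTime)
    }) else { return nil }
    return CapturedSequence(
        before: nearby.filter { $0.timestamp < representative.timestamp },
        representative: representative,
        after: nearby.filter { $0.timestamp > representative.timestamp })
}
```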
As shown in Figure 14, an electronic device 1400 includes: a display unit 1402 configured to display images; a touch-sensitive surface unit 1404 configured to detect user inputs; one or more optional sensor units 1406 configured to detect the intensity of contacts with the touch-sensitive surface unit 1404; and a processing unit 1408 coupled with the display unit 1402, the touch-sensitive surface unit 1404, and the one or more optional sensor units 1406. In some embodiments, the processing unit 1408 includes a display enabling unit 1410, a detecting unit 1412, and an audio presentation unit 1414.
The processing unit 1408 is configured to display (for example, with the display enabling unit 1410) a representative image on the display unit 1402. The representative image is one image in an image sequence taken by a camera. The image sequence includes one or more images acquired by the camera after acquiring the representative image, and the image sequence includes one or more images acquired by the camera before acquiring the representative image. The processing unit 1408 is also configured to, while displaying the representative image on the display unit 1402, detect (for example, with the detecting unit 1412, which optionally detects inputs on the touch-sensitive surface unit 1404) a first part of a first input. The processing unit 1408 is configured to, in response to detecting the first part of the first input, replace (for example, with the display enabling unit 1410) the display of the representative image on the display unit 1402 with a sequential display of the one or more images acquired by the camera after acquiring the representative image. The processing unit 1408 is configured to detect (for example, with the detecting unit 1412) a second part of the first input after detecting the first part of the first input. The processing unit 1408 is configured to, in response to detecting the second part of the first input, sequentially display on the display unit 1402 the one or more images acquired by the camera before acquiring the representative image, the representative image, and the one or more images acquired by the camera after acquiring the representative image.
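The two-part response just described can be sketched as two playback passes. The following Swift outline is an assumption for illustration; the `show` callback and the use of strings as image identifiers are placeholders, not the device's actual delivery mechanism.

```swift
// Sketch of the two-part playback: part one shows the images after the
// representative image, part two replays the whole sequence from the start.
struct EnhancedPhoto {
    var before: [String]
    var representative: String
    var after: [String]
}

func playFirstPart(of photo: EnhancedPhoto, show: (String) -> Void) {
    // First part of the input: replace the representative image with the
    // images acquired after it, in order.
    photo.after.forEach(show)
}

func playSecondPart(of photo: EnhancedPhoto, show: (String) -> Void) {
    // Second part of the input: images before, representative, then images after.
    (photo.before + [photo.representative] + photo.after).forEach(show)
}
```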
As shown in Figure 15, an electronic device 1500 includes: a display unit 1502 configured to display images; a touch-sensitive surface unit 1504 configured to detect user inputs; one or more optional sensor units 1506 configured to detect the intensity of contacts with the touch-sensitive surface unit 1504; a memory unit 1516 configured to store images; and a processing unit 1508 coupled with the display unit 1502, the touch-sensitive surface unit 1504, the memory unit 1516, and the one or more optional sensor units 1506. In some embodiments, the processing unit 1508 includes a display enabling unit 1510, a detecting unit 1512, and a criteria determining unit 1514.
The processing unit 1508 is configured to store a plurality of image sequences in the memory unit 1516. A respective image sequence includes a respective representative image taken by a camera, one or more images acquired by the camera after acquiring the respective representative image, and one or more images acquired by the camera before acquiring the respective representative image. The processing unit 1508 is also configured to display (for example, with the display enabling unit 1510) a first representative image for a first image sequence in a movable first area on the display unit 1502. The processing unit 1508 is also configured to detect (for example, with the detecting unit 1512) a drag gesture on the touch-sensitive surface unit 1504. In accordance with a determination that the drag gesture is in a first direction on the touch-sensitive surface unit 1504: the processing unit 1508 is configured to replace (for example, with the display enabling unit 1510), in the movable first area on the display unit 1502, the display of the first representative image for the first image sequence with a chronological display of at least some of the one or more images for the first image sequence acquired by the camera after acquiring the first representative image for the first image sequence. The processing unit 1508 is also configured to move (for example, with the display enabling unit 1510) the first area in the first direction on the display unit 1502.
As shown in Figure 16, an electronic device 1600 includes: a display unit 1602 configured to display images; a touch-sensitive surface unit 1604 configured to detect user inputs; one or more optional sensor units 1606 configured to detect the intensity of contacts with the touch-sensitive surface unit 1604; a memory unit 1616 configured to store images; and a processing unit 1608 coupled with the display unit 1602, the touch-sensitive surface unit 1604, the memory unit 1616, and the one or more optional sensor units 1606. In some embodiments, the processing unit 1608 includes a display enabling unit 1610, a detecting unit 1612, and a determining unit 1614.
The processing unit 1608 is configured to store a plurality of image sequences in the memory unit 1616. A respective image sequence includes a respective representative image taken by a camera, one or more images acquired by the camera after acquiring the respective representative image, and one or more images acquired by the camera before acquiring the respective representative image. The processing unit 1608 is also configured to store, in the memory unit 1616, a plurality of images that are distinct from the images in the plurality of image sequences. A respective image in the plurality of images is not part of an image sequence in the plurality of image sequences. The processing unit 1608 is also configured to display (for example, with the display enabling unit 1610) a first image on the display unit 1602. The processing unit 1608 is also configured to detect (for example, with the detecting unit 1612) a first input while the first image is displayed on the display unit 1602. The processing unit 1608 is also configured to, in response to detecting the first input: in accordance with a determination (for example, with the determining unit 1614) that the first image is an image in a first image sequence, perform a first operation that includes displaying (for example, with the display enabling unit 1610) on the display unit 1602 at least some of the images in the first image sequence other than the first image; and, in accordance with a determination (for example, with the determining unit 1614) that the first image is an image in the plurality of images that are distinct from the images in the plurality of image sequences, perform a second operation, distinct from the first operation, that involves the first image.
As shown in Figure 17, an electronic device 1700 includes: a display unit 1702 configured to display images; a touch-sensitive surface unit 1704 configured to detect user inputs; one or more optional sensor units 1706 configured to detect the intensity of contacts with the touch-sensitive surface unit 1704; and a processing unit 1708 coupled with the display unit 1702, the touch-sensitive surface unit 1704, and the one or more optional sensor units 1706. In some embodiments, the processing unit 1708 includes a display enabling unit 1710, a detecting unit 1712, a transitioning unit 1714, and a display unit 1716.
The processing unit 1708 is configured to enable display of (for example, with the display enabling unit 1710) a representative image on the display unit 1702. The representative image is one image in an image sequence taken by a camera. The image sequence includes one or more images acquired by the camera after acquiring the representative image. The image sequence includes one or more images acquired by the camera before acquiring the representative image. The processing unit 1708 is also configured to detect (for example, with the detecting unit 1712) a first part of a first input while enabling display of the representative image on the display unit 1702. The processing unit 1708 is also configured to, in response to detecting the first part of the first input, transition (for example, with the transitioning unit 1714) from displaying the representative image to displaying a respective prior image in the image sequence, where the respective prior image was acquired by the camera before acquiring the representative image. The processing unit 1708 is also configured to, in response to detecting the first part of the first input, after transitioning from displaying the representative image to displaying the respective prior image, enable (for example, with the display enabling unit 1710) sequential display, starting with the respective prior image, of at least some of the one or more images acquired by the camera before acquiring the representative image and at least some of the one or more images acquired by the camera after acquiring the representative image.
As shown in Figure 18, an electronic device 1800 includes: a display unit 1802 configured to display images; a touch-sensitive surface unit 1804 configured to detect user inputs; one or more sensor units 1806 configured to detect the intensity of contacts with the touch-sensitive surface unit 1804; and a processing unit 1808 coupled with the display unit 1802, the touch-sensitive surface unit 1804, and the one or more sensor units 1806. In some embodiments, the processing unit 1808 includes a display enabling unit 1810, a detecting unit 1812, and a moving unit 1814.
The processing unit 1808 is configured to enable display of (for example, with the display enabling unit 1810) a representative image on the display unit 1802. The representative image is one image in an image sequence taken by a camera. The image sequence includes one or more images acquired by the camera after acquiring the representative image. The processing unit 1808 is also configured to, while enabling display of the representative image on the display unit 1802, detect (for example, with the detecting unit 1812) a first input, which includes detecting (for example, with the sensor units 1806) an increase in a characteristic intensity of a contact on the touch-sensitive surface unit 1804 to a first intensity that is greater than a first intensity threshold. The processing unit 1808 is also configured to, in response to detecting the increase in the characteristic intensity of the contact, advance (for example, with the moving unit 1814), in a first direction and at a rate determined based at least in part on the first intensity, through the one or more images acquired by the camera after acquiring the representative image. The processing unit 1808 is also configured to, after advancing through the one or more images acquired by the camera after acquiring the representative image at the rate determined based at least in part on the first intensity, detect (for example, with the detecting unit 1812) a decrease in the intensity of the contact to a second intensity that is less than the first intensity. The processing unit 1808 is also configured to, in response to detecting the decrease in the characteristic intensity of the contact to the second intensity: in accordance with a determination that the second intensity is above the first intensity threshold, continue to advance (for example, with the moving unit 1814), in the first direction and at a second rate, through the one or more images acquired by the camera after acquiring the representative image, where the second rate is determined based at least in part on the second intensity and the second rate is slower than the first rate. The processing unit 1808 is also configured to, in response to detecting the decrease in the characteristic intensity of the contact to the second intensity: in accordance with a determination that the second intensity is below the first intensity threshold, move (for example, with the moving unit 1814), in a second direction that is opposite the first direction and at a rate determined based at least in part on the second intensity, through the one or more images acquired by the camera after acquiring the representative image.
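One way to read this behavior is as a mapping from contact intensity to a signed advance rate: intensities above the threshold advance forward (faster for harder presses), and intensities below it move backward. The sketch below is an assumed illustration; the threshold value, the linear mapping, and the frames-per-second scale are not taken from the disclosure.

```swift
// Sketch of mapping contact intensity to an advance/retreat rate through the sequence.
let intensityThreshold = 0.5   // assumed normalized intensity threshold

/// Returns an assumed rate in images per second; positive advances forward,
/// negative moves backward through the images.
func advanceRate(forIntensity intensity: Double, maxRate: Double = 30) -> Double {
    let delta = intensity - intensityThreshold
    return (delta / intensityThreshold) * maxRate
}

let rates = [0.9, 0.6, 0.3].map { advanceRate(forIntensity: $0) }
// rates[0] and rates[1] are positive (advance forward, faster for the deeper press);
// rates[2] is negative (move backward, since the intensity fell below the threshold).
```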
As shown in Figure 19, an electronic device 1900 includes: a display unit 1902 configured to display images; a touch-sensitive surface unit 1904 configured to detect user inputs; one or more optional sensor units 1906 configured to detect the intensity of contacts with the touch-sensitive surface unit 1904; and a processing unit 1908 coupled with the display unit 1902, the touch-sensitive surface unit 1904, and the one or more optional sensor units 1906. In some embodiments, the processing unit 1908 includes a display enabling unit 1910, a detecting unit 1912, a storing unit 1914, a moving unit 1916, and an advancing unit 1918.
The processing unit 1908 is configured to store (for example, with the storing unit 1914) a plurality of image sequences. A respective image sequence includes a respective representative image taken by a camera and one or more images acquired by the camera before acquiring the respective representative image. The processing unit 1908 is also configured to enable display of (for example, with the display enabling unit 1910) a first representative image for a first image sequence in a movable first area on the display unit 1902. The processing unit 1908 is also configured to detect (for example, with the detecting unit 1912) a gesture on the touch-sensitive surface unit 1904, the gesture including movement by a contact that corresponds to movement in a first direction on the display unit 1902. The processing unit 1908 is also configured to, in response to detecting the gesture on the touch-sensitive surface unit 1904: move (for example, with the moving unit 1916) the first area in the first direction on the display unit 1902; move (for example, with the moving unit 1916) a movable second area in the first direction on the display unit 1902; and, in accordance with a determination that sequence-display criteria are met, while moving the second area in the first direction, enable display (for example, with the display enabling unit 1910), in chronological order in the second area, of at least some of the one or more images for a second image sequence acquired by the camera before acquiring a second representative image for the second image sequence.
Figures 20A-20L illustrate exemplary user interfaces for modifying images in an image sequence, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments the device detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in Figure 4B.
Figure 20A illustrates an image sequence 2002 showing a train approaching a platform. The image sequence includes a representative image 2002-3, images 2002-1 and 2002-2 acquired by the camera before acquiring the representative image 2002-3, and images 2002-4 and 2002-5 acquired by the camera after acquiring the representative image 2002-3. The chronological order of the image sequence 2002 is: image 2002-1, image 2002-2, representative image 2002-3, image 2002-4, and image 2002-5.
Figure 20B illustrates the device 100 displaying the representative image 2002-3 on the display while the device 100 is in a photo editing user interface. The photo editing user interface includes affordances 2004 for editing the representative image 2002-3 (for example, a crop affordance 2004-1, a filter affordance 2004-2, and a lighting affordance 2004-3). In this example, it is assumed that the user has selected the lighting affordance 2004-3 and modified the contrast of the representative image 2002-3 (the representative image 2002-3 as shown in Figure 20B has had its contrast increased relative to the representative image 2002-3 as shown in Figure 20A).

The photo editing user interface also includes an affordance 2006 (for example, a toggle switch) for switching between a first editing mode (for example, an apply-to-all editing mode) and a second editing mode (for example, a single-image editing mode). In Figure 20B, the toggle switch 2006 is set to the second editing mode, so that when the device 100 detects a user input 2008 (for example, an input for modifying the representative image), the device 100 modifies the representative image 2002-3 without modifying the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, images 2002-4 and 2002-5) and without modifying the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, images 2002-1 and 2002-2).
In contrast, in Figure 20I, the toggle switch 2006 is set to the first editing mode (for example, the apply-to-all mode), so that when the device 100 detects the user input 2008 applying the user's modification, the device 100 modifies the representative image 2002-3, the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, images 2002-4 and 2002-5), and the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, images 2002-1 and 2002-2).
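The two editing modes can be summarized as an edit-scope switch. The Swift sketch below is illustrative only; the type names and the contrast field standing in for whatever adjustment the user applies are assumptions.

```swift
// Sketch of the edit-scope toggle: apply an adjustment either to every image
// in the sequence or only to the representative image.
enum EditScope { case allImages, representativeOnly }

struct EditableImage { var contrast: Double }

struct EditableSequence {
    var before: [EditableImage]
    var representative: EditableImage
    var after: [EditableImage]
}

func apply(_ adjustment: (inout EditableImage) -> Void,
           to sequence: inout EditableSequence, scope: EditScope) {
    adjustment(&sequence.representative)
    if scope == .allImages {
        for i in sequence.before.indices { adjustment(&sequence.before[i]) }
        for i in sequence.after.indices { adjustment(&sequence.after[i]) }
    }
}

// Example: increase contrast, applied only to the representative image
// (the second editing mode of Figure 20B).
var photo = EditableSequence(before: [EditableImage(contrast: 1.0)],
                             representative: EditableImage(contrast: 1.0),
                             after: [EditableImage(contrast: 1.0)])
apply({ $0.contrast += 0.2 }, to: &photo, scope: .representativeOnly)
```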
Returning to the example in which only the representative image 2002-3 is modified (for example, following Figure 20B), Figures 20C-20H illustrate various embodiments for playing back an image sequence in which only the representative image has been modified.

Specifically, as shown in Figures 20C-20D, in some embodiments, after modifying only the representative image 2002-3, while displaying the representative image 2002-3, the device 100 receives a user input 2010 that is a request to play back the image sequence 2002. As shown in Figure 20C, in response to a first part of the user input 2010 for playing back the image sequence 2002, the device 100 replaces the display of the representative image 2002-3 with at least some of the images acquired after the representative image 2002-3 (for example, image 2002-4 and image 2002-5). As shown in Figure 20D, in response to detecting a second part of the input 2010 for playing back the image sequence 2002, the device 100 sequentially displays at least some of the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, image 2002-1 and image 2002-2), the modified representative image 2002-3, and at least some of the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, image 2002-4 and image 2002-5). That is, in some embodiments, the representative image 2002-3 is included in the playback in its modified form.
Figures 20E-20F illustrate another example in which, after modifying only the representative image 2002-3, while displaying the modified representative image 2002-3, the device 100 receives a user input 2012 that is a request to play back the image sequence 2002. As shown in Figure 20E, in response to a first part of the user input 2012 for playing back the image sequence 2002, the device 100 replaces the display of the representative image 2002-3 with at least some of the images acquired after the representative image 2002-3 (for example, image 2002-4 and image 2002-5). As shown in Figure 20F, in response to detecting a second part of the input 2012 for playing back the image sequence 2002, the device 100 sequentially displays at least some of the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, image 2002-1 and image 2002-2), the unmodified representative image 2002-3, and at least some of the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, image 2002-4 and image 2002-5). That is, in some embodiments, the representative image 2002-3 is reverted to its unmodified form for the purpose of playback.
Figures 20G-20H illustrate another example in which, after modifying only the representative image 2002-3, while displaying the modified representative image 2002-3, the device 100 receives a user input 2014 that is a request to play back the image sequence 2002. As shown in Figure 20G, in response to a first part of the user input 2014 for playing back the image sequence 2002, the device 100 replaces the display of the representative image 2002-3 with at least some of the images acquired after the representative image 2002-3 (for example, image 2002-4 and image 2002-5). As shown in Figure 20H, in response to detecting a second part of the input 2014 for playing back the image sequence 2002, the device 100 sequentially displays at least some of the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, image 2002-1 and image 2002-2) and at least some of the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, image 2002-4 and image 2002-5). That is, in some embodiments, once the representative image 2002-3 has been modified, it is omitted entirely from the playback.
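The three playback behaviors described in Figures 20C-20H differ only in how the modified representative image is treated. The following Swift sketch is an assumed illustration; the policy names and the use of strings for images are not from the disclosure.

```swift
// Sketch of the three playback behaviors when only the representative image was modified.
enum ModifiedRepresentativePolicy { case includeModified, revertToOriginal, omit }

func playbackFrames(before: [String],
                    originalRepresentative: String,
                    modifiedRepresentative: String,
                    after: [String],
                    policy: ModifiedRepresentativePolicy) -> [String] {
    switch policy {
    case .includeModified:
        return before + [modifiedRepresentative] + after   // behavior of Figures 20C-20D
    case .revertToOriginal:
        return before + [originalRepresentative] + after   // behavior of Figures 20E-20F
    case .omit:
        return before + after                              // behavior of Figures 20G-20H
    }
}
```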
As noted above, in Figure 20I, the toggle switch 2006 is set to the first editing mode (for example, the apply-to-all mode), so that when the device 100 detects the user input 2008 applying the user's modification, the device 100 modifies the representative image 2002-3, the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, images 2002-4 and 2002-5), and the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, images 2002-1 and 2002-2).
Figures 20J-20K illustrate playback of an image sequence in which all of the images in the image sequence have been modified, in accordance with some embodiments. While displaying the modified representative image 2002-3, after the representative image 2002-3, the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, images 2002-4 and 2002-5), and the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, images 2002-1 and 2002-2) have been modified, the device 100 receives a user input 2016 that is a request to play back the image sequence 2002. As shown in Figure 20J, in response to a first part of the user input 2016 for playing back the image sequence 2002, the device 100 replaces the display of the representative image 2002-3 with at least some of the modified images acquired after the representative image 2002-3 (for example, image 2002-4 and image 2002-5). As shown in Figure 20K, in response to detecting a second part of the input 2016 for playing back the image sequence 2002, the device 100 sequentially displays at least some of the one or more images acquired by the camera before acquiring the representative image 2002-3 (for example, image 2002-1 and image 2002-2) and at least some of the one or more images acquired by the camera after acquiring the representative image 2002-3 (for example, image 2002-4 and image 2002-5). That is, in some embodiments, when all of the images in the image sequence have been modified, the device 100 plays back the image sequence with all of the images in their modified form.
In some embodiments, rather than replacing the display of the representative image 2002-3 with at least some of the images acquired after the representative image 2002-3 (for example, image 2002-4 and image 2002-5) in any of the examples described above, the device 100 replaces the display of the representative image 2002-3 with a display of a respective image acquired before the representative image 2002-3 (for example, omitting the playback shown in Figures 20C, 20E, 20G, and 20J). More generally, when only the representative image in an image sequence has been modified, any of the embodiments for playing back an image sequence described elsewhere in this document (Figures 6A-6FF, Figures 7A-7CC, Figures 8A-8L, method 600, method 700, and/or method 800) can optionally be performed with the modified image omitted from playback, reverted to its unmodified form, or included in its modified form. Similarly, when all of the images in an image sequence have been modified, any of the embodiments for playing back an image sequence described elsewhere in this document (Figures 6A-6FF, Figures 7A-7CC, Figures 8A-8L, method 600, method 700, and/or method 800) can optionally be performed with the modified image sequence.
In some cases, modifying the representative image without modifying the additional images causes a discontinuity when the enhanced photo is played back. For example, as shown in Figure 20L, in some embodiments, while the toggle switch 2006 is set to "off," the device 100 detects a user input 2022 for cropping (or, for example, rotating) the representative image 2002-3. However, once the representative image 2002-3 has been cropped or rotated, playing back the enhanced photo as described above with reference to Figures 20C-20H would produce a "jump" when the representative image 2002-3 is displayed. Accordingly, in some embodiments, when certain modifications (for example, cropping and/or rotating) are made to the representative image 2002-3 without modifying the one or more images acquired by the camera after acquiring the representative image 2002-3 and without modifying the one or more images acquired by the camera before acquiring the representative image 2002-3, the device 100 automatically turns off playback of the additional images, deletes the additional images, or saves the modified representative image to a new file as a still image. In some embodiments, the device 100 provides the user with a warning 2018. In some embodiments, the device 100 provides the user with options 2020. For example, the device 100 provides the user with: an option 2020-1 for saving the edited image as a new still image; an option 2020-2 for deleting the additional images in the image sequence (for example, the enhanced photo); an option 2020-3 for entering the apply-to-all editing mode; and an option 2020-4 for canceling.
Figures 21A-21J illustrate exemplary user interfaces for sending images from an image sequence to a second electronic device, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments the device 100 detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in Figure 4B.
Figures 21A-21J illustrate two example scenarios in which, while displaying a representative image from an image sequence on the display, the device 100 detects an input that corresponds to a request to send the representative image from the image sequence, or a request to select the representative image from the image sequence for sending. When the second electronic device is configured to interact with the image sequence as a group (for example, the second electronic device is configured to perform the interactions described in Figures 6A-6FF, Figures 7A-7CC, and/or Figures 8A-8L), the device 100 displays a first set of options for sending at least part of the image sequence to the second electronic device (for example, as shown in Figure 21C). In contrast, when the second electronic device is not configured to interact with the image sequence as a group, the device 100 displays a second set of options for sending at least part of the image sequence to the second electronic device (for example, as shown in Figure 21D).
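The choice between the two option sets can be sketched as a function of the recipient's capability. The option labels below paraphrase the figures, and the capability flag and enum names are assumptions for illustration.

```swift
// Sketch of choosing which sharing options to present based on whether the
// receiving device can treat an image sequence as a group.
enum ShareOption {
    case sendWholeSequence       // send the enhanced photo
    case sendRepresentativeOnly  // send only the representative image as a still
    case convertToAnimatedImage  // e.g. GIF format
    case convertToVideo          // e.g. MPEG format
    case cancel
}

func shareOptions(recipientHandlesSequences: Bool) -> [ShareOption] {
    if recipientHandlesSequences {
        // First set of options (2118 in Figure 21C).
        return [.sendWholeSequence, .sendRepresentativeOnly,
                .convertToAnimatedImage, .convertToVideo, .cancel]
    } else {
        // Second set of options (2122 in Figure 21D), without the whole-sequence option.
        return [.sendRepresentativeOnly,
                .convertToAnimatedImage, .convertToVideo, .cancel]
    }
}
```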
Figures 21A-21D illustrate a scenario in which the options for sending the representative image from the image sequence are provided in response to a user request to select a representative image for sending. Figure 21A illustrates a conversation on the device 100 (for example, a conversation in a messaging application/messaging user interface 2102). The conversation is with a user (Stephanie) of the second electronic device. In some embodiments, when the user of the device 100 requests to select a representative image for sending, the destination of the representative image (for example, the second electronic device) is known to the device 100, because the process of requesting a representative image for sending originates from the conversation with the second device.

To that end, as shown in Figure 21A, the device 100 detects a user input 2104 (for example, a tap gesture) that selects an affordance 2106 for adding media to the conversation (for example, adding a still photo, an enhanced photo, a video, or any other type of media).
Figure 21B illustrates that, in response to the user input 2104 (Figure 21A), the device provides a user interface 2108 that offers the user options for photos to select for sending to the second electronic device. The user interface 2108 includes a region with selectable images 2110 (for example, image 2110-a through image 2110-c). In some embodiments, the selectable images 2110 are representations of the most recent photos (for example, the three or five most recent photos). The user interface 2108 includes an option 2112 for selecting a photo from the user's photo library, an option 2114 for taking a photo or video (for example, with a camera integrated into the device 100), and an option 2116 for canceling the addition of media.
In this example, the device 100 detects a user input 2116 that is a request to select the image 2110-b for sending to the user of the second device. For illustrative purposes, in this example the image 2110-b is a representative image from an image sequence (for example, a representative image of an enhanced photo), where the image sequence includes images acquired before the representative image 2110-b and/or images acquired after the representative image 2110-b.
As shown in Figure 21C, because Stephanie's device (the second electronic device) is configured to interact with image sequences as a group, the device 100 displays a first set of options 2118 for sending at least part of the image sequence to the second electronic device. In some embodiments, the first set of options 2118 includes: an option 2118-a for sending the whole image sequence (for example, sending the enhanced photo); an option 2118-b for sending the representative image without sending the images acquired before the representative image 2110-b and without sending the images acquired after the representative image 2110-b (for example, sending only the representative image 2110-b as a still image); an option 2118-c for converting at least part of the image sequence to an animated image format (for example, a GIF format); an option 2118-d for converting at least part of the image sequence to a video format (for example, an MPEG format); and an option 2118-e for canceling. The device 100 also displays an indication 2120 that the image 2110-b has been selected for sending to the second electronic device.
In contrast, Figure 21D illustrates a second set of options 2122 for sending at least part of the image sequence to the second electronic device. The second set of options 2122 is displayed because, in this example, the second electronic device (for example, Robert's device) is not configured to interact with image sequences as a group. The process of reaching the second set of options 2122 is analogous to the process of reaching the first set of options 2118 described with reference to Figures 21A-21C. That is, in some embodiments, the second set of options 2122 is displayed after the user of the device 100, while in a conversation with Robert, selects the affordance for adding media to the conversation (for example, the affordance 2106 of Figure 21A) and then selects a representative image from an image sequence for sending to Robert's device (for example, selects the image 2110-b with a user input similar to the input 2116).
The second set of options 2122 includes: an option 2122-a for sending the representative image without sending the images acquired before the representative image 2110-b and without sending the images acquired after the representative image 2110-b (for example, sending only the representative image 2110-b as a still image); an option 2122-b for converting at least part of the image sequence to an animated image format (for example, a GIF format); an option 2122-c for converting at least part of the image sequence to a video format (for example, an MPEG format); and an option 2122-d for canceling. The device 100 also displays the indication 2120 that the image 2110-b has been selected for sending to the second electronic device. In some embodiments, the second set of options 2122 does not include an option for sending the whole image sequence (for example, sending the enhanced photo), because Robert's device is not configured to interact with image sequences as a group.
Figures 21E-21J illustrate a scenario in which the options for sending the representative image from the image sequence are provided in response to a user request to send the representative image. Figure 21E illustrates a camera roll user interface 2124 on the device 100. The camera roll user interface 2124 displays an image 2126 and other images 2129 (for example, image 2129-1 through image 2129-8), which are optionally representations of photos, enhanced photos, or movies. In this example, it is assumed that the user of the device 100 has navigated to the camera roll user interface 2124 without being in a conversation (for example, by navigating to the user interface 2124 from a home screen). Thus, in the example shown in Figures 21E-21J, when the user selects the image 2126 (for example, via the user input 2128 of Figure 21E) or requests to share the image 2126 (for example, via the user input 2132 of Figure 21F), the destination of the representative image (for example, the second electronic device) is unknown to the device 100. Therefore, the device 100 does not yet display the differing first set of options or second set of options, which depend on whether the receiving second electronic device is configured to interact with image sequences as a group. Rather, as described below, once the destination is known, the first set of options or the second set of options is displayed when the user requests to send the image.
To that end, Figure 21E illustrates a user input 2128 selecting the image 2126 in the camera roll user interface 2124. In this example, it is assumed that the image 2126 is a representative image from an image sequence.

As shown in Figure 21F, in response to the user input 2128, the device 100 displays the image 2126 in an image viewing user interface 2130. Figure 21F also illustrates a user input 2132 requesting to share the image 2126 (for example, by selecting a share affordance 2134).
As shown in Figure 21G, in response to the user input 2132, the device 100 displays a sharing user interface 2138. The sharing user interface 2138 is displayed in response to the request to share the image 2126, and the image 2126 is pre-selected in a region 2140 of the sharing user interface 2138 that shows several images acquired in temporal proximity to the image 2126 (for example, three images, five images, etc.). The sharing user interface also includes protocol-based sharing options for selecting the protocol by which to share the image 2126, including a messaging-protocol sharing option 2142-a, an email-protocol sharing option 2142-b, and a social-media-protocol sharing option 2142-c. In Figure 21G, the user (via a user input 2143) selects the messaging-protocol sharing option 2142-a, which brings up the conversation user interface 2144 shown in Figure 21H.
In this example, when the conversation user interface 2144 (Figure 21H) is initially displayed, the destination field 2146 is empty because the user of the device 100 has not yet specified a destination (for example, the device 100 brings up a message with an empty destination field 2146 and the image 2126 automatically inserted into the body 2148 of the message). Thus, in this example, it is assumed that the user of the device 100 has manually entered "Stephanie Levin" as the destination in the conversation user interface 2144 in Figure 21H. The user has also typed a short message 2150, "Check out this train."

As also shown in Figure 21H, the user (via a user input 2152) selects a send button 2154, thereby requesting to send the image 2126 and the rest of the message.
As shown in Figure 21I, in response to the user input 2152, because Stephanie Levin's device (the second electronic device) is configured to interact with image sequences as a group, the device 100 displays the first set of options 2118 (described above with reference to Figure 21C) for sending at least part of the image sequence to the second electronic device.

In contrast, Figure 21J illustrates the second set of options 2122 for sending at least part of the image sequence to the second electronic device. The second set of options 2122 is displayed because, in this example, the second electronic device (for example, Robert Yu's device) is not configured to interact with image sequences as a group. The process of reaching the second set of options 2122 is analogous to the process of reaching the first set of options 2118 described with reference to Figures 21E-21H. That is, in some embodiments, the second set of options 2122 is displayed when the user of the device 100 enters Robert Yu as the destination 2146 in Figure 21H in place of Stephanie Levin and then presses send.
Figures 22A-22D illustrate exemplary user interfaces for acquiring photos (for example, enhanced photos or pictures) using scene recognition, in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments the device 100 detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in Figure 4B.
Some scenes are better suited than others to being captured as an image sequence (for example, an enhanced photo). For example, people frequently use the same portable multifunction device both to capture important moments (for example, a picture of their children smiling on the beach) and to capture more mundane images, such as a picture of a receipt taken for filing purposes. In accordance with some embodiments, Figures 22A-22D illustrate user interfaces for a device that automatically determines, via scene recognition, whether to capture an image sequence (for example, in the case of the smiling children) or a still image (for example, in the case of the receipt). For example, when the scene meets action capture criteria (for example, criteria relating to activity in the scene), the device retains an image sequence in response to activation of the shutter button, including images acquired before the activation of the shutter button and images acquired after the activation of the shutter button. In contrast, when the scene does not meet the action capture criteria, the device retains a single image in response to the activation of the shutter button (for example, a single image analogous to the single image captured in a conventional camera).
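The decision can be sketched as a check of a few scene features against action capture criteria. The feature set, the thresholds, and the rule that detected faces favor a sequence are illustrative assumptions, not the device's actual criteria.

```swift
// Sketch of a scene-recognition decision between capturing an image sequence
// and capturing a single still image.
struct SceneFeatures {
    var motionAmount: Double   // amount of motion detected in the live preview
    var textCoverage: Double   // fraction of the frame recognized as text
    var facesDetected: Int
}

func meetsActionCaptureCriteria(_ scene: SceneFeatures,
                                motionThreshold: Double = 0.2,
                                textThreshold: Double = 0.5) -> Bool {
    // Mostly-text scenes (e.g. a receipt) do not satisfy the criteria.
    if scene.textCoverage > textThreshold { return false }
    // Sufficient motion, or detected faces, favor capturing a sequence.
    return scene.motionAmount > motionThreshold || scene.facesDetected > 0
}

// A mostly-text receipt yields a single still; a moving train yields a sequence.
let receipt = SceneFeatures(motionAmount: 0.0, textCoverage: 0.9, facesDetected: 0)
let train = SceneFeatures(motionAmount: 0.6, textCoverage: 0.0, facesDetected: 0)
let captureSequenceForReceipt = meetsActionCaptureCriteria(receipt)   // false
let captureSequenceForTrain = meetsActionCaptureCriteria(train)       // true
```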
In Figure 22A, the device 100 is in a media acquisition mode (for example, a photo acquisition mode or an automatic still/enhanced photo acquisition mode). While the device 100 is in the media acquisition mode, the device 100 displays an image capture user interface 2202 that includes a live preview 2210 of a scene detected by a camera (for example, a camera integrated into the device 100). The image capture user interface 2202 also includes an affordance 2204 for navigating to the camera roll (for example, the affordance 2204 displays a thumbnail representation of the last photo/video acquired by the camera), a virtual shutter button 2206, and an affordance 2208 for applying a filter (for example, a sepia filter) to the live preview of the scene.
While the device 100 is in the media acquisition mode, the device 100 performs scene recognition on the scene. For example, in some embodiments, scene recognition includes detecting text, detecting movement, detecting movement of a person's face, and/or detecting movement of the device 100 (for example, when the user is panning to follow a target). In Figure 22A, the device 100 uses scene recognition to recognize that the scene is mostly text (for example, the scene is a receipt). In some embodiments, the device 100 recognizes that the scene is mostly text by recognizing that the scene includes more than a threshold amount of text. In some embodiments, when the device 100 recognizes that the scene is mostly text, the action capture criteria are not met. For example, because the user is unlikely to want to capture the moments surrounding a receipt on a table, in response to activation of the shutter button 2206 the device 100 retains a single image 2214 (shown in an image review mode in Figure 22B).
In contrast, Figure 22C depicts, in the media acquisition mode, a scene shown in the live preview 2210 of a train approaching a platform. Specifically, Figure 22C depicts the live preview 2210 of the scene at five different times (in chronological order: time 2210-1, time 2210-2, time 2210-3, time 2210-4, and time 2210-5).

While the live preview 2210 is displayed on the display, the device 100 performs scene detection. In this example, the action capture criteria are met when the device detects more than a threshold amount of movement. Thus, because the train is moving in the live preview 2210, in response to activation of the shutter button 2206 at time 2210-3, the device 100 retains an image sequence 2218 (for example, an enhanced photo), as shown in Figure 22D.
As shown in Figure 22D, the image sequence 2218 includes: a plurality of images acquired before the activation of the shutter button 2206 (for example, images 2218-1 and 2218-2); a representative image 2218-3, acquired in some embodiments in temporal proximity to the activation of the shutter button 2206 (for example, the image 2218-3 is analogous to the single image captured in a conventional camera in response to activation of the shutter); and a plurality of images acquired by the camera after acquiring the representative image 2218-3 (for example, images 2218-4 and 2218-5). That is, because the moving train exceeds the threshold amount of movement, the device 100 captures an enhanced photo. The enhanced photo can afterwards be played back in accordance with the embodiments described, for example, with reference to Figures 6A-6FF, Figures 7A-7CC, and/or Figures 8A-8L.
Figures 23A-23E illustrate exemplary user interfaces for trimming an image sequence (for example, an enhanced photo), in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 9A-9G, 10A-10M, 11A-11I, 12A-12B, 24A-24E, 25A-25C, 26A-26D, and 27A-27E. Although the following examples will be given with reference to inputs on a touch-screen display (where the touch-sensitive surface and the display are combined), in some embodiments the device 100 detects inputs on a touch-sensitive surface 451 that is separate from the display 450, as shown in Figure 4B.
Figure 23A illustrates the device 100 displaying a representative image 2302 on the display while the device 100 is in a photo editing user interface. The representative image 2302 represents an image sequence (for example, represents an enhanced photo). In some embodiments, the device 100 displays a currently selected image from the image sequence, which is not necessarily the representative image. The photo editing user interface includes affordances 2004 for editing the representative image 2302 (for example, a crop affordance 2004-1, a filter affordance 2004-2, and a lighting affordance 2004-3). The photo editing user interface also includes a selectable icon 2304. In some embodiments, when the image displayed in the photo editing user interface is a representative image from an image sequence, the selectable icon 2304 is animated, displayed in color, and/or filled. In some embodiments, when the image displayed in the photo editing user interface is a still image, the selectable icon 2304 is displayed in black and white, is not animated, and/or is not filled. Thus, in some embodiments, the selectable icon 2304 indicates to the user whether he or she is editing an enhanced photo. In some embodiments, the selectable icon 2304 is selectable only when a representative image from an image sequence is displayed in the photo editing user interface.

The photo editing user interface also includes a "done" affordance 2301, which applies the user's modifications to the photo.

In Figure 23A, the device 100 receives a user input 2306 that selects the selectable icon 2304.
In Figure 23B, in response to the user input 2306, the device 100 displays an affordance bar 2308. The affordance bar 2308 includes: an affordance 2310-1 for turning on animated playback of the image sequence; an affordance 2310-2 for turning off animated playback of the image sequence while retaining the image sequence; an affordance 2310-3 for trimming the image sequence; and an affordance 2310-4 for deleting the other images in the image sequence besides the representative image 2302. In some embodiments, only one of the affordance 2310-1 and the affordance 2310-2 is selectable at any given time, depending on whether animated playback is currently turned on or turned off (for example, when playback is currently on, the affordance 2310-2 is "grayed out"). The photo editing user interface also includes the affordance 2006 (for example, a toggle switch) for switching between the first editing mode (for example, the apply-to-all editing mode) and the second editing mode (for example, the single-image editing mode), as described with reference to Figures 20A-20L.

In Figure 23B, the device 100 receives a user input 2312 that selects the affordance 2310-3 for trimming the image sequence.
In Figure 23C, in response to the user input 2312, the device 100 displays a user interface 2314 for trimming the image sequence to a subset of the image sequence (for example, trimming it to a subset containing fewer than all of the images in the image sequence). The user interface 2314 includes a region 2316 (for example, a strip) containing representations 2318 of the images in the image sequence (for visual clarity, only one representation 2318 of an image is labeled in the figure). In some embodiments, the representations 2318 of the images are thumbnails of the images in the image sequence. In some embodiments, the representations of the images are arranged in chronological order, so that representations 2318 toward the left of the region 2316 represent images acquired earlier than those represented by representations 2318 toward the right of the region 2316.

The user interface 2314 includes a second area 2322 displayed concurrently with the region 2316. The representative image, or the currently selected image, is displayed in the second area 2322.
Region 2316 includes a begin handle 2320-a that delimits the beginning image in the subset of the image sequence. Region 2316 includes an end handle 2320-b that delimits the ending image in the subset of the image sequence. Begin handle 2320-a and end handle 2320-b are positioned in region 2316 at locations automatically selected by the device (for example, using scene detection). For example, device 100 uses scene detection (for example, by determining when a face is oriented toward the camera, or determining when the images are least blurry) to determine the period of time during which the best action occurs. Device 100 sets begin handle 2320-a at the position in region 2316 that represents the beginning of the period of time during which the best action occurs, and sets end handle 2320-b at the position in region 2316 that represents the end of that period of time.
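A minimal sketch, in Swift, of how per-frame scene-detection scores might be reduced to automatically chosen begin and end handle positions. The FrameScore type, the scoring, and the windowing strategy are assumptions for illustration only; the description above specifies only the cues (faces toward the camera, image sharpness), not a particular algorithm.

```swift
// Hypothetical per-frame analysis result. Scores are assumed to be centered
// around zero (frames below an activity threshold get negative scores), so the
// maximum-sum window isolates the period with the "best action".
struct FrameScore {
    let index: Int
    let score: Double
}

// Returns the frame indices for the begin and end handles: the contiguous
// window of at least `minLength` frames with the largest total score.
func autoTrimRange(scores: [FrameScore], minLength: Int = 3) -> (begin: Int, end: Int)? {
    guard scores.count >= minLength else { return nil }
    var best: (begin: Int, end: Int, total: Double)? = nil
    for begin in 0...(scores.count - minLength) {
        var total = 0.0
        for end in begin..<scores.count {
            total += scores[end].score
            guard end - begin + 1 >= minLength else { continue }
            if best == nil || total > best!.total {
                best = (begin: scores[begin].index, end: scores[end].index, total: total)
            }
        }
    }
    return best.map { (begin: $0.begin, end: $0.end) }
}
```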
Figure 23C also shows that the representations 2318 of images between begin handle 2320-a and end handle 2320-b are visually distinguished from the other representations 2318 in region 2316 (for example, by slightly graying out the other representations).
User interface 2314 also includes a reset affordance 2324. In Figure 23C, device 100 receives user input 2326 selecting reset affordance 2324.
In Figure 23D, in response to user input 2326 selecting reset affordance 2324, device 100 moves begin handle 2320-a to the position corresponding to the initial image in the untrimmed image sequence, and moves end handle 2320-b to the position corresponding to the final image in the untrimmed image sequence. That is, reset affordance 2324 resets the trim handles so that they correspond to the image sequence as it was before the user entered trimming user interface 2314.

As shown in Figure 23D, in some embodiments user interface 2314 displays representations 2328 of images that are not included in the original (for example, untrimmed) image sequence and that were obtained before the initial image in the original image sequence, and/or representations 2330 of images that are not included in the original (for example, untrimmed) image sequence and that were obtained after the final image in the original image sequence.
Also as shown in Figure 23D, in some embodiments, when the user selects reset affordance 2324, user interface 2314 displays (for example, in place of reset affordance 2324) an auto affordance 2332 that allows the user to switch back to the positions for begin handle 2320-a and end handle 2320-b that were automatically selected based on scene detection.
Also as shown in Figures 23D-23E, the user can manually adjust the positions of begin handle 2320-a and end handle 2320-b. In Figure 23D, device 100 receives user input 2334 (for example, a drag gesture over begin handle 2320-a). Figure 23E shows that the position of begin handle 2320-a in region 2316 has moved in accordance with drag gesture 2334.
In some embodiments, when the user, while in trimming user interface 2314, selects a "done" affordance 2301 that applies the user's trim to the image sequence, device 100 either deletes (or marks for deletion) the images that are not included in the subset of images (for example, the images whose representations 2318 are not between begin handle 2320-a and end handle 2320-b), or disables playback of the images that are not included in the subset of images. For example, when the trimmed image sequence is played back in accordance with some of the embodiments described with reference to Figures 6A-6FF, 7A-7CC, and/or 8A-8L, the images not included in the subset of images are not played. In some embodiments, when device 100 disables playback of the images that are not included in the subset of images, device 100 retains those images so that the user can restore, at a later time (for example, in trimming user interface 2314), the whole image sequence or any part of the whole image sequence.
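A sketch, with hypothetical Swift types, of the two trim behaviors just described: frames outside the selected subset are either removed (or marked for deletion), or kept but excluded from playback so the full sequence can later be restored.

```swift
// Hypothetical model of the frames of an enhanced photo.
struct Frame {
    let id: Int
    var markedForDeletion = false
    var includedInPlayback = true
}

enum TrimPolicy {
    case deleteExcludedFrames   // frames outside the handles are removed / marked for deletion
    case disablePlaybackOnly    // frames are kept so the trim can be undone later
}

func applyTrim(frames: inout [Frame], begin: Int, end: Int, policy: TrimPolicy) {
    for i in frames.indices {
        let inside = frames[i].id >= begin && frames[i].id <= end
        switch policy {
        case .deleteExcludedFrames:
            frames[i].markedForDeletion = !inside
        case .disablePlaybackOnly:
            frames[i].includedInPlayback = inside
        }
    }
    if case .deleteExcludedFrames = policy {
        frames.removeAll { $0.markedForDeletion }
    }
}
```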
Figures 24A-24E illustrate a flow diagram of a method 2400 of modifying images in an image sequence, in accordance with some embodiments. Method 2400 is performed at an electronic device (for example, device 300 of Figure 3, or portable multifunction device 100 of Figure 1A) with a display and a touch-sensitive surface. In accordance with some embodiments, the device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 2400 are, optionally, combined and/or the order of some operations is, optionally, changed.
Method 2400 provides an intuitive way to modify an enhanced photo. Specifically, when the user modifies (for example, crops, converts to black and white, or changes the balance and/or contrast of) the representative image of an enhanced photo, in some embodiments method 2400 allows the user to specify (for example, with a toggle switch) whether the modification should be applied only to the representative image or to all of the images in the enhanced photo. When the modification is applied only to the representative image, method 2400 provides playback schemes in accordance with various embodiments. For example, in various embodiments, an enhanced photo that includes an image sequence with a modified representative image plays back with the representative image modified, unmodified, or omitted. When the modification is applied to the whole image sequence, the enhanced photo plays back the modified image sequence.
The device displays (2402) a representative image on the display (for example, while the device is in an image presentation mode). The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. In some embodiments, the camera that took the sequence of images is part of the electronic device. In some embodiments, the sequence of images was taken by a camera that is not part of the electronic device (for example, the sequence of images was transferred to the electronic device after being taken with a camera on another device). In some embodiments, the sequence of images was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In some embodiments, the representative image corresponds to the representative image acquired by the camera, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600.
While displaying the representative image on the display, the device detects (2404) an input to modify the representative image (for example, an input to crop, apply a filter, adjust the exposure, adjust the color, convert to black and white, or the like). For example, input 2008 of Figure 20B is an input to modify representative image 2002-3.
In response to detecting the input to modify the representative image: in accordance with a determination that the device is in a first editing mode (for example, an affordance such as toggle switch 2006 of Figure 20I is set to apply edits to all of the images in a respective image sequence), the device modifies (2406) the representative image, the one or more images acquired by the camera after acquiring the representative image, and the one or more images acquired by the camera before acquiring the representative image; and, in accordance with a determination that the device is in a second editing mode, distinct from the first editing mode (for example, an affordance such as toggle switch 2006 of Figure 20B is set to apply edits only to the representative image in a respective image sequence), the device modifies the representative image without modifying the one or more images acquired by the camera after acquiring the representative image and without modifying the one or more images acquired by the camera before acquiring the representative image.
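A minimal sketch of the branch just described, with hypothetical Swift types standing in for the enhanced photo and the edit; it applies an adjustment either to the whole sequence (first editing mode) or only to the representative image (second editing mode).

```swift
// Hypothetical image payload and edit operation.
struct ImageData { var pixels: [UInt8] }
typealias Adjustment = (ImageData) -> ImageData

enum EditMode {
    case applyToAll            // first editing mode: edit the whole sequence
    case applyToRepresentative // second editing mode: edit only the representative image
}

struct EnhancedPhoto {
    var imagesBefore: [ImageData]   // acquired before the representative image
    var representative: ImageData
    var imagesAfter: [ImageData]    // acquired after the representative image
}

func applyEdit(_ edit: Adjustment, to photo: inout EnhancedPhoto, mode: EditMode) {
    photo.representative = edit(photo.representative)
    if case .applyToAll = mode {
        photo.imagesBefore = photo.imagesBefore.map(edit)
        photo.imagesAfter = photo.imagesAfter.map(edit)
    }
    // In .applyToRepresentative mode the surrounding images are left unmodified.
}
```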
In some embodiments, the device provides, in a photo editing user interface, an affordance for switching between the first editing mode and the second editing mode (for example, toggle switch 2006 of Figures 20B and 20I is a component of the photo editing user interface). In some embodiments, the photo editing user interface includes an affordance for turning playback of the enhanced photo on or off, an affordance for deleting the additional images in the enhanced photo, and/or an affordance for editing the set of additional photos (for example, modifying the selection of still images that are included in the enhanced photo), as described with reference to Figures 23A-23E and method 2700.
In some embodiments, in response to detecting the input to modify the representative image, the device presents the user with the option of applying the modification only to the representative image, or of applying it to the representative image, the one or more images acquired by the camera after acquiring the representative image, and the one or more images acquired by the camera before acquiring the representative image.
In some circumstances, modifying the representative image without modifying the additional images would cause a discontinuity when the enhanced photo is played back. For example, when the representative image is cropped or rotated relative to the additional images, playing back the enhanced photo while the representative image is displayed would produce a "jump." Accordingly, in some embodiments, when certain modifications (for example, cropping and/or rotation) are made to the representative image without modifying the one or more images acquired by the camera after acquiring the representative image and without modifying the one or more images acquired by the camera before acquiring the representative image, the device automatically turns off playback of the additional images, deletes the additional images, or causes the modified representative image to be saved to a new file as a still image. In some embodiments, the device warns the user that the modification will result in the modified representative image becoming a still image, and provides the user with the option to proceed with the modification or to cancel it (for example, warning 2018 of Figure 20L).
In some embodiments, after modifying the representative image without modifying the one or more images acquired by the camera after acquiring the respective representative image and without modifying the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2408) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a first portion of a second input. In response to detecting the first portion of the second input, the device replaces display of the modified representative image with display, in sequence, of at least some of the one or more images acquired by the camera after acquiring the representative image. Thus, in some embodiments, in response to detecting the first portion of the second input, the one or more (unmodified) images acquired by the camera after acquiring the representative image are displayed in sequence (for example, as shown in Figure 20E). In some embodiments, the device displays a cross-fade animation between the modified representative image and the one or more (unmodified) images acquired by the camera after acquiring the representative image.

After detecting the first portion of the second input, the device detects a second portion of the second input (for example, continues to detect the contact and/or intensity in a finger gesture). In response to detecting the second portion of the second input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image, the representative image without the modification, and at least some of the one or more images acquired by the camera after acquiring the representative image (for example, as shown in Figure 20F).

Thus, in some embodiments, in response to detecting the second portion of the second input, the entire (unmodified) sequence of images is played from the initial image to the final image. For example, the representative image is modified by converting it to a black-and-white image, while the other images in the sequence remain color images. While the black-and-white representative image is displayed, the first portion of an input (for example, a press-and-hold gesture or a deep press gesture) is detected. In response, display of the black-and-white representative image is replaced by display, in sequence, of the one or more (unmodified) color images acquired by the camera after acquiring the representative image. In response to detecting the second portion of the second input, the sequence is played from the initial image of the entire image sequence to the final image, with all of the images displayed in color.
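A sketch of the two-stage playback just described, using placeholder Swift types (the "Image" alias and function name are assumptions, not part of the disclosure): the first portion of the input plays the images acquired after the representative image, and the second portion replays the whole sequence from its initial image with the representative image shown unmodified.

```swift
typealias Image = String   // placeholder frame representation, for illustration only

func playbackFrames(before: [Image],
                    unmodifiedRepresentative: Image,
                    after: [Image],
                    isSecondPortionOfInput: Bool) -> [Image] {
    if !isSecondPortionOfInput {
        // First portion of the input: the modified representative image is
        // replaced by sequential display of the images acquired after it.
        return after
    }
    // Second portion of the input: the whole sequence plays from its initial
    // image, with the representative image shown without the modification.
    return before + [unmodifiedRepresentative] + after
}
```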
In some embodiments, after modifying the representative image without modifying the one or more images acquired by the camera after acquiring the respective representative image and without modifying the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2410) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a second input. In response to detecting the second input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image, the representative image without the modification, and at least some of the one or more images acquired by the camera after acquiring the representative image.

Thus, in some embodiments, in response to detecting the second input, the device plays back the enhanced photo with none of the images modified, starting from an image acquired before the representative image was acquired (for example, starting with the initial image in the image sequence), rather than playing back by displaying the images acquired by the camera after the representative image.
In some embodiments, after modifying the representative image without modifying the one or more images acquired by the camera after acquiring the respective representative image and without modifying the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2412) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a first portion of a second input. In response to detecting the first portion of the second input, the device replaces display of the modified representative image with display, in sequence, of at least some of the one or more images acquired by the camera after acquiring the representative image. Thus, in some embodiments, in response to detecting the first portion of the second input, the one or more (unmodified) images acquired by the camera after acquiring the representative image are displayed in sequence. In some embodiments, the device displays a cross-fade animation between the modified representative image and the one or more (unmodified) images acquired by the camera after acquiring the representative image (for example, as shown in Figure 20C).

After detecting the first portion of the second input, the device detects a second portion of the second input (for example, continues to detect the contact and/or intensity in a finger gesture). In response to detecting the second portion of the second input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image, the modified representative image, and at least some of the one or more images acquired by the camera after acquiring the representative image (for example, as shown in Figure 20D).

Thus, in some embodiments, in response to detecting the second portion of the second input, the sequence is played from the initial image of the entire image sequence to the final image, with only the representative image modified. For example, the representative image is modified by converting it to a black-and-white image, while the other images in the sequence remain color images. While the black-and-white representative image is displayed, the first portion of an input (for example, a press-and-hold gesture or a deep press gesture) is detected. In response, display of the black-and-white representative image is replaced by display, in sequence, of the one or more (unmodified) color images acquired by the camera after acquiring the representative image. In response to detecting the second portion of the second input, the sequence is played from the initial image of the entire image sequence to the final image, with all of the images displayed in color except the representative image, which is displayed in black and white.
In some embodiments, after modifying the representative image without modifying the one or more images acquired by the camera after acquiring the respective representative image and without modifying the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2414) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a second input. In response to detecting the second input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image, the modified representative image, and at least some of the one or more images acquired by the camera after acquiring the representative image.

Thus, in some embodiments, in response to detecting the second input, the device plays back the enhanced photo with only the representative image modified, starting from an image acquired before the representative image was acquired (for example, starting with the initial image in the image sequence), rather than playing back by displaying the images acquired by the camera after the representative image.
In some embodiments, after modifying the representative image without modifying the one or more images acquired by the camera after acquiring the respective representative image and without modifying the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2416) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a first portion of a second input. In response to detecting the first portion of the second input, the device replaces display of the modified representative image with display, in sequence, of at least some of the one or more images acquired by the camera after acquiring the representative image. Thus, in some embodiments, in response to detecting the first portion of the second input, the one or more (unmodified) images acquired by the camera after acquiring the representative image are displayed in sequence. In some embodiments, the device displays a cross-fade animation between the modified representative image and the one or more (unmodified) images acquired by the camera after acquiring the representative image (for example, as shown in Figure 20G).

After detecting the first portion of the second input, the device detects a second portion of the second input (for example, continues to detect the contact and/or intensity in a finger gesture). In response to detecting the second portion of the second input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image and at least some of the one or more images acquired by the camera after acquiring the representative image (for example, as shown in Figure 20H).

Thus, in some embodiments, in response to detecting the second portion of the second input, the sequence is played from the initial image of the entire image sequence to the final image, except that the representative image is not displayed (for example, the modified representative image is omitted from the first complete playback of the enhanced photo). In some embodiments, the device continues to loop through the image sequence as long as the input is maintained (for example, a press-and-hold gesture and/or a deep press with an intensity above a predefined threshold).
In some embodiments, after modifying the representative image without modifying the one or more images acquired by the camera after acquiring the respective representative image and without modifying the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2418) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a second input. In response to detecting the second input, the device displays, in sequence, at least some of the one or more images acquired by the camera before acquiring the representative image and at least some of the one or more images acquired by the camera after acquiring the representative image.

Thus, in some embodiments, in response to detecting the second input, the device plays back the enhanced photo, with the representative image omitted and the remaining images unmodified, starting from an image acquired before the representative image was acquired (for example, starting with the initial image in the image sequence), rather than playing back by displaying the images acquired by the camera after the representative image.
In some embodiments, after modifying the representative image, the one or more images acquired by the camera after acquiring the respective representative image, and the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2420) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a first portion of a second input. In response to detecting the first portion of the second input, the device replaces display of the modified representative image with display, in sequence, of at least some of the modified one or more images acquired by the camera after acquiring the representative image. Thus, in some embodiments, in response to detecting the first portion of the second input, the modified one or more images acquired by the camera after acquiring the representative image are displayed in sequence (for example, as shown in Figure 20J).

After detecting the first portion of the second input, the device detects a second portion of the second input (for example, continues to detect the contact and/or intensity in a finger gesture). In response to detecting the second portion of the second input, the device displays, in sequence, at least some of the modified one or more images acquired by the camera before acquiring the representative image, the modified representative image, and at least some of the modified one or more images acquired by the camera after acquiring the representative image (for example, as shown in Figure 20K).

Thus, in some embodiments, in response to detecting the second portion of the second input, the entire modified sequence of images is played from the initial image to the final image. For example, the images in the sequence are modified by converting them from color to black-and-white images. While the black-and-white representative image is displayed, the first portion of an input (for example, a press-and-hold gesture or a deep press gesture) is detected. In response, display of the black-and-white representative image is replaced by display, in sequence, of the one or more black-and-white images acquired by the camera after acquiring the representative image. In response to detecting the second portion of the second input, the sequence is played from the initial image of the entire image sequence to the final image, with all of the images displayed in black and white.
In some embodiments, after modifying the representative image, the one or more images acquired by the camera after acquiring the respective representative image, and the one or more images acquired by the camera before acquiring the respective representative image: the device displays (2422) the modified representative image on the display. While displaying the modified representative image on the display, the device detects a second input. In response to detecting the second input, the device displays, in sequence, at least some of the modified one or more images acquired by the camera before acquiring the representative image, the modified representative image, and at least some of the modified one or more images acquired by the camera after acquiring the representative image.

Thus, in some embodiments, in response to detecting the second input, the device plays back the enhanced photo, with all of the images modified, starting from an image acquired before the representative image was acquired (for example, starting with the initial image in the image sequence), rather than playing back by displaying the images acquired by the camera after the representative image.
In some embodiments, the device detects (2424) a second input that corresponds to a request to delete the one or more images acquired by the camera before acquiring the respective representative image and the one or more images acquired by the camera after acquiring the respective representative image. In response to detecting the second input, the device deletes (or marks for deletion) the one or more images acquired by the camera before acquiring the representative image and the one or more images acquired by the camera after acquiring the representative image (for example, deleting all of the additional images in the enhanced photo other than the representative image, without any additional user input beyond the second input).
It should be understood that the particular order in which the operations in Figures 24A-24E have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. In some embodiments, one or more operations described herein may be omitted. For example, in some embodiments, operations 2408 and 2410 are omitted. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 1200, 2500, 2600, and 2700) are also applicable in an analogous manner to method 2400 described above with respect to Figures 24A-24E. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with reference to method 2400 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with reference to those other methods. For brevity, these details are not repeated here.
Figures 25A-25C illustrate a flow diagram of a method 2500 of sending images from an image sequence to a second electronic device, in accordance with some embodiments. Method 2500 is performed at a first electronic device (for example, device 300 of Figure 3, or portable multifunction device 100 of Figure 1A) with a display and, optionally, a touch-sensitive surface. In accordance with some embodiments, the device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 2500 are, optionally, combined and/or the order of some operations is, optionally, changed.
In accordance with some embodiments, method 2500 allows a user to share her enhanced photos with other users whose devices are configured to interact with (for example, are compatible with) enhanced photos. To that end, method 2500 includes determining whether a remote electronic device is configured to interact with enhanced photos and, when the remote electronic device is configured to interact with enhanced photos, responding to a request to send an enhanced photo by displaying a first set of sharing options (for example, including an option to send the enhanced photo). When the remote electronic device is not configured to interact with enhanced photos, method 2500 includes responding to the request to send an enhanced photo by displaying a second set of sharing options (for example, including options to send only the representative image or to convert the enhanced photo to a video or GIF format).
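A minimal Swift sketch of this decision. The type and option names are hypothetical; the source only says that the option set shown depends on whether the remote device is determined (for example, from its operating system) to treat an image sequence as a group.

```swift
struct RemoteDevice {
    let supportsEnhancedPhotos: Bool   // e.g. inferred from the reported operating system
}

enum SharingOption {
    case sendEntireSequence        // send the enhanced photo as a group
    case sendRepresentativeOnly    // send just the representative still image
    case convertToVideo            // e.g. MPEG
    case convertToAnimatedGIF
}

func sharingOptions(for remote: RemoteDevice) -> [SharingOption] {
    if remote.supportsEnhancedPhotos {
        // First option set: the whole sequence can be sent and played back remotely.
        return [.sendEntireSequence, .sendRepresentativeOnly]
    } else {
        // Second option set: offer conversions the remote device can display.
        return [.sendRepresentativeOnly, .convertToVideo, .convertToAnimatedGIF]
    }
}
```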
The first electronic device displays (2502) a representative image on the display in a user interface of an application that is configured to communicate with other electronic devices. For example, the representative image is displayed in an input area for a messaging application (for example, iMessage from Apple Inc. of Cupertino, California), a social networking application (for example, Twitter or Facebook), an ad hoc network service (for example, AirDrop from Apple Inc. of Cupertino, California), or an email application (for example, Mail from Apple Inc. of Cupertino, California).
The representative image is one image in a sequence of images taken by a camera. The sequence of images includes one or more images acquired by the camera after acquiring the representative image. The sequence of images includes one or more images acquired by the camera before acquiring the representative image. In some embodiments, the camera that took the sequence of images is part of the first electronic device. In some embodiments, the sequence of images was taken by a camera that is not part of the first electronic device (for example, the sequence of images was transferred to the first electronic device after being taken with a camera on another device). In some embodiments, the sequence of images was obtained in response to detecting activation of a shutter button at a first time, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In some embodiments, the representative image corresponds to the representative image acquired by the camera, as described herein with reference to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600.
In some embodiments, the application that is configured to communicate with other electronic devices is displayed (2504) in response to detecting selection of an application icon that corresponds to the application in a sharing user interface (for example, a share menu, such as the share sheet in iOS from Apple Inc. of Cupertino, California). In some embodiments, the representative image is displayed in the sharing user interface, and the sharing user interface is configured to display the interactions with the image sequence as a group (for example, those interactions described with reference to Figures 6A-6FF). In some embodiments, the sharing user interface is displayed in response to detecting selection of a share icon while the representative image is displayed in an image management application (for example, Photos from Apple Inc. of Cupertino, California).
While displaying the representative image on the display, the first electronic device detects (2506) an input that corresponds to a request to send, using the application, the representative image to a second electronic device remote from the first electronic device, or a request to select the representative image for sending to the second electronic device remote from the first electronic device (for example, detecting a tap gesture on the touch-sensitive surface of the first electronic device, or a mouse click, that activates a "send" icon or a "select photo" icon).
In response to detecting the input that corresponds to the request to send the representative image to the remote second electronic device, or to select the representative image for sending to the second electronic device: in accordance with a determination that the second electronic device is configured to interact with the image sequence as a group (for example, the second electronic device is configured to perform the interactions described with reference to Figures 6A-6FF), the first electronic device displays (2508) a first set of options for sending at least a portion of the image sequence to the second electronic device (for example, as shown in Figures 21C and/or 21I); and, in accordance with a determination that the second electronic device is not configured to interact with the image sequence as a group, the first electronic device displays a second set of options for sending at least a portion of the image sequence to the second electronic device (for example, as shown in Figures 21D and/or 21J), where the second set of options is different from the first set of options.

In some embodiments, the determination that the second electronic device is not configured to interact with the image sequence as a group includes circumstances in which it cannot be determined whether the second electronic device is configured to interact with the image sequence as a group. In some embodiments, if it cannot be determined that the second electronic device is configured to interact with the image sequence as a group, it is inferred that the second electronic device is not so configured.
In some embodiments, the determination that the second electronic device is configured to interact with the image sequence as a group is based at least in part on determining the operating system being used by the second electronic device. In some embodiments, as an alternative to presenting the first set of options, in accordance with a determination that the second electronic device is configured to interact with the image sequence as a group, the first electronic device automatically sends the image sequence (for example, sends the entire image sequence to be interacted with as a group, without user intervention after the user presses a "send" button). In some embodiments, in accordance with a determination that the second electronic device is not configured to interact with the image sequence as a group, the first electronic device automatically sends the representative image without sending the one or more images acquired by the camera after acquiring the representative image and without sending the one or more images acquired by the camera before acquiring the representative image.
In some embodiments, the first set of options for sending at least a portion of the image sequence to the second electronic device includes (2510) an option to send the entire image sequence (for example, an option to send the image sequence as an enhanced photo).
In some embodiments, the second set of options for sending at least a portion of the image sequence to the second electronic device includes (2512) an option to convert at least a portion of the image sequence to a video format (for example, an MPEG format). In some embodiments, the second set of options for sending at least a portion of the image sequence to the second electronic device includes an option to convert at least a portion of the image sequence to a format that the second electronic device is configured to interact with. In some embodiments, the second set of options for sending at least a portion of the image sequence to the second electronic device includes an option to convert at least a portion of the image sequence to an animated image format (for example, a GIF format).
In some embodiments, if the second electronic device is not configured to interact with the image sequence as a group, rather than sending the representative image without sending the other images in the image sequence, the first electronic device displays a menu (for example, a pop-up menu) that gives the user the option to convert the image sequence (and, in some embodiments, audio that corresponds to the image sequence) into a video clip and/or an animated GIF. In response to user selection of a "convert to video" and/or "send as video" option, a video that corresponds to the image sequence is sent to the second electronic device. In some embodiments, in response to user selection of a "convert to video" and/or "send as video" option, the first electronic device converts the image sequence into a video and sends the video to the second electronic device. In response to user selection of a "convert to GIF" and/or "send as GIF" option, an animated GIF that corresponds to the image sequence is sent to the second electronic device. In some embodiments, in response to user selection of a "convert to GIF" and/or "send as GIF" option, the first electronic device converts the image sequence into an animated GIF and sends the GIF to the second electronic device.
In some embodiments, the first set of options for sending at least a portion of the image sequence to the second electronic device includes (2514) an option to convert at least a portion of the image sequence to a video format (for example, an MPEG format). In some embodiments, the first electronic device displays a menu (for example, a send options menu) that gives the user the option to convert the image sequence (and, in some embodiments, audio that corresponds to the image sequence) into a video clip and/or an animated GIF, regardless of whether the second electronic device is configured to interact with the image sequence as a group. Thus, if such an option is selected, a video or an animated GIF is sent to the second electronic device in place of the image sequence (with or without the associated audio and/or metadata), even if the second electronic device is configured to interact with the image sequence as a group.

In some embodiments, the first electronic device displays a menu (for example, an "export," "send as," or "convert to" menu) that gives the user the option to convert the image sequence (and, in some embodiments, audio that corresponds to the image sequence) into a video clip and/or an animated GIF. If such an option is selected, the image sequence (with or without the associated audio and/or metadata) is converted into a video or an animated GIF in accordance with the selected option.
In some embodiments, in accordance with a determination that the second electronic device is configured to interact with the image sequence as a group (for example, the second electronic device is configured to perform the interactions described with reference to Figures 6A-6FF), the first electronic device sends (2516) audio that corresponds to the image sequence. For example, when the first set of options includes an option to send the entire image sequence (for example, to send the enhanced photo) and the user of the first electronic device selects that option, the first electronic device sends the audio to the second electronic device, so that the user of the second electronic device can play back the enhanced photo with the audio, as described with reference to Figures 6F-6I.
In some embodiments, in accordance with a determination that the second electronic device is configured to interact with the image sequence as a group (for example, the second electronic device is configured to perform the interactions described with reference to Figures 6A-6FF), the first electronic device sends (2518) metadata that corresponds to the first image sequence. For example, when the first set of options includes an option to send the entire image sequence (for example, to send the enhanced photo) and the user of the first electronic device selects that option, the first electronic device sends the metadata to the second electronic device, so that the user of the second electronic device can play back the enhanced photo with the metadata, as described with reference to Figures 6J-6M. In some embodiments, metadata for the image sequence is linked to (or otherwise associated with) the image sequence, such as the time, the date, the location (for example, via GPS), the weather, the music that was playing when the image sequence was acquired (for example, music identified with music identification software on the first electronic device, such as Shazam, SoundHound, or Midomi), and/or local event information (such as a sports game that was being played when and where the first image sequence was acquired), or post-event information (such as a final score).
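A small Swift container, with hypothetical field names, illustrating the kinds of metadata listed above as accompanying an image sequence; the actual storage format is not specified by the description.

```swift
import Foundation

// Hypothetical metadata attached to an image sequence: time/date, location,
// weather, recognized music, and local or post-event information (e.g. a final score).
struct SequenceMetadata {
    var captureDate: Date
    var location: (latitude: Double, longitude: Double)?
    var weather: String?
    var recognizedMusic: String?   // e.g. identified by a music-recognition service
    var localEventInfo: String?    // e.g. "Home 3-2 Away" for a sports game
}
```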
In some embodiments, the second set of options for sending at least a portion of the image sequence to the second electronic device includes (2520) an option to send the representative image without sending the one or more images acquired by the camera after acquiring the representative image and without sending the one or more images acquired by the camera before acquiring the representative image (for example, an option to send the representative image as a still image).
In some embodiments, the first electronic device determines (2522) whether the first electronic device is in a first mode that permits sending the image sequence (for example, as a group). In accordance with a determination that the first electronic device is not in the first mode that permits sending the image sequence as a group, the first electronic device modifies the first set of options for sending at least a portion of the image sequence to the second electronic device. In some embodiments, to send the image sequence rather than only the representative image, in addition to the determination that the second electronic device is configured to interact with the image sequence as a group, the first electronic device also needs to be in a mode that permits sending the image sequence as a group, rather than in a mode that permits sending only still images (for example, the representative image) from the image sequence. In some embodiments, the user can choose between these two modes using an affordance such as toggle switch 2006 shown in Figure 20B.
In some embodiments, while displaying the representative image (and one of the sets of options) on the display, the first electronic device detects (2524) a second input. In response to detecting the second input, the first electronic device replaces display of the representative image with display, in sequence, of at least some of the images in the image sequence. In some embodiments, the first electronic device is configured to play back the enhanced photo while displaying the sharing options, which can help the user decide how she wants to share the photo (for example, as an enhanced photo, a video, a GIF, or a still image).
In some embodiments, the first electronic device includes (2526) a touch-sensitive surface and one or more sensors to detect intensity of contacts with the touch-sensitive surface. The second input includes a finger contact that satisfies first contact intensity criteria. For example, a deep press on the representative image, while the set of options is displayed, plays back the enhanced photo.
It should be understood that the particular order in which the operations in Figures 25A-25C have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. In some embodiments, one or more operations described herein may be omitted. For example, in some embodiments, operations 2510 and 2512 are omitted. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2600, and 2700) are also applicable in an analogous manner to method 2500 described above with respect to Figures 25A-25C. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with reference to method 2500 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with reference to those other methods. For brevity, these details are not repeated here.
Figures 26A-26D illustrate a flow diagram of a method 2600 of acquiring photos (for example, enhanced photos or still photos) using scene recognition, in accordance with some embodiments. Method 2600 is performed at an electronic device (for example, device 300 of Figure 3, or portable multifunction device 100 of Figure 1A) with a display, a camera, and, optionally, a touch-sensitive surface. In accordance with some embodiments, the device includes one or more sensors to detect intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on or integrated with the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 2600 are, optionally, combined and/or the order of some operations is, optionally, changed.
In accordance with some embodiments, the device performs scene recognition while capturing images from the camera. In response to the user activating the shutter, the device determines, based on the scene recognition, whether to retain a sequence of images (for example, as an enhanced photo) or to retain a still image. For example, when the scene includes a lot of movement, the device automatically retains an enhanced photo. As another example, when the scene includes a large amount of text (for example, the "scene" is merely a receipt or a page from a book), the device retains a still image.
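A sketch of that decision in Swift. The SceneAnalysis fields, thresholds, and weighting are assumptions used only to illustrate the idea; the description mentions motion, faces, and text as the relevant cues without prescribing an algorithm.

```swift
// Hypothetical summary of the scene recognition performed on the live preview.
struct SceneAnalysis {
    var motionAmount: Double    // 0...1, amount of movement detected in the scene
    var faceCount: Int          // number of faces recognized in the scene
    var textCoverage: Double    // fraction of the frame occupied by text
}

// Decide whether a single shutter activation should retain a grouped image
// sequence (an enhanced photo) or a single still image.
func shouldRetainImageSequence(_ scene: SceneAnalysis,
                               motionThreshold: Double = 0.3) -> Bool {
    // Mostly text (a receipt, a book page): keep a still image.
    if scene.textCoverage > 0.5 { return false }
    // Substantial motion or recognized faces: keep the grouped sequence.
    return scene.motionAmount > motionThreshold || scene.faceCount > 0
}
```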
To that end, while the camera is in a first media acquisition mode (for example, a mode labeled as an automatic still/enhanced photo mode): the device displays (2602) a live preview of a scene on the display.
The device performs (2604) scene recognition on the scene. In some embodiments, performing scene recognition includes recognizing faces in the scene, recognizing motion in the scene, recognizing text in the scene, recognizing whether the scene is indoors or outdoors (for example, recognizing a threshold amount of brightness and/or recognizing sunlight), and/or recognizing the depth of field of the scene (for example, determining whether the scene is a landscape).
While displaying the live preview of the scene, the device detects (2606) a single activation of a shutter button at a first time. In some embodiments, detecting the single activation of the shutter button at the first time includes detecting a press of a physical button at the first time, or detecting a gesture on a virtual shutter button on a touch-sensitive display at the first time, such as a tap gesture on a shutter release icon or a tap gesture on the live preview, where the live preview acts as a virtual shutter button. In some embodiments, the detected activation is a single activation of the shutter button (for example, analogous to the single activation used in a conventional digital camera to capture a single image in the conventional digital camera's still image mode). In some embodiments, the single activation of the shutter button does not require the activation to be maintained for any particular amount of time (for example, any detectable activation of the shutter button suffices, regardless of how long the activation is maintained).
In response to detecting (2608) the single activation of the shutter button at the first time: in accordance with a determination that the scene meets action capture criteria (for example, criteria relating to activity in the scene), based at least in part on the scene recognition performed on the scene, the device retains a plurality of images acquired by the camera in temporal proximity to the activation of the shutter button at the first time and groups the plurality of images into a first sequence of images (for example, the device retains an enhanced photo of the scene, as shown in Figures 22C-22D).

The first sequence of images includes: a plurality of images acquired by the camera prior to detecting activation of the shutter button at the first time; a representative image that represents the first sequence of images and that was acquired by the camera after one or more of the other images in the first sequence of images; and a plurality of images acquired by the camera after acquiring the representative image.

In accordance with a determination that the scene does not meet the action capture criteria, the device retains a single image acquired in temporal proximity to the activation of the shutter button at the first time (without grouping the plurality of images, acquired by the camera in temporal proximity to the activation of the shutter button at the first time, into the first sequence of images, as shown in Figures 22A-22B).
In some embodiments, the images acquired prior to detecting activation of the shutter button at the first time are a predefined number of images, such as 5, 10, 15, 20, 25, or 30 images. In some embodiments, the images acquired prior to detecting activation of the shutter button at the first time are images that are within a predefined time before the first time (for example, within 0.5, 1.0, 1.5, 2.0, or 2.5 seconds before the first time). In some embodiments, the plurality of images acquired prior to detecting activation of the shutter button at the first time are from a range of time between a second time (prior to the first time) and the first time, and the plurality of images acquired prior to detecting activation of the shutter button at the first time are acquired independently of any interaction with the shutter button that is temporally proximate to the second time. For example, the plurality of images acquired prior to detecting activation of the shutter button at the first time are not acquired in response to detecting an interaction with the shutter button that is temporally proximate to the second time. For example, the plurality of images acquired prior to detecting activation of the shutter button at the first time are not acquired in response to detecting a partial (or complete) activation of the shutter button at or near the second time.

In some embodiments, the device begins acquiring and storing images upon entering the first media acquisition mode.
The multiple images before gathering meet predefined packet criterion.In certain embodiments, predefine packet criterion to include selecting
The image of the predefined quantity before presentation graphics.In certain embodiments, predefine packet criterion to include selecting immediately
Detecting to the image in the time predefined scope before the activation of shutter release button.In certain embodiments, predefine and divide
Group criterion includes selecting the image in the time predefined scope before the time of collection presentation graphics.In some embodiments
In, predefined packet criterion includes the movement based on scene Recognition and/or equipment for the choosing, and to select image, (for example, this equipment abandons
The image obtaining when the movement of this equipment is too many, thus abandon the image for example shooting when this equipment is lifted by user).
In some embodiments, the representative image is acquired by the camera at the first time and is analogous to the single image captured in a conventional digital camera's still image mode when its shutter button is activated. In some embodiments, the representative image acquired by the camera corresponds to an image that was acquired at the first time. In some embodiments, the representative image acquired by the camera corresponds to an image that was acquired shortly after detecting activation of the shutter button at the first time, at a time that accounts for shutter lag (the time delay between detecting activation of the shutter button and capturing/storing the representative image). In some embodiments, the representative image acquired by the camera is used to represent the image sequence, for example in an image presentation mode.
In some embodiments, the first image sequence includes a predefined number of images acquired after the representative image, for example 5, 10, 15, 20, 25, or 30 images. In some embodiments, the images acquired after the representative image are images within a predefined time after acquiring the representative image (for example, within 0.5 seconds, 1.0 seconds, 1.5 seconds, 2.0 seconds, or 2.5 seconds after acquiring the representative image). In some embodiments, the first image sequence includes a predefined number of images acquired after detecting the activation of the shutter button at the first time, for example 5, 10, 15, 20, 25, or 30 images. In some embodiments, the images acquired after detecting the activation of the shutter button at the first time are images within a predefined time after the first time (for example, within 0.5 seconds, 1.0 seconds, 1.5 seconds, 2.0 seconds, or 2.5 seconds after the first time). In some embodiments, the multiple images in the first image sequence that were acquired after acquiring the representative image meet predefined grouping criteria. In some embodiments, the predefined grouping criteria include selecting a predefined number of images after the representative image. In some embodiments, the predefined grouping criteria include selecting images in a predefined time range immediately after detecting the activation of the shutter button. In some embodiments, the predefined grouping criteria include selecting images in a predefined time range after the time at which the representative image was acquired. In some embodiments, the predefined grouping criteria include selecting images based on scene recognition and/or movement of the device.
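By way of a non-limiting illustration, the grouping criteria described above amount to selecting a bounded window of frames on either side of the representative image. The following sketch shows one way such criteria could be expressed; the type names, the five-image and 1.5-second defaults, and the motion cutoff are assumptions made for illustration and are not taken from the disclosure.

import Foundation

// A captured frame with its acquisition time and an estimate of device motion.
struct Frame {
    let image: Data             // encoded image bytes (placeholder)
    let timestamp: TimeInterval
    let deviceMotion: Double    // e.g. accelerometer magnitude while this frame was captured
}

// Hypothetical grouping criteria: a frame count, a time window, and a motion cutoff.
struct GroupingCriteria {
    var maxFramesEachSide: Int = 5          // "predefined number" of images on each side
    var windowEachSide: TimeInterval = 1.5  // "predefined time" around the representative image
    var maxDeviceMotion: Double = 2.0       // discard frames taken while the device moved too much
}

// Select the frames grouped into the image sequence around the representative image.
func groupSequence(frames: [Frame],
                   representativeTime: TimeInterval,
                   criteria: GroupingCriteria = GroupingCriteria()) -> [Frame] {
    let before = frames
        .filter { $0.timestamp < representativeTime &&
                  representativeTime - $0.timestamp <= criteria.windowEachSide &&
                  $0.deviceMotion <= criteria.maxDeviceMotion }
        .suffix(criteria.maxFramesEachSide)
    let after = frames
        .filter { $0.timestamp >= representativeTime &&
                  $0.timestamp - representativeTime <= criteria.windowEachSide &&
                  $0.deviceMotion <= criteria.maxDeviceMotion }
        .prefix(criteria.maxFramesEachSide + 1) // representative image plus trailing frames
    return Array(before) + Array(after)
}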
In some embodiments, the action capture criteria include (2610) recognizing one or more faces in the scene. In some embodiments, when the device recognizes at least one face in the scene, the device retains the multiple images and groups them.
In some embodiments, the device includes (2612) default image capture parameters for acquiring images. The device determines (2614) that the scene contains a single face in portrait orientation that occupies more than a predetermined amount of the display. In response to determining that the scene contains a single face in portrait orientation occupying more than the predetermined amount of the display, the device acquires (2616) (and/or retains) the multiple images with image capture parameters that differ from the default image capture parameters (for example, a higher frame rate to capture small changes in expression, a higher resolution to better capture detail, and so on).
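A minimal sketch of the parameter switch just described: when scene recognition reports a single, sufficiently large face in portrait orientation, capture parameters are changed from the defaults. The structure names, the 0.3 display-fraction threshold, and the particular frame rates and resolution scale are illustrative assumptions only.

struct CaptureParameters {
    var frameRate: Double        // frames per second
    var resolutionScale: Double  // 1.0 = default sensor resolution
}

struct SceneAnalysis {
    var faceCount: Int
    var largestFaceFraction: Double  // fraction of the display occupied by the largest face
    var isPortraitOrientation: Bool
}

let defaultParameters = CaptureParameters(frameRate: 15, resolutionScale: 1.0)

// A dominant portrait face gets a higher frame rate (small changes in expression)
// and a higher resolution (facial detail); otherwise use the defaults.
func parameters(for scene: SceneAnalysis) -> CaptureParameters {
    let dominantPortraitFace = scene.faceCount == 1
        && scene.isPortraitOrientation
        && scene.largestFaceFraction > 0.3   // assumed "predetermined amount" of the display
    if dominantPortraitFace {
        return CaptureParameters(frameRate: 30, resolutionScale: 1.5)
    }
    return defaultParameters
}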
In some embodiments, the action capture criteria include (2618) recognizing motion in the scene (for example, motion detected in the live preview that is above a predetermined threshold). In some embodiments, when the device recognizes at least a predefined threshold amount of motion in the scene, the device retains the multiple images and groups them.
In some embodiments, performing scene recognition on the scene includes (2620) determining an amount of motion in the scene. Retaining the multiple images acquired by the camera in temporal proximity to the activation of the shutter button at the first time includes: in accordance with a determination that the amount of motion is a first amount, retaining the images at a first frame rate; and, in accordance with a determination that the amount of motion is a second amount that is greater than the first amount, retaining the images at a second frame rate that is higher than the first frame rate.
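The two-tier retention rule in operation 2620 can be read as a small function: more scene motion means a higher retention frame rate. The specific rates, the motion scale, and the subsampling approach below are placeholders for illustration, not values or behavior taken from the disclosure.

// Retention frame rate chosen from the amount of motion that scene recognition reports.
func retentionFrameRate(forMotionAmount motion: Double) -> Double {
    // Below an assumed threshold the scene is treated as having the "first amount" of motion.
    return motion < 0.2 ? 10 : 30
}

// Given frames captured at the sensor rate, keep only enough frames to hit the target rate.
func retainFrames<T>(_ frames: [T], capturedAt sensorRate: Double, motionAmount: Double) -> [T] {
    let targetRate = retentionFrameRate(forMotionAmount: motionAmount)
    let step = max(1, Int((sensorRate / targetRate).rounded()))
    return frames.enumerated().filter { $0.offset % step == 0 }.map { $0.element }
}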
In some cases, the electronic device itself moves (for example, pans and/or translates). In some embodiments, the action capture criteria include (2622) detecting movement of the electronic device above a predetermined threshold. In some embodiments, certain characteristics of the device's movement indicate that the device is being aimed at a moving scene (for example, the device is kept roughly level while it is moved). When the device determines that it is being aimed, the device retains the multiple images and groups them. For example, in some cases the device is moved to track a subject (for example, an athlete in a game, a passing car, and so on). In some embodiments, detecting movement of the electronic device includes detecting acceleration of the device using accelerometer 168 (Figure 1A).
In some embodiments, the number of images retained in the multiple images depends on (2624) the movement of the device detected while the multiple images are acquired. For example, the device recognizes when it is being translated (for example, attached to a mountain bike or to a skier's helmet). When the device is being translated quickly (for example, as indicated by vibration and/or rapid changes in acceleration), the device retains the multiple images at a higher frame rate and/or over a longer period of time, so that a greater number of images is retained in the multiple images.
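One plausible reading of the panning heuristic above: sustained, roughly level translation suggests the user is tracking a moving subject, so the device keeps more frames for a longer period. The sketch below assumes a simple accelerometer summary; the type names and thresholds are illustrative assumptions.

import Foundation

struct MotionSample {
    let acceleration: (x: Double, y: Double, z: Double)
    let timestamp: TimeInterval
}

struct RetentionPolicy {
    var frameRate: Double
    var duration: TimeInterval   // how long before/after the shutter to keep frames
}

// Decide how aggressively to retain frames from how the device itself is moving.
func retentionPolicy(for samples: [MotionSample]) -> RetentionPolicy {
    // Mean lateral acceleration magnitude as a crude "is the device panning?" signal.
    let lateral = samples.map { abs($0.acceleration.x) + abs($0.acceleration.y) }
    let meanLateral = lateral.reduce(0, +) / Double(max(lateral.count, 1))

    if meanLateral > 1.0 {
        // Device is being swept quickly (e.g. tracking an athlete or a passing car):
        // keep more images by using a higher frame rate over a longer period.
        return RetentionPolicy(frameRate: 30, duration: 2.0)
    } else {
        return RetentionPolicy(frameRate: 15, duration: 1.0)
    }
}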
In some embodiments, performing scene recognition includes recognizing a landscape with activity (for example, a waterfall, a windmill, or a tree with leaves swaying in the wind). When the device recognizes that it is capturing a landscape with activity, the device retains the multiple images and groups them (for example, as an enhanced photo). In some embodiments, an enhanced photo of such a landscape is played back in a loop so that the landscape scene appears continuous.
Conversely, in accordance with a determination that no face is present in the scene, that there is no significant motion in the scene, and/or that the electronic device itself is not moving (for example, the device is fixed), the device acquires a single image in response to detecting the activation of the shutter button at the first time (without grouping multiple images acquired by the camera in temporal proximity to the activation of the shutter button at the first time into a first image sequence). In some embodiments, the single image is a still image that merges multiple still images, such as a high dynamic range (HDR) still image.
In some embodiments, certain characteristics of the movement of the device indicate that the device is not being aimed (for example, it is being pulled out of the user's pocket and/or is being raised to aim at a scene). When the device determines that it is moving without being aimed, the device retains a single image.
In some embodiments, performing scene recognition on the scene includes (2626) recognizing text. The action capture criteria include a criterion that is met when the amount of text in the scene is below a predefined threshold. In some embodiments, the device recognizes when the picture is a picture of a receipt or a document. When the picture is a picture of a receipt or a document, the device captures a still image rather than an enhanced photo.
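Taken together, the face, motion, device-movement, and text criteria above amount to a single decision: keep a grouped burst (an enhanced photo) or a single still image (optionally HDR). The following sketch summarizes that decision under assumed thresholds; every numeric value and field name is illustrative only.

struct SceneReport {
    var hasFace: Bool
    var motionAmount: Double        // motion detected in the live preview
    var deviceMovement: Double      // movement of the device itself
    var deviceAppearsAimed: Bool    // e.g. held roughly level while moving
    var textFraction: Double        // how much of the scene is text (receipt, document)
}

enum CaptureDecision {
    case groupedImageSequence   // retain multiple images grouped as an enhanced photo
    case singleStillImage       // retain one image (optionally merged into an HDR still)
}

func decideCapture(for scene: SceneReport) -> CaptureDecision {
    // A document or receipt is treated as a still image regardless of other criteria.
    guard scene.textFraction < 0.5 else { return .singleStillImage }

    let actionCriteriaMet =
        scene.hasFace ||
        scene.motionAmount > 0.2 ||
        (scene.deviceMovement > 0.5 && scene.deviceAppearsAimed)

    return actionCriteriaMet ? .groupedImageSequence : .singleStillImage
}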
It should be understood that the particular order in which the operations in Figures 26A-26D have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. In some embodiments, one or more of the operations described herein may be omitted. For example, in some embodiments, operations 2610 and 2612 are omitted. Additionally, it should be noted that details of the other processes described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2500, and 2700) are also applicable in an analogous manner to method 2600 described above with respect to Figures 26A-26D. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with respect to method 2600 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2500, and 2700). For brevity, these details are not repeated here.
Figures 27A-27D illustrate a flow diagram of a method 2700 of trimming an image sequence (for example, an enhanced photo) in accordance with some embodiments. Method 2700 is performed at an electronic device (for example, device 300 of Figure 3 or portable multifunction device 100 of Figure 1A) with a display, a camera, and, optionally, a touch-sensitive surface. In some embodiments, the device includes one or more sensors for detecting the intensity of contacts with the touch-sensitive surface. In some embodiments, the display is a touch-screen display and the touch-sensitive surface is on, or integrated with, the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 2700 are optionally combined and/or the order of some operations is optionally changed.
In accordance with some embodiments, the device provides a user interface for trimming an image sequence to a subset of the image sequence (for example, changing the begin image and the end image of the image sequence). When the user requests to trim an enhanced photo, the device provides movable handles that the user can use to change the begin image and the end image of the image sequence. The initial positions of the handles (for example, when the user first enters the user interface for trimming the image sequence) are provided automatically by the device (for example, based on scene detection). In some embodiments, the user can toggle the handle positions between the automatically suggested begin and end images and the initial and final images of the image sequence. As used herein, the terms "initial image" and "final image" refer to the first and last images of the (original) image sequence, whereas "begin image" and "end image" refer to the first and last images of the subset of the image sequence.
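The distinction just drawn between initial/final images (the full captured sequence) and begin/end images (the trimmed subset) is easy to misread, so a small data model may help. This is a non-limiting sketch; the type and field names are assumptions for illustration.

// The full, untrimmed sequence as captured by the camera.
struct ImageSequence {
    var frames: [String]          // placeholders for the stored images
    var representativeIndex: Int  // index of the representative image
}

// A non-destructive trim: the subset is just a range over the original frames.
struct TrimSelection {
    var beginIndex: Int   // position of the begin-trim handle ("begin image")
    var endIndex: Int     // position of the end-trim handle ("end image"), inclusive
}

extension ImageSequence {
    // The initial and final images of the original sequence never change.
    var initialIndex: Int { 0 }
    var finalIndex: Int { frames.count - 1 }

    // The frames currently selected by the trim handles.
    func subset(for trim: TrimSelection) -> [String] {
        Array(frames[trim.beginIndex...trim.endIndex])
    }
}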
To that end, the device displays (2702) an image on the display (for example, while the device is in an image editing mode).
The image is one image in an image sequence taken by a camera. The image sequence includes a representative image. The image sequence includes one or more images acquired by the camera after acquiring the representative image. The image sequence includes one or more images acquired by the camera before acquiring the representative image. In some embodiments, the camera that took the image sequence is part of the electronic device. In some embodiments, the image sequence was taken by a camera that is not part of the electronic device (for example, the image sequence was taken with a camera on another device and then transferred to the electronic device). In some embodiments, the image sequence was obtained in response to detecting activation of a shutter button at a first time, as described herein with respect to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600. In some embodiments, the representative image corresponds to the representative image acquired by the camera, as described herein with respect to Figures 5A-5K and method 900 and/or Figures 22A-22D and method 2600.
In some embodiments, while the image is displayed, the device displays a visual indication that the image is a respective image in an image sequence (for example, an indication that the device is displaying an image from an enhanced photo). In some embodiments, the visual indication that the image is a respective image in an image sequence is an affordance (for example, selectable affordance 2304 of Figure 23A). In some embodiments, the affordance is animated while an enhanced photo is displayed and is not animated while a still image is displayed. In some embodiments, activation of the affordance results in display of an editing menu from which the user can select a function for trimming the image sequence. In some embodiments, some operations of method 2700 are performed after the user activates the function for trimming the image sequence.
In some embodiments, the displayed image is (2704) the representative image from the image sequence.
In some embodiments, the displayed image is (2706) a currently selected image from the image sequence. In some embodiments, the currently selected image is visually distinguished from the other images in the image sequence. In some embodiments, the currently selected image is displayed in a second area (described below) concurrently with a representation (for example, a thumbnail) of the currently selected image in the area that contains representations of the images in the image sequence. In some embodiments, selection of a given image replaces the representative image with the given image as a new representative image. In some embodiments, the user can select a new representative image for the image sequence by tapping on the thumbnail of the given image in the image sequence.
While displaying the image in the image sequence on the display, the device detects (2708) a first input (for example, an input corresponding to a request to display a user interface for trimming the image sequence, such as a tap gesture on trimming icon 2310-3 of Figure 23B).
In response to detecting the first input, the device displays (2710) a user interface for trimming the image sequence to a subset of less than all of the images in the image sequence. In some embodiments, the user interface is part of an editing mode.
The user interface includes: an area that contains representations of images in the image sequence (for example, a strip, such as strip 2316 of Figure 23C); a user-adjustable begin-trim icon (for example, begin handle 2320-a of Figure 23C) that delimits a begin image in the subset of the image sequence via the position of the begin-trim icon in the area that contains representations of images in the image sequence; and a user-adjustable end-trim icon (for example, end handle 2320-b of Figure 23C) that delimits an end image in the subset of the image sequence via the position of the end-trim icon in the area that contains representations of images in the image sequence. In some embodiments, the representations of the images are smaller than the displayed image. In some embodiments, the representations of the images are thumbnails of the images in the image sequence. In some embodiments, the representations of the images are arranged in chronological order.
The begin-trim icon is positioned at a first position, automatically selected by the device, in the area that contains representations of images in the image sequence. The end-trim icon is positioned at a second position, automatically selected by the device, in the area that contains representations of images in the image sequence. In some embodiments, the begin image for the subset that is automatically selected by the device (as indicated by the first position of the begin-trim icon) is not the initial image in the image sequence. In some embodiments, the begin image for the subset is later in the image sequence than the initial image. In some embodiments, the end image for the subset that is automatically selected by the device (as indicated by the second position of the end-trim icon) is not the final image in the image sequence. In some embodiments, the end image for the subset is earlier in the image sequence than the final image.
In some embodiments, the representations of the images between the begin-trim icon and the end-trim icon are visually distinguished (2712) from the other representations in the area that contains representations of images in the image sequence. In some embodiments, both the image sequence and the subset of the image sequence consist of an uninterrupted, consecutive set of images acquired by the camera.
In some embodiments, the user interface for trimming the image sequence includes (2714) a second area that displays an image in the image sequence. The second area is displayed concurrently with the area that contains representations of images in the image sequence. In some embodiments, the second area occupies more than half of the display, more than 80% of the display, or more than 90% of the display. In some embodiments, the second area occupies the entire area of the display except for optional menu bars at the top and bottom of the display and the area that contains representations of images in the image sequence.
In some embodiments, the automatic selection of the begin image in the subset, and of the corresponding first position of the begin-trim icon, is based (2716) on one or more characteristics of the images in the image sequence. The automatic selection of the end image in the subset, and of the corresponding second position of the end-trim icon, is based on one or more characteristics of the images in the image sequence. In some embodiments, the device selects/suggests the begin image and the end image of the subset based on scene recognition performed on the image sequence.
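One plausible realization of this automatic handle placement is to score each frame with scene recognition (sharpness, subject presence, motion) and grow the trimmed range outward from the representative image while the frames stay above a quality floor. Everything below, including the scoring and the 0.4 floor, is an assumed heuristic rather than the disclosed implementation.

// Suggest initial positions for the begin- and end-trim handles.
// `scores` holds a per-frame quality score from scene recognition (higher is better).
func suggestTrim(scores: [Double],
                 representativeIndex: Int,
                 qualityFloor: Double = 0.4) -> (beginIndex: Int, endIndex: Int) {
    var begin = representativeIndex
    var end = representativeIndex

    // Extend backwards while earlier frames remain acceptable.
    while begin > 0, scores[begin - 1] >= qualityFloor { begin -= 1 }
    // Extend forwards while later frames remain acceptable.
    while end < scores.count - 1, scores[end + 1] >= qualityFloor { end += 1 }

    return (begin, end)
}

The user can still drag either handle anywhere between the initial and final images, or use the reset affordance described below to restore the full range.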
In some embodiments, the image sequence includes (2718) an initial image and a final image. The device displays, in the area that contains representations of images in the image sequence, one or more representations of images that are not included in the image sequence and that were obtained before the initial image in the image sequence and/or after the final image in the image sequence. In some embodiments, in addition to trimming the original image sequence, the user can also add images obtained just before or just after the original image sequence.
In some embodiments, before detecting the second input and while displaying the user interface for trimming the image sequence, the device detects (2720) an input on the end-trim icon. In response to detecting the input on the end-trim icon, the device moves the end-trim icon from the second position to a third position in the area that contains representations of images in the image sequence. In some embodiments, the user can manually override the end image for the subset that was automatically recommended/selected by the device. Similarly, in some embodiments, the user can manually override the begin image for the subset that was automatically recommended/selected by the device, for example with a drag gesture that starts on the begin-trim icon and moves the begin-trim icon from the first position to another position in the area that contains representations of images in the image sequence.
While displaying the user interface for trimming the image sequence, the device detects (2722) a second input (for example, detecting activation of done icon 2301 of Figure 23C, or of another icon that starts trimming in accordance with the current positions of the begin-trim icon and the end-trim icon).
In response to detecting the second input, the device trims (2724) the image sequence to the subset of the image sequence in accordance with the current position of the begin-trim icon and the current position of the end-trim icon. In some embodiments, trimming the image sequence to the subset of the image sequence includes storing data that indicates the positions of the begin image and the end image in the subset.
In some embodiments, the device deletes (2726) from the image sequence the images that are not included in the subset of the image sequence. In some embodiments, the device edits the image sequence to include only those images in the subset (whether automatically selected/suggested by the device and confirmed by the user, or selected manually by the user). In some embodiments, the device continues to store the images that are not in the subset so that, for example, the user can further change the image sequence at a later time with all of the original images (for example, as obtained by the camera) still available.
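Operations 2724 and 2726 thus permit either a destructive or a non-destructive trim. The sketch below keeps every captured frame and records only the chosen range, with an optional step that discards the excluded frames; the type is an illustrative assumption, not the disclosed data layout.

// Non-destructive trim: keep every captured frame, store only the chosen range.
struct TrimmedSequence {
    var allFrames: [String]        // the original, untrimmed capture
    var beginIndex: Int            // "begin image" of the subset
    var endIndex: Int              // "end image" of the subset (inclusive)
    var representativeIndex: Int

    // What playback uses after trimming.
    var playbackFrames: [String] {
        Array(allFrames[beginIndex...endIndex])
    }

    // Destructive variant: discard frames outside the subset and remap the indices.
    mutating func deleteFramesOutsideSubset() {
        allFrames = playbackFrames
        representativeIndex = min(max(representativeIndex - beginIndex, 0), allFrames.count - 1)
        beginIndex = 0
        endIndex = allFrames.count - 1
    }
}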
In some embodiments, the image sequence includes (2728) an initial image and a final image, and the user interface includes a reset affordance (for example, reset button 2324 of Figure 23C, a "manual" button, or another similar icon). Before detecting the second input and while displaying the user interface for trimming the image sequence, the device detects an input on the reset affordance (for example, a tap gesture on reset button 2324 of Figure 23C). In response to detecting the input on the reset affordance: the device displays the begin-trim icon in the area that contains representations of images in the image sequence at a position that corresponds to the initial image in the image sequence; and displays the end-trim icon in the area that contains representations of images in the image sequence at a position that corresponds to the final image in the image sequence.
In some embodiments, in response to detecting the input on the reset affordance, the device displays (2730), in the area that contains representations of images in the image sequence, an automatic-selection affordance (for example, "auto" icon 2332 of Figure 23D) which, when activated, displays the begin-trim icon at the first position and the end-trim icon at the second position. In some embodiments, display of the "auto" icon replaces display of the "reset" icon.
In some embodiments, in response to detecting a third input, the device selects (2732) a new representative image for the subset of the image sequence (for example, the image in the middle of the subset, or an image selected based on scene recognition performed on the subset of the image sequence).
In some embodiments, after trimming the image sequence to the subset of the image sequence in accordance with the current position of the begin-trim icon and the current position of the end-trim icon, the device displays (2734) a representative image of the subset of the image sequence on the display (for example, while the device is in an image presentation mode). In some embodiments, the representative image of the subset of the image sequence is the same as the representative image of the image sequence. In some embodiments, the representative image of the subset of the image sequence is different from the representative image of the image sequence. In some embodiments, the representative image of the subset is displayed in response to an input that corresponds to a request to exit the editing mode. While displaying the representative image on the display, the device detects a third input (for example, an input corresponding to a request to play back the subset of the image sequence, such as a press-and-hold gesture or a gesture that meets contact intensity criteria for playback). In response to detecting the third input, the device replaces the display of the representative image with an animated playback of the subset of the image sequence. In some embodiments, the subset of the image sequence is played back in a manner analogous to the playback of an image sequence described herein with respect to Figures 6A-6FF and methods 1000/10000/10050.
In some embodiments, in response to detecting a fourth input, the device disables (2736) the animated playback of the subset of the image sequence while retaining the subset of images.
In some embodiments, in response to detecting a third input that corresponds to a request to edit the representative image, the device provides (2738) the user of the device with options to: continue editing the representative image with the animated playback of the subset of the image sequence disabled; or cancel editing the representative image.
In some embodiments, the device presents (2740) an affordance for deleting the images in the image sequence other than the representative image. In response to detecting the third input, the device deletes the one or more images acquired by the camera after acquiring the representative image and the one or more images acquired by the camera before acquiring the representative image.
It should be understood that the particular order in which the operations in Figures 27A-27E have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art will recognize various ways to reorder the operations described herein. In some embodiments, one or more of the operations described herein may be omitted. For example, in some embodiments, operations 2714 and 2716 are omitted. Additionally, it should be noted that details of the other processes described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2500, and 2600) are also applicable in an analogous manner to method 2700 described above with respect to Figures 27A-27E. For example, the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described above with respect to method 2700 optionally have one or more of the characteristics of the contacts, gestures, user interface objects, intensity thresholds, animations, and image sequences described herein with respect to the other methods described herein (for example, methods 900, 1000, 10000, 10050, 1100, 11000, 1200, 2400, 2500, and 2600). For brevity, these details are not repeated here.
In accordance with some embodiments, Figure 28 shows a functional block diagram of an electronic device 2800 configured in accordance with the principles of the various described embodiments. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in Figure 28 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 28, electronic device 2800 includes: a display unit 2802 configured to display images; a touch-sensitive surface unit 2804 configured to detect user inputs; and a processing unit 2808 coupled with display unit 2802 and touch-sensitive surface unit 2804. In some embodiments, processing unit 2808 includes: a display enabling unit 2810, a detecting unit 2812, a modifying unit 2814, and a deleting unit 2816.
Processing unit 2808 is configured to enable display (for example, with display enabling unit 2810) of a representative image on display unit 2802. The representative image is one image in an image sequence taken by a camera. The image sequence includes one or more images acquired by the camera after acquiring the representative image. The image sequence includes one or more images acquired by the camera before acquiring the representative image. Processing unit 2808 is also configured, while enabling display of the representative image on display unit 2802, to detect (for example, with detecting unit 2812 in conjunction with touch-sensitive surface unit 2804) an input to modify the representative image. Processing unit 2808 is also configured, in response to detecting the input to modify the representative image: in accordance with a determination that the device is in a first editing mode, to modify (for example, with modifying unit 2814) the representative image, the one or more images acquired by the camera after acquiring the representative image, and the one or more images acquired by the camera before acquiring the representative image; and, in accordance with a determination that the device is in a second editing mode, distinct from the first editing mode, to modify (for example, with modifying unit 2814) the representative image without modifying the one or more images acquired by the camera after acquiring the representative image and without modifying the one or more images acquired by the camera before acquiring the representative image.
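The two editing modes that processing unit 2808 distinguishes reduce to one question: apply an edit to every grouped image or only to the representative image. A minimal, non-limiting sketch with the edit represented as a plain function; the names are illustrative assumptions.

enum EditMode {
    case editWholeSequence      // "first editing mode": edit all grouped images
    case editRepresentativeOnly // "second editing mode": edit just the representative image
}

struct EditableSequence {
    var imagesBefore: [String]      // images acquired before the representative image
    var representativeImage: String
    var imagesAfter: [String]       // images acquired after the representative image
}

// Apply an edit (e.g. a crop or filter, modeled here as a String transform) per mode.
func apply(_ edit: (String) -> String,
           to sequence: inout EditableSequence,
           mode: EditMode) {
    sequence.representativeImage = edit(sequence.representativeImage)
    if mode == .editWholeSequence {
        sequence.imagesBefore = sequence.imagesBefore.map(edit)
        sequence.imagesAfter = sequence.imagesAfter.map(edit)
    }
}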
In accordance with some embodiments, Figure 29 shows a functional block diagram of an electronic device 2900 configured in accordance with the principles of the various described embodiments. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in Figure 29 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 29, a first electronic device 2900 includes: a display unit 2902 configured to display images; an optional touch-sensitive surface unit 2904 configured to detect inputs; one or more optional sensor units 2906 configured to detect the intensity of contacts with touch-sensitive surface unit 2904; and a processing unit 2908 coupled with display unit 2902, optional touch-sensitive surface unit 2904, and the one or more optional sensor units 2906. In some embodiments, processing unit 2908 includes: a display enabling unit 2910, a detecting unit 2912, a determining unit 2914, a modifying unit 2916, and a sending unit 2918.
Processing unit 2908 is configured to enable display (for example, with display enabling unit 2910), on display unit 2902, of a representative image in a user interface of an application that is configured to communicate with other electronic devices. The representative image is one image in an image sequence taken by a camera. The image sequence includes one or more images acquired by the camera after acquiring the representative image. The image sequence includes one or more images acquired by the camera before acquiring the representative image. Processing unit 2908 is also configured, while enabling display of the representative image on display unit 2902, to detect (for example, with detecting unit 2912 in conjunction with touch-sensitive surface unit 2904) an input that corresponds to a request to send the representative image, or a request to select the representative image for sending, using the application, to a second electronic device that is remote from the electronic device. Processing unit 2908 is also configured, in response to detecting the input that corresponds to the request to send the representative image or to select the representative image for sending to the second electronic device using the application: in accordance with a determination that the second electronic device is configured to interact with the image sequence as a group, to enable display (for example, with display enabling unit 2910) of a first set of options for sending at least part of the image sequence to the second electronic device; and, in accordance with a determination that the second electronic device is not configured to interact with the image sequence as a group, to enable display (for example, with display enabling unit 2910) of a second set of options for sending at least part of the image sequence to the second electronic device. The second set of options is different from the first set of options.
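The sharing logic for processing unit 2908 is effectively a capability check on the recipient: if the receiving device can treat the sequence as a group, offer options that send the whole enhanced photo; otherwise offer options that send a reduced form. The option names and fallbacks below are illustrative assumptions, not the disclosed option sets.

struct RemoteDevice {
    var supportsGroupedImageSequences: Bool
}

enum SendOption {
    case sendEnhancedPhoto        // full grouped sequence, with any associated audio/metadata
    case sendRepresentativeStill  // just the representative image
    case sendAsVideoClip          // flatten the sequence into a short video
    case sendAnimatedImage        // e.g. an animated-image rendition of the sequence
}

// "First set of options" versus "second set of options", chosen per recipient.
func sendOptions(for recipient: RemoteDevice) -> [SendOption] {
    if recipient.supportsGroupedImageSequences {
        return [.sendEnhancedPhoto, .sendRepresentativeStill]
    } else {
        return [.sendRepresentativeStill, .sendAsVideoClip, .sendAnimatedImage]
    }
}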
In accordance with some embodiments, Figure 30 shows a functional block diagram of an electronic device 3000 configured in accordance with the principles of the various described embodiments. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in Figure 30 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 30, electronic device 3000 includes: a display unit 3002 configured to display images; an optional touch-sensitive surface unit 3004 configured to detect user inputs; a camera unit 3006 configured to acquire images; and a processing unit 3008 coupled with display unit 3002, optional touch-sensitive surface unit 3004, and camera unit 3006. In some embodiments, processing unit 3008 includes: a display enabling unit 3010, a detecting unit 3012, a scene recognition performing unit 3014, a retaining unit 3016, a grouping unit 3018, and an acquiring unit 3020.
Processing unit 3008 is configured, while in a first media acquisition mode for camera unit 3006, to: enable display (for example, with display enabling unit 3010) of a live preview of a scene on display unit 3002, and perform (for example, with scene recognition performing unit 3014) scene recognition on the scene. Processing unit 3008 is also configured, while enabling display of the live preview, to detect (for example, with detecting unit 3012 in conjunction with touch-sensitive surface unit 3004) a single activation of a shutter button at a first time. Processing unit 3008 is configured, in response to detecting the single activation of the shutter button at the first time, and in accordance with a determination, based at least in part on the scene recognition performed on the scene, that the scene meets action capture criteria, to retain (for example, with retaining unit 3016) a plurality of images acquired by camera unit 3006 in temporal proximity to the activation of the shutter button at the first time and to group (for example, with grouping unit 3018) the plurality of images into a first image sequence. The first image sequence includes: a plurality of images acquired by camera unit 3006 before detecting the activation of the shutter button at the first time; a representative image, which represents the first image sequence and was acquired by camera unit 3006 after one or more of the other images in the first image sequence; and a plurality of images acquired by camera unit 3006 after acquiring the representative image. Processing unit 3008 is further configured, in accordance with a determination that the scene does not meet the action capture criteria, to retain (for example, with retaining unit 3016) a single image acquired in temporal proximity to the activation of the shutter button at the first time.
In accordance with some embodiments, Figure 31 shows a functional block diagram of an electronic device 3100 configured in accordance with the principles of the various described embodiments. The functional blocks of the device are, optionally, implemented by hardware, software, or a combination of hardware and software to carry out the principles of the various described embodiments. It is understood by persons of skill in the art that the functional blocks described in Figure 31 are, optionally, combined or separated into sub-blocks to implement the principles of the various described embodiments. Therefore, the description herein optionally supports any possible combination or separation or further definition of the functional blocks described herein.
As shown in Figure 31, electronic device 3100 includes: a display unit 3102 configured to display images; a touch-sensitive surface unit 3104 configured to detect user inputs; and a processing unit 3108 coupled with display unit 3102 and touch-sensitive surface unit 3104. In some embodiments, processing unit 3108 includes: a display enabling unit 3110, a detecting unit 3112, a trimming unit 3114, a selecting unit 3116, a moving unit 3118, and a deleting unit 3120.
Processing unit 3108 is configured to enable display (for example, with display enabling unit 3110) of an image on display unit 3102. The image is one image in an image sequence taken by a camera. The image sequence includes a representative image. The image sequence includes one or more images acquired by the camera after acquiring the representative image. The image sequence includes one or more images acquired by the camera before acquiring the representative image.
Processing unit 3108 is also configured, while enabling display of the image in the image sequence on display unit 3102, to detect (for example, with detecting unit 3112) a first input. Processing unit 3108 is also configured, in response to detecting the first input, to enable display (for example, with display enabling unit 3110) of a user interface for trimming the image sequence to a subset of less than all of the images in the image sequence. The user interface includes: an area that contains representations of images in the image sequence; a user-adjustable begin-trim icon that delimits a begin image in the subset of the image sequence via the position of the begin-trim icon in the area that contains representations of images in the image sequence; and a user-adjustable end-trim icon that delimits an end image in the subset of the image sequence via the position of the end-trim icon in the area that contains representations of images in the image sequence. The begin-trim icon is positioned at a first position, automatically selected by the device, in the area that contains representations of images in the image sequence. The end-trim icon is positioned at a second position, automatically selected by the device, in the area that contains representations of images in the image sequence.
Processing unit 3108 is also configured, while enabling display of the user interface for trimming the image sequence, to detect (for example, with detecting unit 3112) a second input. Processing unit 3108 is also configured, in response to detecting the second input, to trim (for example, with trimming unit 3114) the image sequence to the subset of the image sequence in accordance with the current position of the begin-trim icon and the current position of the end-trim icon.
The operations in the processes described above are, optionally, implemented by running one or more functional modules in an information processing apparatus such as a general-purpose processor (for example, as described above with respect to Figures 1A and 3) or an application-specific chip. The operations described above with respect to Figures 9A-9G are, optionally, implemented by the components depicted in Figures 1A-1B or Figure 13. For example, detection operation 908 is, optionally, implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information against respective event definitions 186 and determines whether a first contact at a first location on the touch-sensitive surface (or a rotation of the device) corresponds to a predefined event or sub-event, such as selection of an object on a user interface or rotation of the device from one orientation to another. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with detection of the event or sub-event. Event handler 190 optionally uses or calls data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in Figures 1A-1B.
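The event-handling path just described (event sorter, then event recognizer, then event handler) follows the structure used throughout the disclosure. The following sketch is only a generic rendition of that flow under assumed names; it is not the code of the components referenced above.

struct TouchEvent {
    var location: (x: Double, y: Double)
    var phase: String   // e.g. "began", "moved", "ended"
}

protocol EventRecognizer {
    // Compare the event against this recognizer's event definition.
    func matches(_ event: TouchEvent) -> Bool
    // The handler invoked when the recognizer fires.
    func handle(_ event: TouchEvent)
}

// The sorter's role: deliver each event to the first recognizer in the active view
// whose event definition the event matches, which then activates its handler.
func dispatch(_ event: TouchEvent, to recognizers: [EventRecognizer]) {
    for recognizer in recognizers where recognizer.matches(event) {
        recognizer.handle(event)
        break
    }
}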
The foregoing description has, for purposes of explanation, been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the utility model to the precise forms disclosed. For example, the methods described herein are also applicable, in an analogous manner, to electronic devices that are configured for management, playback, and/or streaming (for example, from an external server) of audio and/or visual content and that are in communication with a remote control and a display (for example, Apple TV from Apple Inc. of Cupertino, California). For such devices, inputs are optionally received that correspond to gestures on a touch-sensitive surface of the remote control, voice inputs to the remote control, and/or activation of buttons on the remote control, rather than the device itself having a touch-sensitive surface, an audio input device (for example, a microphone), and/or buttons. For such devices, data is optionally provided to the display rather than displayed by the device itself. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the utility model and its practical applications, and to thereby enable others skilled in the art to best use the utility model and the various described embodiments with various modifications as are suited to the particular use contemplated.
Claims (54)
1. a kind of electronic equipment is it is characterised in that include:
Display unit, it is display configured to live preview;
Camera unit, it is configured to gather image;And
Processing unit, it is coupled with described display unit and described camera unit, and described processing unit is configured to:
When in being in for the first acquisition of media pattern of described camera unit:
Described live preview is shown on described display unit;
When showing described live preview, detect the activation to shutter release button at the very first time;And
In response to the activation to described shutter release button at the described very first time is detected:
By by described camera unit with the time close to the described activation of described shutter release button in the described very first time on
The multiple images of collection are grouped in the first image sequence, and wherein said first image sequence includes:
The multiple of collection were being detected before the described very first time is to the activation of described shutter release button by described camera unit
Image;
Presentation graphics, described presentation graphics represents described first image sequence and by described camera unit described first
Gather after one or more of other images in image sequence image;And
Multiple images by the collection after gathering described presentation graphics of described camera unit.
2. electronic equipment according to claim 1 is it is characterised in that detecting at the described very first time to described fast
Before the activation of door button, the plurality of image of collection is the image of predefined quantity.
3. electronic equipment according to claim 1 is it is characterised in that detecting at the described very first time to described fast
Before the activation of door button, the plurality of image of collection is the image in the time predefined before the described very first time.
4. electronic equipment according to claim 1 is it is characterised in that detecting at the described very first time to described
Before the activation of shutter release button, the plurality of image of collection is predetermined before collecting the time of described presentation graphics
Image in the adopted time.
5. electronic equipment according to claim 1 is it is characterised in that detecting at the described very first time to described fast
Before the activation of door button, the plurality of image of collection comes the comfortable described very first time and the before the described very first time
Time range between two times, and gathered before the activation to described shutter release button at the described very first time is detected
The plurality of image is independent of detection interacting close to described second time and described shutter release button in time.
6. electronic equipment according to claim 1 is it is characterised in that detecting in institute in described first image sequence
Before stating the activation to described shutter release button at the very first time, the plurality of image of collection meets one or more predefined points
Group criterion.
7. electronic equipment according to claim 6 is it is characterised in that described predefined packet criterion includes selecting in detection
To the image to the predefined quantity before the activation of described shutter release button.
8. electronic equipment according to claim 6 is it is characterised in that described predefined packet criterion includes selecting described
The image of the predefined quantity before presentation graphics.
9. electronic equipment according to claim 6 is it is characterised in that described predefined packet criterion includes selecting immediately
Image in the time predefined scope before the activation to described shutter release button is detected.
10. electronic equipment according to claim 6 is it is characterised in that described predefined packet criterion includes selecting tight
It is connected on the image in the time predefined scope before collecting the time of described presentation graphics.
11. electronic equipments according to claim 1 are it is characterised in that described equipment is entering described first acquisition of media
Start collection storage image after pattern, and delete afterwards and be not grouped into when being in described first acquisition of media pattern
In time close at the corresponding time to the image in the corresponding multiple images of the activation of described shutter release button.
12. electronic equipments according to claim 1 are it is characterised in that described equipment is opened after showing described live preview
Begin collection storage image, and delete afterwards and be not grouped in time when being in described first acquisition of media pattern
Close at the corresponding time to the image in the corresponding multiple images of the activation of described shutter release button.
13. electronic equipments according to claim 1 are it is characterised in that described equipment is only when showing described live preview
Stand on and detect that the activation to described shutter release button to gather and storage image, and deletion afterwards ought be in described first media and adopt
It is not grouped into when in integrated mode in time close to the accordingly multiple figures to the activation of described shutter release button at the corresponding time
The image that institute in picture gathers and stores.
14. electronic equipments according to claim 1 are it is characterised in that described first image sequence is deposited in memory
Store up as the first unique image collection.
15. electronic equipments according to claim 1 are it is characterised in that described live preview shows figure with first resolution
Picture, and the image of described first resolution that described first image sequence includes being displayed in described live preview.
16. electronic equipments according to claim 15 are it is characterised in that the described representativeness that gathered by described camera unit
Image has the second resolution higher than described first resolution.
17. electronic equipments according to claim 1 are it is characterised in that described processing unit is configured to:
In response to the activation to described shutter release button at the described very first time is detected:
The audio frequency corresponding with described first image sequence is associated with described first image sequence.
18. electronic equipments according to claim 1 are it is characterised in that described processing unit is configured to:
In response to the activation to described shutter release button at the described very first time is detected:
The metadata corresponding with described first image sequence is associated with described first image sequence.
19. electronic equipments according to claim 1 it is characterised in that described first acquisition of media pattern be configured to by
The user of described equipment enables or disables.
20. electronic equipments according to claim 19 it is characterised in that:
Described live preview be shown as including for enable described first acquisition of media pattern can piece supplying media capture use
The part at family interface;
When described first acquisition of media pattern is activated, described piece supplying demonstration can be animated;And
When described first acquisition of media pattern is disabled, described piece supplying can not be animated demonstration.
21. electronic equipments according to claim 1 are it is characterised in that be used in response to detecting to described shutter release button
Corresponding activation and the parameter of respective image sequence that is grouped can be by the user configuring of described equipment.
22. electronic equipments according to claim 1 it is characterised in that:
Described live preview be shown as including for enable described first acquisition of media pattern can piece supplying media capture use
The part at family interface;And
Described shutter release button is shown in the software push buttons in described media capture user interface;And
Described processing unit is configured to:
In response to the described activation to described shutter release button is detected, display and described shutter release button phase on described display unit
The animation of association, described animation continues to gather for institute with described camera unit after the described activation to described shutter release button
State the corresponding time quantum of the time quantum of the image of the first image sequence.
23. electronic equipments according to claim 1 are it is characterised in that detected described by described camera unit
At one time to before the activation of described shutter release button collection the plurality of image detect right at the described very first time
It is stored in memorizer with the first form before the activation of described shutter release button, and in response to detecting when described first
Between place the activation of described shutter release button is stored in described memorizer with the second form.
24. electronic equipments according to claim 1 are it is characterised in that described processing unit is configured to:
After the activation to described shutter release button at the described very first time is detected, detection is at the second time to described fast
Next activation of door button;And
In response to detecting at described second time to next activation described in described shutter release button:
By by described camera unit with the time close to the described activation of described shutter release button in described second time on
The multiple images of collection are grouped in the second image sequence, and wherein said second image sequence includes:
The multiple of before the activation to described shutter release button for described second time collection are being detected by described camera unit
Image;And
Presentation graphics, described presentation graphics represents described second image sequence and by described camera unit described second
Gather after one or more of other images in image sequence image.
25. electronic equipments according to claim 1 are it is characterised in that described processing unit is configured to automatically by mould
Paste image is excluded from described first image sequence.
26. electronic equipments according to claim 1 it is characterised in that:
Described first image sequence includes:
Initial pictures in described first image sequence,
The image of the first quantity of collection between described initial pictures and described presentation graphics,
Final image in described first image sequence, and
The image of the second quantity of collection between described presentation graphics and described final image;And
Described processing unit is configured to:
The detection input corresponding with order to change the request of the described presentation graphics in described first image sequence;And
Corresponding with order to change the described request of the described presentation graphics in described first image sequence in response to detecting
Described input:
Described presentation graphics is changed into by the presentation graphics being corrected according to the described input detecting;And
Change the grouped plurality of images in the described first image sequence, in accordance with the described detected input, by adding an image at one end of the described first image sequence and deleting an image at the other end of the described first image sequence, so that the described first image sequence has a corrected initial image and a corrected final image.
27. electronic equipments according to claim 1 it is characterised in that described display unit is touch-sensitive display unit, and
Described processing unit is configured to:
Receive in order to show the request of the described presentation graphics from described first image sequence;
In response to receiving to show the described request of described presentation graphics, described touch-sensitive display unit show described
Presentation graphics;
When showing described presentation graphics, the touch receiving on described touch-sensitive display unit on described presentation graphics is defeated
Enter, described touch input includes the feature changing over;And
In response to the described touch input on described presentation graphics is received on described touch-sensitive display unit, touch described
On quick display unit, with the described feature based on described touch input, the speed determining that changes in time shows described the
Image in one image sequence.
28. A device for grouping a plurality of images, characterised in that it comprises:
A part that is activated while in a first acquisition of media pattern of a camera of an electronic equipment that includes a display, the part including:
For showing the part of live preview on the display;
For detecting the part of the activation to shutter release button at the very first time when showing described live preview;And
For the part of following operation being executed to the activation of described shutter release button in response to detecting at the described very first time:
By by described camera with the time close to the described activation of described shutter release button in the described very first time on gather
Multiple images be grouped in the first image sequence, wherein said first image sequence includes:
By described camera in multiple images collection before the described very first time is to the activation of described shutter release button is detected;
Presentation graphics, described presentation graphics represents described first image sequence and by described camera in described first image
Gather after one or more of other images in sequence image;And
Multiple images by the collection after gathering described presentation graphics of described camera.
29. devices according to claim 28 are it is characterised in that detecting at the described very first time to described shutter
Before the activation of button, the plurality of image of collection is the image of predefined quantity.
30. devices according to claim 28 are it is characterised in that detecting at the described very first time to described shutter
Before the activation of button, the plurality of image of collection is the image in the time predefined before the described very first time.
31. The apparatus according to claim 28, characterised in that the plurality of images acquired before detecting the activation of the shutter button at the first time are images from within a predefined time before the time at which the representative image was acquired.
32. The apparatus according to claim 28, characterised in that the plurality of images acquired before detecting the activation of the shutter button at the first time are from a range of time between the first time and a second time that precedes the first time, and the plurality of images are acquired before detecting the activation of the shutter button at the first time independently of detecting any interaction with the shutter button in temporal proximity to the second time.
33. The apparatus according to claim 28, characterised in that the plurality of images in the first image sequence acquired before detecting the activation of the shutter button at the first time meet one or more predefined grouping criteria.
34. The apparatus according to claim 33, characterised in that the predefined grouping criteria include selecting a predefined number of images acquired before detecting the activation of the shutter button.
35. The apparatus according to claim 33, characterised in that the predefined grouping criteria include selecting a predefined number of images acquired before the representative image.
36. The apparatus according to claim 33, characterised in that the predefined grouping criteria include selecting images within a predefined range of time immediately before detecting the activation of the shutter button.
37. The apparatus according to claim 33, characterised in that the predefined grouping criteria include selecting images within a predefined range of time immediately before the time at which the representative image was acquired.
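Claims 33–37 describe two families of predefined grouping criteria, by count and by time window. The sketch below illustrates both under the assumption of timestamped frames; `GroupingCriterion` and `selectPreShutterFrames` are invented names, and the `Frame` type repeats the minimal shape used in the earlier sketch.

```swift
import Foundation

struct Frame {
    let timestamp: TimeInterval   // same minimal shape as in the earlier sketch
}

// Two ways of picking the pre-shutter frames, mirroring the count-based
// criteria (claims 34–35) and the time-window criteria (claims 36–37).
enum GroupingCriterion {
    case lastN(Int)
    case within(TimeInterval)
}

func selectPreShutterFrames(_ frames: [Frame],        // assumed sorted by timestamp
                            before cutoff: TimeInterval,
                            criterion: GroupingCriterion) -> [Frame] {
    let earlier = frames.filter { $0.timestamp < cutoff }
    switch criterion {
    case .lastN(let maxCount):
        return Array(earlier.suffix(maxCount))        // the N most recent frames
    case .within(let window):
        return earlier.filter { $0.timestamp >= cutoff - window }
    }
}
```

The cutoff can be either the activation time or the time at which the representative image was acquired, which is the only difference between the paired claims.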
38. The apparatus according to claim 28, characterised in that the device begins acquiring and storing images upon entering the first media acquisition mode, and afterwards deletes images that, while in the first media acquisition mode, were not grouped into a plurality of images corresponding to an activation of the shutter button in temporal proximity to a respective time.
39. The apparatus according to claim 28, characterised in that the device begins acquiring and storing images upon displaying the live preview, and afterwards deletes images that, while in the first media acquisition mode, were not grouped into a plurality of images corresponding to an activation of the shutter button in temporal proximity to a respective time.
40. The apparatus according to claim 28, characterised in that the device acquires and stores images while displaying the live preview, independently of detecting activation of the shutter button, and afterwards deletes acquired and stored images that, while in the first media acquisition mode, were not grouped into a plurality of images corresponding to an activation of the shutter button in temporal proximity to a respective time.
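Claims 38–40 amount to continuous capture with later deletion of frames that were never grouped. One way to picture this is a fixed-capacity rolling buffer, sketched below; `RollingFrameBuffer`, its 90-frame capacity, and the `takeFrames(near:window:)` helper are assumptions made for the example, not structures named by the patent.

```swift
import Foundation

struct Frame {
    let timestamp: TimeInterval   // same minimal shape as in the earlier sketches
}

final class RollingFrameBuffer {
    private(set) var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int = 90) {    // e.g. roughly 3 s at 30 fps (assumed value)
        self.capacity = capacity
    }

    // Called for every frame while the live preview is shown, regardless of
    // whether the shutter button has been activated.
    func append(_ frame: Frame) {
        frames.append(frame)
        // Frames that were never grouped fall off the back and are deleted.
        if frames.count > capacity {
            frames.removeFirst(frames.count - capacity)
        }
    }

    // When the shutter is activated, the frames near the activation time are
    // handed off for grouping; everything else keeps rolling and is pruned later.
    func takeFrames(near activationTime: TimeInterval,
                    window: TimeInterval) -> [Frame] {
        frames.filter { abs($0.timestamp - activationTime) <= window }
    }
}
```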
41. The apparatus according to claim 28, characterised in that the first image sequence is stored in memory as a first distinct set of images.
42. The apparatus according to claim 28, characterised in that the live preview displays images at a first resolution, and the first image sequence includes the images, at the first resolution, that were displayed in the live preview.
43. The apparatus according to claim 42, characterised in that the representative image acquired by the camera has a second resolution that is higher than the first resolution.
44. The apparatus according to claim 28, characterised in that it includes:
means for performing the following operation in response to detecting the activation of the shutter button at the first time:
associating audio that corresponds to the first image sequence with the first image sequence.
45. The apparatus according to claim 28, characterised in that it includes:
means for performing the following operation in response to detecting the activation of the shutter button at the first time:
associating metadata that corresponds to the first image sequence with the first image sequence.
46. The apparatus according to claim 28, characterised in that the first media acquisition mode is configured to be enabled or disabled by a user of the device.
47. The apparatus according to claim 46, characterised in that:
the live preview is displayed as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode;
the affordance is animated while the first media acquisition mode is enabled; and
the affordance is not animated while the first media acquisition mode is disabled.
48. The apparatus according to claim 28, characterised in that parameters of a respective image sequence grouped in response to detecting a respective activation of the shutter button are configurable by a user of the device.
49. The apparatus according to claim 28, characterised in that:
the live preview is displayed as part of a media capture user interface that includes an affordance for enabling the first media acquisition mode;
the shutter button is a software button displayed in the media capture user interface; and
the device includes:
means for displaying an animation associated with the shutter button in response to detecting the activation of the shutter button, the animation continuing for an amount of time that corresponds to the amount of time during which the camera acquires images for the first image sequence after the activation of the shutter button.
50. The apparatus according to claim 28, characterised in that the plurality of images acquired by the camera before detecting the activation of the shutter button at the first time are stored in memory in a first format before the activation of the shutter button at the first time is detected, and are stored in the memory in a second format in response to detecting the activation of the shutter button at the first time.
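Claim 50 distinguishes a first storage format used before the shutter press from a second format used once the images are kept. The sketch below shows one hypothetical reading, a transient in-memory representation that is persisted to disk on activation; `StoredFrame` and `persist(_:to:name:)` are invented for the illustration and are not formats or functions named by the patent.

```swift
import Foundation

// Two representations for the same frame data.
enum StoredFrame {
    case transient(Data)    // "first format": e.g. raw frame bytes held in RAM
    case persistent(URL)    // "second format": e.g. an encoded file on disk
}

// Convert a transient frame into the persistent representation when the
// shutter activation makes it part of a kept image sequence.
func persist(_ frame: StoredFrame, to directory: URL, name: String) throws -> StoredFrame {
    switch frame {
    case .persistent:
        return frame                      // already in the second format
    case .transient(let data):
        let url = directory.appendingPathComponent(name)
        try data.write(to: url)           // re-store in the second format
        return .persistent(url)
    }
}
```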
51. The apparatus according to claim 28, characterised in that it includes:
means for detecting a next activation of the shutter button at a second time, after detecting the activation of the shutter button at the first time; and
means for performing the following operations in response to detecting the next activation of the shutter button at the second time:
grouping a plurality of images acquired by the camera in temporal proximity to the activation of the shutter button at the second time into a second image sequence, wherein the second image sequence includes:
a plurality of images acquired by the camera before detecting the activation of the shutter button at the second time; and
a representative image, the representative image representing the second image sequence and having been acquired by the camera after one or more of the other images in the second image sequence.
52. The apparatus according to claim 28, characterised in that it includes means for automatically excluding blurred images from the first image sequence.
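Claim 52's automatic exclusion of blurred images could be as simple as filtering on a per-frame sharpness score, as sketched below; the `ScoredFrame` type, the assumption that a sharpness score has already been computed elsewhere (for example by a variance-of-Laplacian pass), and the 0.35 threshold are all placeholders.

```swift
import Foundation

struct ScoredFrame {
    let timestamp: TimeInterval
    let sharpness: Double     // assumed precomputed sharpness score, higher is sharper
}

// Keep only frames whose sharpness clears an example threshold.
func excludingBlurred(_ frames: [ScoredFrame],
                      threshold: Double = 0.35) -> [ScoredFrame] {
    frames.filter { $0.sharpness >= threshold }
}
```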
53. The apparatus according to claim 28, characterised in that:
the first image sequence includes:
an initial image in the first image sequence,
a first number of images acquired between the initial image and the representative image,
a final image in the first image sequence, and
a second number of images acquired between the representative image and the final image; and
the device includes:
means for detecting an input that corresponds to a request to change the representative image in the first image sequence; and
in response to detecting the input that corresponds to the request to change the representative image in the first image sequence:
means for changing the representative image into a revised representative image in accordance with the detected input; and
means for changing the grouped plurality of images in the first image sequence, by adding images at one end of the first image sequence and deleting images at the other end of the first image sequence in accordance with the detected input, such that the first image sequence has a revised initial image and a revised final image.
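Claim 53 pairs the change of representative image with a shift of the grouped range: images are added at one end and dropped from the other so the sequence gains a revised initial and final image. A generic index-shifting sketch follows; `shiftSequence` and its parameters are invented for the example and assume the sequence was drawn from a larger array of captured frames.

```swift
// Shift the grouped window by the same offset the representative image moved,
// clamping to the bounds of the full capture. Generic so it does not depend
// on a particular frame type.
func shiftSequence<Element>(allFrames: [Element],
                            currentRange: Range<Int>,
                            oldRepresentativeIndex: Int,
                            newRepresentativeIndex: Int) -> Range<Int> {
    let offset = newRepresentativeIndex - oldRepresentativeIndex
    let upper = min(allFrames.count, currentRange.upperBound + offset)
    let lower = min(max(0, currentRange.lowerBound + offset), upper)
    return lower..<upper   // same length when away from the edges: new first and last images
}
```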
54. The apparatus according to claim 28, characterised in that the display is a touch-sensitive display, and the device includes:
means for receiving a request to display the representative image from the first image sequence;
means for displaying the representative image on the touch-sensitive display in response to receiving the request to display the representative image;
means for receiving a touch input on the representative image on the touch-sensitive display while displaying the representative image, the touch input including a characteristic that changes over time; and
means for displaying, in response to receiving the touch input on the representative image on the touch-sensitive display, the images in the first image sequence at a rate determined based on the change over time of the characteristic of the touch input.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201620470063.8U CN205942662U (en) | 2016-05-20 | 2016-05-20 | Electronic device and apparatus for grouping a plurality of images |
Publications (1)
Publication Number | Publication Date |
---|---|
CN205942662U true CN205942662U (en) | 2017-02-08 |
Family
ID=57935183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201620470063.8U Active CN205942662U (en) | 2016-05-20 | 2016-05-20 | Electronic device and apparatus for grouping a plurality of images |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN205942662U (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110008364A (en) * | 2019-03-25 | 2019-07-12 | Lenovo (Beijing) Co., Ltd. | Image processing method, device and system |
CN113473013A (en) * | 2021-06-30 | 2021-10-01 | Spreadtrum Communications (Tianjin) Co., Ltd. | Display method and apparatus for image beautification effect, and terminal device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN205788149U (en) | | Electronic device and apparatus for displaying image |
CN106227441A (en) | | Apparatus and method for capturing and interacting with enhanced digital images |
CN109644217A (en) | | Devices, methods and graphical user interfaces for capturing and recording media in various modes |
CN108710462A (en) | | Device, method and graphical user interface for manipulating user interface objects with visual and/or tactile feedback |
CN205942662U (en) | | Electronic device and apparatus for grouping a plurality of images |
AU2023226703B2 (en) | | Devices and methods for capturing and interacting with enhanced digital images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C14 | Grant of patent or utility model | |
| GR01 | Patent grant | |