US20200279419A1 - Methods and devices for manipulating graphical views of data - Google Patents
- Publication number
- US20200279419A1 (application US16/877,373)
- Authority
- US
- United States
- Prior art keywords
- chart
- touch
- display
- predefined
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/206—Drawing of charts or graphs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/177—Editing, e.g. inserting or deleting of tables; using ruled lines
- G06F40/18—Editing, e.g. inserting or deleting of tables; using ruled lines of spreadsheets
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- This invention relates generally to devices and methods for displaying graphical views of data.
- The invention relates specifically to devices and methods for manipulating user interfaces that display graphical views of data.
- Data visualization is a powerful tool for exploring large data sets, both by itself and coupled with data mining algorithms. Graphical views provide user-friendly ways to visualize and interpret data. However, the task of effectively visualizing large databases imposes significant demands on the human-computer interface to the visualization system.
- Such methods and interfaces may complement or replace conventional methods for visualizing data.
- Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface.
- For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
- some embodiments include methods for visualizing data.
- a method is performed at an electronic device with a touch-sensitive surface and a display.
- the method includes displaying a first chart on the display.
- the first chart concurrently displays a first set of categories, and each respective category in the first set of categories has a corresponding visual mark displayed in the first chart.
- the method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart.
- the method further includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart: removing the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition; and updating display of the first chart.
- the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
- the method includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, ceasing to display the first visual mark.
- the method includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, displaying an indicium that the first category has been removed.
- the method includes, while displaying the indicium that the first category has been removed, changing from displaying the first chart with the first set of categories, other than the first category, to displaying a second chart.
- the second chart concurrently displays a second set of categories that are distinct from the first set of categories, and each respective category in the second set of categories has a corresponding visual mark displayed in the second chart.
- the method also includes, while displaying the second chart with the second set of categories, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the indicium that the first category has been removed and, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the first category has been removed, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart.
- updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart includes reordering display of the second set of categories in the second chart.
- the method includes, after updating display of the second chart to reflect inclusion of data that corresponds to the first category, detecting a third touch input, and, in response to detecting the third touch input, updating display of the second chart to reflect removal of data that corresponds to the first category in the first chart.
- the method includes, while displaying the first chart on the display, detecting a fourth touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a second visual mark for a second category in the first chart.
- the method also includes, in response to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart: maintaining display of the second category and the second visual mark in the second chart; removing display of all categories, other than the second category, in the first set of categories; and removing display of all visual marks, other than the second visual mark, that correspond to categories in the first set of categories.
- the method includes, in response to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, displaying an indicium that only the second category in the first set of categories remains displayed.
- the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface and the fourth touch input is a drag gesture or a swipe gesture that moves in a second predefined direction on the touch-sensitive surface that is distinct from the first predefined direction.
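The remove-versus-keep gesture logic described above can be sketched as follows. This is an illustrative model only; `Chart`, `handle_swipe`, the direction strings, and `restore` are hypothetical names, not from the patent.

```python
# Illustrative sketch of the swipe-to-remove / swipe-to-keep behavior
# described above; all names here are made up for illustration.
from dataclasses import dataclass, field

@dataclass
class Chart:
    categories: dict                              # category -> aggregate value
    removed: list = field(default_factory=list)   # indicia of removed categories

def handle_swipe(chart, category, direction):
    """A swipe in the first predefined direction removes the touched
    category; a swipe in a second, distinct direction keeps only it."""
    if direction == "remove":
        chart.categories.pop(category, None)
        chart.removed.append(category)            # shown as an indicium
    elif direction == "keep":
        kept = {category: chart.categories[category]}
        chart.removed.extend(c for c in chart.categories if c != category)
        chart.categories = kept
    return chart

def restore(chart, category, value):
    """Tapping the indicium re-includes the removed category's data."""
    if category in chart.removed:
        chart.removed.remove(category)
        chart.categories[category] = value
    return chart
```

A subsequent touch on the indicium (`restore` above) corresponds to the second touch input that re-includes the removed category's data in the updated chart.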
- a method is performed at an electronic device with a touch-sensitive surface and a display.
- the method includes displaying a first chart on the display.
- the first chart is derived from a set of data.
- the first chart concurrently displays a first set of categories and a label for the first set of categories.
- Each respective category in the first set of categories has a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories.
- the method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories.
- the method further includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, replacing display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition.
- the second chart is derived from the set of data.
- the second chart concurrently displays a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories.
- Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
- the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
- a label for the first field and aggregation type is displayed with the first chart, and the label for the first field and aggregation type continues to be displayed with the second chart.
- a label for the first field and aggregation type is displayed with the first chart and the method includes, in response to detecting the first touch input: displaying an animation of the second set of categories replacing the first set of categories; displaying an animation of the label for the second set of categories replacing the label for the first set of categories; and maintaining display of the label for the first field and aggregation type.
- the method includes, while displaying the second chart with the second set of categories, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of an indicium that a predefined subset of data is not included in the aggregated values of the first field.
- the method also includes, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the predefined subset of data is not included in the aggregated values of the first field, updating display of the second chart to reflect inclusion of the predefined subset of data in the aggregated values.
- updating display of the second chart to reflect inclusion of the predefined subset of data includes reordering display of the second set of categories in the second chart.
- the method includes, after updating display of the second chart to reflect inclusion of the predefined subset of data, detecting a third touch input.
- the method also includes, in response to detecting the third touch input, updating display of the second chart to reflect removal of the predefined subset of data.
- replacing display of the first chart with the second chart via the animated transition in response to detecting the first touch input occurs without displaying a selection menu.
- the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface and the method includes, while displaying the second chart, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories.
- the method also includes, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, displaying a selection menu with possible sets of categories to display in a third chart.
- the method further includes detecting selection of a respective set of categories in the selection menu; and, in response to detecting selection of the respective set of categories in the selection menu: replacing display of the second chart with a third chart that contains the selected respective set of categories; and ceasing to display the selection menu.
- the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface; and the method includes, while displaying the second chart, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. The method also includes, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, displaying a selection menu with possible sets of categories to display in a third chart.
- the method further includes detecting selection of a first set of categories in the selection menu and a second set of categories in the selection menu; and, in accordance with detecting selection of the first set of categories in the selection menu and the second set of categories in the selection menu: replacing display of the second chart with a third chart that contains the first set of categories and the second set of categories; and ceasing to display the selection menu.
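The data-side effect of dragging the category label, as described above, is re-aggregating the same set of data by a different category field while the aggregated field stays fixed. A minimal sketch, with made-up field names:

```python
# Hypothetical sketch of the re-aggregation behind the label-drag
# gesture: the same underlying rows, grouped by a different category
# field. Field and function names are illustrative only.
from collections import defaultdict

def aggregate(rows, category_field, value_field, agg="sum"):
    """Aggregate value_field per category; swapping category_field is
    the data-side effect of dragging the axis label to a new level."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[category_field]].append(row[value_field])
    if agg == "sum":
        return {k: sum(v) for k, v in groups.items()}
    if agg == "avg":
        return {k: sum(v) / len(v) for k, v in groups.items()}
    raise ValueError(f"unsupported aggregation: {agg}")
```

Note that the label for the first field and aggregation type (e.g. a sum of a sales field) would remain displayed across the transition; only the grouping key changes.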
- a method is performed at an electronic device with a touch-sensitive surface and a display.
- the method includes displaying a chart on the display.
- the chart has a horizontal axis and a vertical axis.
- the horizontal axis includes first horizontal scale markers.
- the vertical axis includes first vertical scale markers.
- the method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart.
- the method further includes, while detecting the first touch input: horizontally expanding a portion of the chart such that a distance between first horizontal scale markers increases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same.
- the first touch input is a de-pinch gesture.
- the method includes, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases and while continuing to detect the first touch input: continuing to horizontally expand a portion of the chart; displaying second horizontal scale markers, the second horizontal scale markers being at a finer scale than the first horizontal scale markers; and continuing to maintain the vertical scale of the chart.
- the method includes, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases and while continuing to detect the first touch input: continuing to horizontally expand a portion of the chart; replacing a first set of displayed data marks with a second set of displayed data marks, where for at least some of the data marks in the first set of data marks, an individual data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks; and continuing to maintain the vertical scale of the chart.
- the method includes, after horizontally expanding the portion of the chart and maintaining the vertical scale of the chart while detecting the first touch input, ceasing to detect the first touch input.
- the method also includes, in response to ceasing to detect the first touch input, changing a vertical scale of the chart.
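The de-pinch behavior described above stretches the horizontal axis while leaving the vertical scale untouched until the touch input ends, at which point the vertical axis may be refit. A rough sketch under those assumptions (function names are illustrative):

```python
# Sketch of the horizontal de-pinch: abscissas stretch about an anchor
# while ordinates are unchanged; only after the gesture ends is the
# vertical scale refit to the target range. Names are illustrative.
def expand_horizontal(points, factor, anchor_x=0.0):
    """Stretch abscissas about anchor_x; ordinates are unchanged."""
    return [((x - anchor_x) * factor + anchor_x, y) for x, y in points]

def refit_vertical(points, target_min=0.0, target_max=1.0):
    """After the touch input ceases, rescale ordinates to the target range."""
    ys = [y for _, y in points]
    lo, hi = min(ys), max(ys)
    span = (hi - lo) or 1.0
    return [(x, target_min + (y - lo) * (target_max - target_min) / span)
            for x, y in points]
```

Finer-grained horizontal scale markers (and any replacement of coarse data marks with a plurality of finer ones) would be driven by the accumulated horizontal factor during the gesture.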
- a method is performed at an electronic device with a touch-sensitive surface and a display.
- the method includes displaying at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks.
- the method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart and, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, zooming in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks.
- the method further includes, while displaying the second portion of the chart at the second magnification, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart.
- the method further includes, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart: in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooming in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks; and, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displaying information about the first data mark.
- the second touch input is a same type of touch input as the first touch input.
- the information about the first data mark comprises a data record that corresponds to the first data mark.
- the data-mark-information-display criteria include the second magnification being a predefined magnification.
- the data-mark-information-display criteria include the first data mark in the plurality of data marks being the only data mark displayed at the second magnification after the first touch input.
- the data-mark-information-display criteria include the first data mark reaching a predefined magnification during the second touch input.
- the data-mark-information-display criteria include the device zooming in to display only the first data mark in the plurality of data marks during the second touch input.
- the method includes, in accordance with the determination that one or more predefined data-mark-information-display criteria are met, ceasing to display the first data mark.
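The zoom-versus-details decision described above can be sketched as a small dispatcher: repeated zoom gestures keep magnifying until the data-mark-information-display criteria are met, after which the device shows the record behind the remaining mark instead. The criteria and names below are illustrative, not the patent's definitive set.

```python
# Hypothetical decision logic for one zoom-in touch input. The two
# criteria modeled here (single visible mark, or a predefined maximum
# magnification) are examples drawn from the text; real criteria may differ.
def on_zoom_gesture(visible_marks, magnification, max_magnification=16):
    """Return the action to take for one zoom-in touch input."""
    criteria_met = (len(visible_marks) == 1
                    or magnification >= max_magnification)
    if criteria_met:
        # e.g. display the data record for the first mark in view
        return ("show_info", visible_marks[0])
    return ("zoom", magnification * 2)
```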
- a method is performed at an electronic device with a touch-sensitive surface and a display.
- the method includes displaying a chart on the display, the chart including a plurality of data marks and detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart, the first predefined area having a corresponding first value.
- the method also includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart: selecting the first predefined area and visually distinguishing the first predefined area.
- the method further includes, while the first predefined area is selected, detecting a second touch input on the touch-sensitive surface and, in response to detecting the second touch input on the touch-sensitive surface: visually distinguishing a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area; and displaying a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
- the first touch input is a tap gesture.
- the first predefined area includes a column in the chart.
- the first predefined area includes a single data mark in the plurality of data marks.
- data marks in the plurality of data marks are displayed in corresponding columns in the chart, with a single data mark per column.
- data marks in the plurality of data marks are separated horizontally from one another.
- the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of the first predefined area.
- the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of an edge of the first predefined area.
- the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of a selection handle in or next to the first predefined area.
- the second touch input is a drag gesture.
- the method includes detecting movement of a finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values.
- the method also includes, in response to detecting movement of the finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values, displaying a series of changes between the first value in the first predefined area and the corresponding values of the sequence of predefined areas.
- a selected area in the chart comprises the first predefined area and the sequence of predefined areas.
- the method includes detecting a third touch input, the third touch input including initial contact of a finger at a location on the touch-sensitive surface that corresponds to a location on the display within the selected area in the chart, and movement of the finger across the touch-sensitive surface.
- the method also includes, in response to detecting the third touch input: moving the selected area across the chart, in accordance with the movement of the finger across the touch-sensitive surface, while maintaining a number of predefined areas in the moved selected area equal to the number of predefined areas in the sequence of predefined areas plus one; and displaying a change between a value corresponding to a leftmost predefined area in the moved selected area and a value corresponding to a rightmost predefined area in the moved selected area.
- a selected area in the chart comprises the first predefined area and the sequence of predefined areas
- the method includes detecting a fourth touch input.
- the method also includes, in response to detecting the fourth touch input: zooming in on the selected area in the chart; in accordance with a determination that areas in the chart outside the selected area are still displayed on the display, maintaining selection of the selected area; and in accordance with a determination that only areas in the chart in the selected area are displayed on the display, ceasing selection of the selected area.
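The deselection rule above can be expressed as a single comparison between the visible range and the selected range after the zoom. The sketch below assumes both ranges are represented as index intervals; this is an illustration of the stated rule, not the patented code:

```python
def selection_after_zoom(visible_range, selection_range):
    """After zooming in on a selection, keep it selected only if areas
    outside the selection remain on the display, i.e., the visible range
    extends beyond the selection on at least one side. Returns the
    selection to keep, or None if selection should cease."""
    vis_lo, vis_hi = visible_range
    sel_lo, sel_hi = selection_range
    outside_still_visible = vis_lo < sel_lo or vis_hi > sel_hi
    return selection_range if outside_still_visible else None
```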
- a method is performed at an electronic device with a touch-sensitive surface and a display.
- the method includes displaying a chart on the display.
- the chart has a horizontal axis with a first horizontal scale with first horizontal scale markers.
- the chart has a vertical axis with a first vertical scale with first vertical scale markers.
- the chart includes a first set of data marks. Each respective data mark in the first set of data marks has a respective abscissa and a respective ordinate.
- the chart includes a line that connects adjacent data marks in the first set of data marks.
- the method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart and, while detecting the first touch input: expanding at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input; expanding at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input; adding a second set of data marks, distinct from the first set of data marks, on the line.
- Each respective data mark in the second set of data marks includes a respective abscissa and a respective ordinate.
- Each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark.
- the method further includes, after adding the second set of data marks on the line: for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moving the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis; and animatedly adjusting the line so that the line connects the second set of data marks.
- adjacent data marks in the first set of data marks are separated by a first horizontal distance.
- adjacent data marks in the second set of data marks are separated by a second horizontal distance that corresponds to a second horizontal scale that is finer than the first horizontal scale.
- each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark and the ordinate of the line at the respective abscissa of the respective data mark.
- a shape of the line is maintained when the second set of data marks is added to the line.
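The two-phase behavior described above — first dropping each new mark onto the existing line at its abscissa (so the line's shape is unchanged), then animating each mark to its true ordinate — can be sketched with linear interpolation along the coarse line. Function names and the frame-based animation model are assumptions for illustration:

```python
def ordinate_on_line(marks, x):
    """Linearly interpolate the line's ordinate at abscissa x, where
    `marks` is the first (coarse) set of data marks as (x, y) pairs
    sorted by abscissa."""
    for (x0, y0), (x1, y1) in zip(marks, marks[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("abscissa outside the line's extent")

def place_then_settle(coarse_marks, fine_marks, steps=4):
    """Phase 1: place each fine mark on the line based on its abscissa,
    independent of its own ordinate. Phase 2: return animation frames
    moving each mark vertically toward its respective ordinate."""
    placed = [(x, ordinate_on_line(coarse_marks, x)) for x, _ in fine_marks]
    frames = []
    for step in range(1, steps + 1):
        t = step / steps
        frames.append([(x, y0 + t * (y1 - y0))
                       for (x, y0), (_, y1) in zip(placed, fine_marks)])
    return placed, frames
```

For example, with a coarse line from (0, 0) to (10, 10), a fine mark whose true position is (5, 8) is first placed at (5, 5) on the line, then animated up to (5, 8).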
- a single data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks.
- animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs while detecting the first input.
- animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs after ceasing to detect the first input.
- the second vertical scale is the same as the first vertical scale.
- animatedly moving each respective data mark vertically and animatedly adjusting the line so that the line connects the second set of data marks occur concurrently.
- the method includes ceasing to display the first set of data marks when the second set of data marks is added.
- the method includes ceasing to display the first set of data marks after the second set of data marks is added.
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for displaying a first chart on the display.
- the first chart concurrently displays a first set of categories, and each respective category in the first set of categories has a corresponding visual mark displayed in the first chart.
- the one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart.
- the one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart: removing the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition; and updating display of the first chart.
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for displaying a first chart on the display.
- the first chart is derived from a set of data.
- the first chart concurrently displays a first set of categories and a label for the first set of categories.
- Each respective category in the first set of categories has a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories.
- the one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories.
- the one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, replacing display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition.
- the second chart is derived from the set of data.
- the second chart concurrently displays a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories.
- Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for displaying a chart on the display.
- the chart has a horizontal axis and a vertical axis.
- the horizontal axis includes first horizontal scale markers.
- the vertical axis includes first vertical scale markers.
- the one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart.
- the one or more programs further include instructions for, while detecting the first touch input: horizontally expanding a portion of the chart such that a distance between first horizontal scale markers increases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same.
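Horizontally expanding a chart while keeping the vertical scale fixed is an anisotropic transform: abscissas are stretched about an anchor while ordinates pass through unchanged. A minimal sketch, with an assumed anchor-based stretch (the source does not specify the transform's center):

```python
def expand_horizontally(point, anchor_x, scale_x):
    """Map a chart point under a horizontal-only expansion: abscissas
    are stretched about anchor_x by scale_x, so the distance between
    horizontal scale markers increases; ordinates are untouched, so the
    distance between vertical scale markers remains the same."""
    x, y = point
    return (anchor_x + (x - anchor_x) * scale_x, y)

# Hypothetical example: doubling the horizontal scale about x = 2.
stretched = expand_horizontally((4, 7), 2, 2.0)  # (6.0, 7)
```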
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for displaying at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks.
- the one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart and, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, zooming in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks.
- the one or more programs further include instructions for, while displaying the second portion of the chart at the second magnification, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart.
- the one or more programs further include instructions for, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart: in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooming in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks; and, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displaying information about the first data mark.
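The branch above — zoom further versus show details about the data mark — hinges on whether the data-mark-information-display criteria are met. The source does not enumerate the criteria, so the sketch below assumes two plausible ones (maximum magnification reached, and the input landing on a mark) purely for illustration:

```python
def handle_second_touch(magnification, max_magnification, touch_on_mark):
    """Dispatch a touch input received while already zoomed in. The
    criteria here (at maximum magnification AND the touch lands on a data
    mark) are assumptions; the patent only requires *some* predefined
    data-mark-information-display criteria."""
    if magnification >= max_magnification and touch_on_mark:
        return "display-mark-info"
    return "zoom-in"
```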
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for displaying a chart on the display, the chart including a plurality of data marks.
- the one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart, the first predefined area having a corresponding first value.
- the one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart: selecting the first predefined area; and visually distinguishing the first predefined area.
- the one or more programs further include instructions for, while the first predefined area is selected, detecting a second touch input on the touch-sensitive surface.
- the one or more programs further include instructions for, in response to detecting the second touch input on the touch-sensitive surface: visually distinguishing a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area; and displaying a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for displaying a chart on the display.
- the chart has a horizontal axis with a first horizontal scale with first horizontal scale markers.
- the chart has a vertical axis with a first vertical scale with first vertical scale markers.
- the chart includes a first set of data marks. Each respective data mark in the first set of data marks has a respective abscissa and a respective ordinate.
- the chart includes a line that connects adjacent data marks in the first set of data marks.
- the one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart and, while detecting the first touch input: expanding at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input; expanding at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input; adding a second set of data marks, distinct from the first set of data marks, on the line.
- Each respective data mark in the second set of data marks includes a respective abscissa and a respective ordinate.
- Each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark.
- the one or more programs further include instructions for, after adding the second set of data marks on the line: for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moving the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis; and animatedly adjusting the line so that the line connects the second set of data marks.
- an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors.
- the one or more programs include instructions for performing any of the methods described herein.
- some embodiments include a non-transitory computer readable storage medium, storing one or more programs for execution by one or more processors of an electronic device with a display and a touch-sensitive surface, the one or more programs including instructions for performing any of the methods described herein.
- some embodiments include a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising user interfaces displayed in accordance with any of the methods described herein.
- electronic devices with displays and touch-sensitive surfaces are provided with faster, more efficient methods and interfaces for data visualization, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices.
- Such methods and interfaces may complement or replace conventional methods for data visualization.
- FIG. 1 illustrates a portable multifunction device having a touch screen, in accordance with some embodiments.
- FIG. 2 illustrates a portable multifunction device having a touch-sensitive surface that is separate from the display, in accordance with some embodiments.
- FIG. 3A is a block diagram illustrating a portable multifunction device having a touch screen, in accordance with some embodiments.
- FIG. 3B is a block diagram illustrating a portable multifunction device having a touch-sensitive surface, in accordance with some embodiments.
- FIGS. 4A-4B illustrate user interfaces for initiating data visualization, in accordance with some embodiments.
- FIGS. 5A-5G illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIGS. 6A-6L illustrate user interfaces for changing chart categories, in accordance with some embodiments.
- FIGS. 7A-7D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIGS. 8A-8D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIGS. 9A-9B illustrate user interfaces for changing chart views, in accordance with some embodiments.
- FIGS. 10A-10B illustrate user interfaces for adjusting a chart view, in accordance with some embodiments.
- FIGS. 11A-11J illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 12A-12D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 13A-13D illustrate user interfaces for selecting chart areas, in accordance with some embodiments.
- FIGS. 14A-14D illustrate user interfaces for exporting data visualizations, in accordance with some embodiments.
- FIGS. 15A-15C illustrate user interfaces for adjusting a chart view, in accordance with some embodiments.
- FIGS. 16A-16D illustrate user interfaces for changing chart categories, in accordance with some embodiments.
- FIGS. 17A-17B illustrate user interfaces for selecting chart areas, in accordance with some embodiments.
- FIGS. 18A-18E illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 19A-19D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 19E-19L illustrate user interfaces for displaying information about a data mark, in accordance with some embodiments.
- FIGS. 20A-20D are flow diagrams illustrating a method of data visualization in accordance with some embodiments.
- FIGS. 21A-21F are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 22A-22B are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 23A-23B are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 24A-24E are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 25A-25D are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 26A-26F illustrate scrolling filters in accordance with some embodiments.
- the methods, devices, and GUIs described herein make manipulation of data sets and data visualizations more efficient and intuitive for a user.
- a number of different intuitive user interfaces for data visualizations are described below. For example, applying a filter to a data set can be accomplished by a simple touch input on a given portion of a displayed chart rather than via a nested menu system. Additionally, switching between chart categories can be accomplished by a simple touch input on a displayed chart label.
- FIGS. 20A-20D are flow diagrams illustrating a method of adjusting chart filters.
- FIGS. 5A-5G, 7A-7D, and 8A-8D illustrate user interfaces for adjusting chart filters.
- the user interfaces in FIGS. 5A-5G, 7A-7D, and 8A-8D are used to illustrate the processes in FIGS. 20A-20D .
- FIGS. 21A-21F are flow diagrams illustrating a method of changing chart categories.
- FIGS. 6A-6L illustrate user interfaces for changing chart categories. The user interfaces in FIGS. 6A-6L are used to illustrate the processes in FIGS. 21A-21F .
- FIGS. 22A-22B are flow diagrams illustrating a method of adjusting chart magnification.
- FIGS. 11A-11J and 12A-12D illustrate user interfaces for adjusting chart magnification.
- FIGS. 15A-15C illustrate user interfaces for adjusting chart views. The user interfaces in FIGS. 11A-11J, 12A-12D, and 15A-15C are used to illustrate the processes in FIGS. 22A-22B .
- FIGS. 23A-23B are flow diagrams illustrating a method of displaying information about a data mark.
- FIGS. 19E-19L illustrate user interfaces for displaying information about a data mark. The user interfaces in FIGS. 19E-19L are used to illustrate the processes in FIGS. 23A-23B .
- FIGS. 24A-24E are flow diagrams illustrating a method of chart selection.
- FIGS. 13A-13D illustrate user interfaces for selecting chart areas.
- FIGS. 14A-14D illustrate user interfaces for exporting data visualizations.
- FIGS. 18A-18E illustrate user interfaces for adjusting chart magnification.
- the user interfaces in FIGS. 13A-13D, 14A-14D, and 18A-18E are used to illustrate the processes in FIGS. 24A-24E.
- FIGS. 25A-25D are flow diagrams illustrating a method of updating chart views.
- FIGS. 19A-19D illustrate user interfaces for adjusting chart magnification.
- the user interfaces in FIGS. 19A-19D are used to illustrate the processes in FIGS. 25A-25D .
- the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions.
- Other portable electronic devices such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), are, optionally, used.
- the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).
- an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, a microphone, and/or a joystick.
- FIG. 1 illustrates portable multifunction device 100 having touch screen 102 , in accordance with some embodiments.
- device 100 is a mobile phone, a laptop computer, a personal digital assistant (PDA), or a tablet computer.
- Touch screen 102 is also sometimes called a touch-sensitive display and/or a touch-sensitive display system.
- Touch screen 102 optionally displays one or more graphics within a user interface (UI).
- a user is enabled to select one or more of the graphics by making a touch input (e.g., touch input 108 ) on the graphics.
- the touch input is a contact on the touch screen.
- the touch input is a gesture that includes a contact and movement of the contact on the touch screen.
- the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100 .
- a touch input on the graphics is optionally made with one or more fingers 110 (not drawn to scale in the figure) or one or more styluses 112 (not drawn to scale in the figure).
- selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some circumstances, inadvertent contact with a graphic does not select the graphic.
- a swipe gesture that sweeps over a visual mark optionally does not select the visual mark when the gesture corresponding to selection is a tap.
- Device 100 optionally also includes one or more physical buttons and/or other input/output devices, such as a microphone for verbal inputs.
- FIG. 2 illustrates multifunction device 200 in accordance with some embodiments.
- Device 200 need not be portable.
- device 200 is a laptop computer, a desktop computer, a tablet computer, or an educational device.
- Device 200 includes screen 202 and touch-sensitive surface 204 .
- Screen 202 optionally displays one or more graphics within a UI.
- a user is enabled to select one or more of the graphics by making a touch input (e.g., touch input 210 ) on touch-sensitive surface 204 such that a corresponding cursor (e.g., cursor 212 ) on screen 202 selects the one or more graphics.
- the particular user interface element is adjusted in accordance with the detected input.
- FIG. 3A is a block diagram illustrating portable multifunction device 100 , in accordance with some embodiments. It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components.
- the various components shown in FIG. 3A are implemented in hardware, software, firmware, or a combination of hardware, software, and/or firmware, including one or more signal processing and/or application specific integrated circuits.
- Device 100 includes one or more processing units (CPU's) 302 , input/output (I/O) subsystem 306 , memory 308 (which optionally includes one or more computer readable storage mediums), and network communications interface 310 . These components optionally communicate over one or more communication buses or signal lines 304 . Communication buses 304 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
- Memory 308 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 308 optionally includes one or more storage devices remotely located from processor(s) 302 . Memory 308 , or alternately the non-volatile memory device(s) within memory 308 , comprises a non-transitory computer readable storage medium.
- the software components stored in memory 308 include operating system 318 , communication module 320 , input/output (I/O) module 322 , and applications 328 .
- one or more of the various modules comprises a set of instructions in memory 308 .
- memory 308 stores one or more data sets in one or more database(s) 332 .
- Operating system 318 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware, software, and/or firmware components.
- Communication module 320 facilitates communication with other devices over one or more external ports and also includes various software components for handling data received from other devices.
- I/O module 322 includes touch input sub-module 324 and graphics sub-module 326 .
- Touch input sub-module 324 optionally detects touch inputs with touch screen 102 and other touch sensitive devices (e.g., a touchpad or physical click wheel).
- Touch input sub-module 324 includes various software components for performing various operations related to detection of a touch input, such as determining if contact has occurred (e.g., detecting a finger-down event), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact).
- Touch input sub-module 324 receives contact data from the touch-sensitive surface (e.g., touch screen 102 ). These operations are, optionally, applied to single touch inputs (e.g., one finger contacts) or to multiple simultaneous touch inputs (e.g., “multitouch”/multiple finger contacts). In some embodiments, touch input sub-module 324 detects contact on a touchpad.
- Touch input sub-module 324 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of a data mark). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.
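The contact-pattern matching described above can be sketched as a simple classifier over a sequence of touch events. The event tuple shape, the function name, and the movement threshold are assumptions for illustration; touch input sub-module 324's actual logic is not specified at this level:

```python
def classify_gesture(events, move_threshold=10):
    """Classify a touch-event sequence by its contact pattern: finger-down
    followed by finger-up at substantially the same position is a tap;
    finger-down, one or more finger-dragging events, then finger-up is a
    swipe. `events` is a list of (kind, x, y) tuples."""
    down = next(e for e in events if e[0] == "down")
    up = next(e for e in events if e[0] == "up")
    moved = abs(up[1] - down[1]) + abs(up[2] - down[2])
    has_drag = any(e[0] == "drag" for e in events)
    if has_drag and moved > move_threshold:
        return "swipe"
    return "tap"
```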
- Graphics sub-module 326 includes various known software components for rendering and displaying graphics on touch screen 102 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed.
- graphics includes any object that can be displayed to a user, including without limitation data visualizations, icons (such as user-interface objects including soft keys), text, digital images, animations and the like.
- graphics sub-module 326 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code.
- Graphics sub-module 326 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to the display or touch screen.
- Applications 328 optionally include data visualization module 330 for displaying graphical views of data and one or more other applications. Examples of other applications that are, optionally, stored in memory 308 include word processing applications, email applications, and presentation applications.
- data visualization module 330 includes executable instructions for displaying and manipulating various graphical views of data.
- modules and applications correspond to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein).
- memory 308 optionally stores a subset of the modules and data structures identified above.
- memory 308 optionally stores additional modules and data structures not described above.
- FIG. 3B is a block diagram illustrating multifunction device 200 , in accordance with some embodiments. It should be appreciated that device 200 is only one example of a multifunction device, and that device 200 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components.
- the various components shown in FIG. 3B are implemented in hardware, software, firmware, or a combination of hardware, software, and/or firmware, including one or more signal processing and/or application specific integrated circuits.
- Device 200 typically includes one or more processing units/cores (CPUs) 352 , one or more network or other communications interfaces 362 , memory 350 , I/O interface 356 , and one or more communication buses 354 for interconnecting these components.
- Communication buses 354 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
- I/O interface 356 comprises screen 202 (also sometimes called a display), touch-sensitive surface 204 , and one or more sensor(s) 360 (e.g., optical, acceleration, proximity, and/or touch-sensitive sensors).
- I/O interface 356 optionally includes a keyboard and/or mouse (or other pointing device) 358 .
- I/O interface 356 couples input/output peripherals on device 200 , such as screen 202 , touch-sensitive surface 204 , other input devices 358 , and one or more sensor(s) 360 , to CPU(s) 352 and/or memory 350 .
- Screen 202 provides an output interface between the device and a user.
- Screen 202 displays visual output to the user.
- the visual output optionally includes graphics, text, icons, data marks, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user-interface objects.
- Screen 202 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments.
- device 200 includes touch-sensitive surface 204 (e.g., a touchpad) for detecting touch inputs.
- Touch-sensitive surface 204 accepts input from the user via touch inputs (e.g., touch input 210 in FIG. 2 ).
- Touch-sensitive surface 204 (along with any associated modules and/or sets of instructions in memory 350 ) detects touch inputs and converts the detected inputs into interaction with user-interface objects (e.g., one or more icons, data marks, or images) that are displayed on screen 202 .
- a point of contact between touch-sensitive surface 204 and the user corresponds to a finger of the user.
- Memory 350 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 350 optionally includes one or more storage devices remotely located from CPU(s) 352 .
- the software components stored in memory 350 include operating system 364 , communication module 366 , input/output (I/O) module 368 , and applications 374 .
- one or more of the various modules comprises a set of instructions in memory 350 .
- memory 350 stores one or more data sets in one or more database(s) 378 .
- I/O module 368 includes touch input sub-module 370 and graphics sub-module 372 .
- applications 374 include data visualization module 376 .
- memory 350 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 308 of portable multifunction device 100 ( FIG. 3A ), or a subset thereof. Furthermore, memory 350 optionally stores additional programs, modules, and data structures not present in memory 308 of portable multifunction device 100 . For example, memory 350 of device 200 optionally stores drawing, presentation, and word processing applications, while memory 308 of portable multifunction device 100 ( FIG. 3A ) optionally does not store these modules.
- Device 200 also includes a power system for powering the various components.
- the power system optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management, and distribution of power in portable devices.
- Each of the above identified elements in FIG. 3B is, optionally, stored in one or more of the previously mentioned memory devices.
- Each of the above identified modules corresponds to a set of instructions for performing a function described above.
- memory 350 optionally stores a subset of the modules and data structures identified above.
- memory 350 optionally stores additional modules and data structures not described above.
- the inputs are detected on a touch-sensitive surface on a device that is distinct from a display on the device.
- one or more of the finger inputs are replaced with input from another input device (e.g., a mouse based input or stylus input).
- a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact).
- a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact).
- when multiple user inputs are detected simultaneously, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
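The mouse-for-gesture substitutions described above can be sketched as normalizing both input devices into one pointer-event stream, so the same gesture recognizer handles either. This is a minimal illustrative sketch, not the patent's implementation; the `PointerEvent` type and phase names are assumptions.

```python
# Sketch: map mouse events onto the same begin/move/end phases used for
# touch contacts, so a mouse drag is indistinguishable from a swipe.
from dataclasses import dataclass


@dataclass
class PointerEvent:
    phase: str  # "begin", "move", or "end"
    x: float
    y: float


def mouse_to_pointer(kind: str, x: float, y: float) -> PointerEvent:
    """Translate a mouse event ("down"/"drag"/"up") into a pointer phase."""
    phase = {"down": "begin", "drag": "move", "up": "end"}[kind]
    return PointerEvent(phase, x, y)


# A leftward swipe (a contact moving left) and a mouse click-drag-release
# along the same path yield identical event sequences for the recognizer.
swipe = [PointerEvent("begin", 200, 100),
         PointerEvent("move", 120, 100),
         PointerEvent("end", 40, 100)]
drag = [mouse_to_pointer(k, x, y) for k, x, y in
        [("down", 200, 100), ("drag", 120, 100), ("up", 40, 100)]]
assert swipe == drag
```

Under this normalization, the swipe-to-filter and tap-to-select behaviors described in later figures need no device-specific branches.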
- FIGS. 4A-4B illustrate user interfaces for initiating data visualization, in accordance with some embodiments.
- FIG. 4A shows UI 402 including an email application.
- the email application contains an email including attached file 402 .
- FIG. 4A also shows contact 410 over the icon corresponding to file 402 .
- FIG. 4B shows UI 450 including a data visualization application.
- the data visualization application includes a graphical view of data from file 402 .
- the graphical view includes chart 404 (e.g., a bar chart) with chart label 406 , a plurality of categories, and a plurality of category labels 408 .
- file 402 has a file type associated with the data visualization application and, in response to detecting contact 410 , the data visualization application is initialized and the data from file 402 is displayed in a graphical view.
- FIGS. 5A-5G illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIG. 5A shows UI 450 including category 502 - 1 and category label 408 - 1 .
- FIG. 5A also shows contact 510 detected at position 510 - a corresponding to the visual mark (e.g., a bar corresponding to category 502 - 1 ) for category 502 - 1 .
- FIG. 5B shows UI 520 including contact 510 detected at position 510 - b and the visual mark for category 502 - 1 moving in concert with movement of contact 510 via an animated transition.
- FIG. 5C shows UI 522 including contact 510 detected at position 510 - c and the visual mark for category 502 - 1 continuing to move in concert with movement of contact 510 via an animated transition.
- FIG. 5C also shows indicium 504 indicating that category 502 - 1 (Catering) is being filtered out of the data as a result of the current action (e.g., the movement of contact 510 ).
- FIG. 5D shows UI 524 including indicium 504 , contact 510 detected at position 510 - d , and the visual mark for category 502 - 1 continuing to move in concert with movement of contact 510 via an animated transition.
- FIG. 5E shows UI 526 including indicium 504 and the removal of the visual mark for category 502 - 1 from the chart.
- FIG. 5F shows UI 528 including indicium 504 , contact 510 detected at position 510 - e , and the visual mark for category 502 - 1 continuing to move in concert with movement of contact 510 via an animated transition.
- FIG. 5G shows UI 530 including indicium 506 , contact 510 detected at position 510 - f , and the visual mark for category 502 - 1 continuing to move in concert with movement of contact 510 via an animated transition.
- the first category and the first visual mark are removed from the chart via an animated transition in response to contact 510 moving to a pre-defined location.
- the first category and the first visual mark are added back to the chart via an animated transition in response to contact 510 moving away from the pre-defined location.
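The drag-to-a-predefined-location behavior described above amounts to a hit test when the contact lifts: the category is filtered out only if the mark was dropped in the predefined region, otherwise it snaps back. A hedged sketch follows; the left-edge region, the 50-pixel threshold, and all names are assumptions for illustration.

```python
# Sketch: filter a category out of the chart when its visual mark is
# dragged to a predefined location (here, assumed to be the left edge).
FILTER_EDGE_X = 50  # hypothetical predefined region: within 50 px of left edge


def end_drag(categories: list[str], category: str, final_x: float) -> list[str]:
    """Return the categories still displayed after the drag ends."""
    if final_x <= FILTER_EDGE_X and category in categories:
        # Contact lifted inside the predefined location: filter it out.
        return [c for c in categories if c != category]
    # Otherwise the mark animates back and the category is retained.
    return categories


cats = ["Catering", "Beer", "Wine"]
assert end_drag(cats, "Catering", 30) == ["Beer", "Wine"]   # dropped at edge
assert end_drag(cats, "Catering", 300) == cats              # released elsewhere
```

This two-way check mirrors FIGS. 5A-5G, where moving toward and away from the predefined location removes and restores the mark respectively.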
- FIGS. 6A-6L illustrate user interfaces for changing chart categories, in accordance with some embodiments.
- FIG. 6A shows UI 601 including a chart with chart label 602 - 1 (Menu Item) and categories 502 (including categories 502 - 1 through 502 - 13 ) with category labels 408 .
- FIG. 6A also shows contact 610 detected at position 610 - a corresponding to chart label 602 - 1 .
- FIGS. 6B and 6C show contact 610 moving to positions 610 - b and 610 - c respectively and the first chart with chart label 602 - 1 (Menu Item) being replaced by a second chart with chart label 602 - 2 (Menu Group) via an animated transition.
- FIGS. 6B and 6C also show chart categories 502 being replaced by categories 604 via an animated transition, and category labels 408 being replaced by category labels 606 via an animated transition.
- FIG. 6D shows UI 607 including the second chart with chart label 602 - 2 (Menu Group) and categories 604 with category labels 606 .
- FIG. 6D also shows contact 620 detected at position 620 - a corresponding to chart label 602 - 2 .
- FIGS. 6E and 6F show contact 620 moving to positions 620 - b and 620 - c respectively and the second chart with chart label 602 - 2 (Menu Group) being replaced by a third chart with chart label 602 - 3 (Day) via an animated transition.
- FIGS. 6E and 6F also show chart categories 604 being replaced by categories 612 via an animated transition, and category labels 606 being replaced by category labels 614 via an animated transition.
- FIG. 6G shows UI 613 including the third chart with chart label 602 - 3 (Day) and categories 612 with category labels 614 .
- FIG. 6G also shows contact 630 detected at a position corresponding to chart label 602 - 3 and selection menu 616 displayed. In some embodiments, contact 630 is detected and identified as a tap input and selection menu 616 is displayed in response.
- FIG. 6H shows UI 615 with selection menu 616 including selection categories 618 .
- FIG. 6H also shows contact 640 detected at a position corresponding to selection category 618 - 2 .
- FIG. 6I shows UI 617 including a fourth chart with chart label 602 - 4 (Hour) and categories 622 with category labels 624 . In some embodiments, the chart shown in FIG. 6I replaces the chart shown in FIG. 6H in response to the detection of contact 640 at a position corresponding to selection category 618 - 2 .
- FIG. 6J shows UI 619 including the fourth chart with chart label 602 - 4 (Hour) and categories 622 with category labels 624 .
- FIG. 6J also shows contact 650 detected at position 650 - a corresponding to chart label 602 - 4 .
- FIGS. 6K and 6L show contact 650 moving to positions 650 - b and 650 - c respectively and the fourth chart with chart label 602 - 4 (Hour) being replaced by the first chart with chart label 602 - 1 (Menu Item) via an animated transition.
- FIGS. 6K and 6L also show chart categories 622 being replaced by categories 502 via an animated transition, and category labels 624 being replaced by category labels 408 via an animated transition.
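The chart-label swipes in FIGS. 6A-6L cycle through category fields and re-aggregate the same records under each one. A minimal sketch of that idea, assuming a fixed field cycle, a "Sales" measure, and sum aggregation (none of which are specified by the source):

```python
# Sketch: swiping the chart label advances to the next category field and
# re-aggregates the underlying records by that field.
from collections import defaultdict

FIELDS = ["Menu Item", "Menu Group", "Day", "Hour"]  # cycle order assumed


def next_field(current: str, direction: int = 1) -> str:
    """Return the field selected after a label swipe (wraps around)."""
    return FIELDS[(FIELDS.index(current) + direction) % len(FIELDS)]


def aggregate(records: list[dict], field: str) -> dict:
    """Sum the measure for each value of the chosen category field."""
    totals: dict = defaultdict(float)
    for r in records:
        totals[r[field]] += r["Sales"]
    return dict(totals)


records = [
    {"Menu Item": "Latte", "Menu Group": "Drinks", "Sales": 4.0},
    {"Menu Item": "Bagel", "Menu Group": "Food", "Sales": 3.0},
    {"Menu Item": "Mocha", "Menu Group": "Drinks", "Sales": 5.0},
]
assert next_field("Menu Item") == "Menu Group"
assert aggregate(records, "Menu Group") == {"Drinks": 9.0, "Food": 3.0}
```

The wrap-around in `next_field` matches FIGS. 6K-6L, where swiping past the last chart (Hour) returns to the first (Menu Item).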
- FIGS. 7A-7D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIG. 7A shows UI 701 including a chart with categories 612 (including category 612 - 4 ) and corresponding category labels 614 .
- FIG. 7A also shows indicium 504 indicating that data corresponding to category 502 - 1 has been filtered out of the chart.
- FIG. 7A further shows contact 710 detected at position 710 - a corresponding to indicium 504 .
- FIGS. 7B and 7C show contact 710 moving to positions 710 - b and 710 - c respectively and the removal of indicium 504 along with the chart updating to reflect inclusion of data that corresponds to category 502 - 1 .
- FIGS. 7B and 7C also show categories 612 reordered to reflect inclusion of the data corresponding to category 502 - 1 .
- FIG. 7D shows UI 707 including indicium 504 , contact 720 detected at a position corresponding to indicium 504 , and categories 612 .
- the chart is updated to reflect inclusion of data that corresponds to category 502 - 1 in response to contact 710 moving from a pre-defined location or area on the UI.
- the chart is updated to reflect exclusion of data that corresponds to category 502 - 1 in response to contact 720 .
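The include/exclude toggling in FIGS. 7A-7D can be sketched as a filter set: dragging the indicium away from the predefined area re-includes the data, while a contact on the indicium excludes it again. The class and method names below are illustrative assumptions, not the patent's API.

```python
# Sketch: a filter-state object driven by the indicium interactions
# described for FIGS. 7A-7D.
class ChartFilter:
    def __init__(self):
        self.excluded: set[str] = set()

    def drag_indicium_away(self, category: str):
        """Moving the indicium from its predefined area re-includes the data."""
        self.excluded.discard(category)

    def tap_indicium(self, category: str):
        """A contact on the indicium excludes the category again."""
        self.excluded.add(category)

    def visible(self, categories: list[str]) -> list[str]:
        """Categories the chart displays (and reorders) after filtering."""
        return [c for c in categories if c not in self.excluded]


f = ChartFilter()
f.tap_indicium("Catering")
assert f.visible(["Catering", "Beer"]) == ["Beer"]
f.drag_indicium_away("Catering")
assert f.visible(["Catering", "Beer"]) == ["Catering", "Beer"]
```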
- FIGS. 8A-8D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIG. 8A shows UI 801 including category 502 - 2 and category label 408 - 2 .
- FIG. 8A also shows contact 810 detected at position 810 - a corresponding to the visual mark for category 502 - 2 (e.g., a bar corresponding to category 502 - 2 ).
- FIGS. 8B and 8C show contact 810 moving to positions 810 - b and 810 - c respectively and the visual mark for category 502 - 2 moving in concert with movement of contact 810 via an animated transition.
- FIGS. 8B and 8C also show indicium 802 indicating that only the data corresponding to category 502 - 2 is being included as a result of the current action (e.g., the movement of contact 810 ).
- FIG. 8D shows UI 805 including indicium 802 and the removal of the visual marks for all categories except for category 502 - 2 .
- FIGS. 9A-9B illustrate user interfaces for changing chart views, in accordance with some embodiments.
- FIG. 9A shows UI 901 including indicium 802 and a bar chart with category 502 - 2 .
- FIG. 9A also shows contact 910 detected at a position on UI 901 that corresponds to a line chart graphical view.
- FIG. 9B shows UI 903 including indicium 802 and a line chart.
- the bar chart shown in FIG. 9A is replaced by the line chart shown in FIG. 9B in response to detection of contact 910 at a position on UI 901 that corresponds to a line chart graphical view.
- FIGS. 10A-10B illustrate user interfaces for adjusting a chart view, in accordance with some embodiments.
- FIG. 10A shows UI 1001 including a chart.
- FIG. 10A also shows contact 1010 detected at position 1010 - a on UI 1001 .
- FIG. 10B shows contact 1010 at position 1010 - b and movement of the chart in concert with movement of contact 1010 .
- FIGS. 11A-11J illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIG. 11A shows UI 1101 including a chart at a first magnification (e.g., a first zoom level).
- FIG. 11A also shows contacts 1110 and 1120 detected at positions 1110 - a and 1120 - a respectively.
- FIG. 11B shows contacts 1110 and 1120 detected at positions 1110 - b and 1120 - b respectively and shows UI 1103 including the chart at a second magnification (e.g., zoomed in from the first zoom level).
- the relative positions of contacts 1110 and 1120 in FIG. 11B are further apart than the positions of contacts 1110 and 1120 in FIG. 11A and represent a de-pinch gesture on the touch screen.
- the second magnification of the chart shown in FIG. 11B includes the same vertical scale as the first magnification of the chart shown in FIG. 11A .
- FIGS. 11C and 11D show an animated transition of the chart to a third magnification.
- the animated transition shown in FIGS. 11C and 11D includes an increase in the vertical scale of the chart.
- the animated transition shown in FIGS. 11C and 11D is in response to ceasing to detect contacts 1110 and 1120 (e.g., detecting lift off of the contacts).
- FIG. 11E shows UI 1109 including the chart at a fourth magnification.
- FIG. 11E also shows contacts 1130 and 1140 detected at positions 1130 - a and 1140 - a respectively.
- FIG. 11F shows contacts 1130 and 1140 detected at positions 1130 - b and 1140 - b respectively and shows UI 1111 including the chart at a fifth magnification (e.g., zoomed in from the fourth magnification).
- FIG. 11G shows UI 1113 including the chart at a sixth magnification.
- FIG. 11G also shows contacts 1150 and 1160 detected at positions 1150 - a and 1160 - a respectively.
- FIG. 11H shows contacts 1150 and 1160 detected at positions 1150 - b and 1160 - b respectively and shows UI 1115 including the chart at a seventh magnification (e.g., zoomed in from the sixth magnification).
- FIG. 11I shows UI 1117 including the chart at an eighth magnification.
- FIG. 11I also shows contacts 1170 and 1180 detected at positions 1170 - a and 1180 - a respectively.
- FIG. 11J shows contacts 1170 and 1180 detected at positions 1170 - b and 1180 - b respectively and shows UI 1119 including the chart at a ninth magnification (e.g., zoomed in from the eighth magnification).
- FIGS. 12A-12D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIG. 12A shows UI 1201 including a chart at an initial magnification.
- FIG. 12A also shows contacts 1210 and 1220 detected at positions 1210 - a and 1220 - a respectively.
- FIG. 12B shows contacts 1210 and 1220 detected at positions 1210 - b and 1220 - b respectively and shows UI 1203 including the chart at a second magnification (e.g., zoomed in from the initial magnification).
- the relative positions of contacts 1210 and 1220 in FIG. 12B are further apart than the positions of contacts 1210 and 1220 in FIG. 12A and represent a de-pinch gesture on the touch screen.
- FIG. 12C shows contacts 1210 and 1220 detected at positions 1210 - c and 1220 - c respectively and shows UI 1205 including the chart at a third magnification (e.g., zoomed out from the second magnification).
- the relative positions of contacts 1210 and 1220 in FIG. 12C are closer together than the positions of contacts 1210 and 1220 in FIG. 12B and represent a pinch gesture on the touch screen.
- FIG. 12D shows contacts 1210 and 1220 detected at positions 1210 - d and 1220 - d respectively and shows UI 1207 including the chart at a fourth magnification (e.g., zoomed out from the third magnification).
- the relative positions of contacts 1210 and 1220 in FIG. 12D are closer together than the positions of contacts 1210 and 1220 in FIG. 12A and represent a pinch gesture on the touch screen.
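The pinch and de-pinch behaviors in FIGS. 11A-12D can be sketched as scaling the chart's magnification by the ratio of the current distance between the two contacts to their initial distance. This is a common mapping assumed for illustration; the source does not specify the exact formula.

```python
# Sketch: derive the new zoom level from the two contacts of a pinch or
# de-pinch gesture. Contacts further apart than at gesture start -> zoom in;
# closer together -> zoom out.
import math


def distance(p1: tuple, p2: tuple) -> float:
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])


def updated_zoom(initial_zoom: float, start: tuple, end: tuple) -> float:
    """start/end are ((x1, y1), (x2, y2)) contact-position pairs."""
    scale = distance(*end) / distance(*start)
    return initial_zoom * scale


# De-pinch: contacts move apart, magnification increases.
assert updated_zoom(1.0, ((100, 0), (200, 0)), ((50, 0), (250, 0))) == 2.0
# Pinch: contacts move together, magnification decreases.
assert updated_zoom(2.0, ((50, 0), (250, 0)), ((100, 0), (200, 0))) == 1.0
```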
- FIGS. 13A-13D illustrate user interfaces for selecting chart areas, in accordance with some embodiments.
- FIG. 13A shows UI 1301 including a chart with selected portion 1302 and information regarding selected portion 1302 .
- FIG. 13A shows information regarding the number of records in selected portion 1302 .
- FIG. 13A also shows contact 1310 detected at position 1310 - a corresponding to selected portion 1302 .
- FIG. 13B shows UI 1303 and contact 1310 at position 1310 - b and the chart with selected portion 1304 corresponding to the movement of contact 1310 .
- FIG. 13B also shows the chart including information regarding selected portion 1304 (e.g., information showing a difference between selected portion 1302 and selected portion 1304 ).
- FIG. 13C shows UI 1305 and contact 1310 at position 1310 - c and the chart with selected portion 1306 corresponding to the continued movement of contact 1310 .
- FIG. 13C also shows the chart including information regarding selected portion 1306 .
- FIG. 13D shows UI 1307 and contact 1310 at position 1310 - d and the chart with selected portion 1308 corresponding to the continued movement of contact 1310 .
- FIG. 13D also shows the chart including information regarding selected portion 1308 .
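The selection information in FIGS. 13A-13D (a record count, plus the difference between the previous and current selections) reduces to counting records inside the selected range as the contact moves. A minimal sketch, with the `"x"` field name and range representation assumed for illustration:

```python
# Sketch: recompute selection info as a contact drags the selected portion
# of the chart, reporting the count and the change from the prior selection.
def records_in_range(records: list[dict], lo: float, hi: float) -> int:
    return sum(1 for r in records if lo <= r["x"] <= hi)


def selection_info(records: list[dict], old: tuple, new: tuple) -> dict:
    old_n = records_in_range(records, *old)
    new_n = records_in_range(records, *new)
    return {"count": new_n, "delta": new_n - old_n}


data = [{"x": v} for v in (1, 2, 3, 5, 8, 13)]
info = selection_info(data, old=(0, 3), new=(0, 8))
assert info == {"count": 5, "delta": 2}  # selection grew by two records
```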
- FIGS. 14A-14D illustrate user interfaces for exporting data visualizations, in accordance with some embodiments.
- FIG. 14A shows UI 1401 including a chart with selected portion 1308 .
- FIG. 14B shows UI 1403 including the chart with selected portion 1308 and selection menu 1402 .
- FIG. 14B also shows contact 1410 detected at a position corresponding to an icon for selection menu 1402 .
- selection menu 1402 is shown in response to contact 1410 being detected over the icon for selection menu 1402 .
- FIG. 14C shows UI 1405 including the chart with selected portion 1308 and selection menu 1402 .
- FIG. 14C also shows contact 1420 detected at a position corresponding to a menu option (Email Image) in selection menu 1402 .
- FIG. 14D shows UI 1407 with an email that includes information from the chart.
- UI 1407 in FIG. 14D is shown in response to detecting contact 1420 at a position corresponding to the Email Image menu option in selection menu 1402 .
- FIGS. 15A-15C illustrate user interfaces for adjusting a chart view, in accordance with some embodiments.
- FIG. 15A shows UI 1501 including a chart.
- FIG. 15A also shows contact 1510 detected at position 1510 - a on UI 1501 .
- FIG. 15B shows UI 1503 and contact 1510 at position 1510 - b .
- FIG. 15B also shows movement of the chart in concert with movement of contact 1510 .
- FIG. 15B shows both contact 1510 and the chart moving to the right from their respective positions in FIG. 15A .
- FIG. 15C shows UI 1505 and contact 1510 at position 1510 - c .
- FIG. 15C also shows movement of the chart in concert with movement of contact 1510 .
- FIG. 15C shows both contact 1510 and the chart moving to the left from their respective positions in FIG. 15B .
- FIGS. 16A-16D illustrate user interfaces for changing chart categories, in accordance with some embodiments.
- FIG. 16A shows UI 1601 including a chart with chart label 1602 - 1 (Average).
- FIG. 16A also shows contact 1610 detected at a position corresponding to chart label 1602 - 1 .
- FIG. 16B shows UI 1603 including a chart with chart label 1602 - 2 (Percentile Bands).
- the chart shown in FIG. 16B replaces the chart shown in FIG. 16A in response to the detection of contact 1610 at a position on the chart label.
- FIG. 16C shows UI 1605 including a chart with chart label 1602 - 2 (Percentile Bands).
- FIG. 16C also shows contact 1620 detected at a position corresponding to chart label 1602 - 2 .
- FIG. 16D shows UI 1607 including a chart with chart label 1602 - 3 (Summary).
- the chart shown in FIG. 16D replaces the chart shown in FIG. 16C in response to the detection of contact 1620 at a position on the chart label.
- FIGS. 17A-17B illustrate user interfaces for selecting chart areas, in accordance with some embodiments.
- FIG. 17A shows UI 1701 including a chart.
- FIG. 17A also shows contact 1710 detected at a position corresponding to a portion of the chart.
- FIG. 17B shows UI 1703 including a chart with selected portion 1702 and information regarding selected portion 1702 .
- FIG. 17B shows information regarding the number of records in selected portion 1702 .
- FIG. 17B also shows contact 1720 detected at a position corresponding to selected portion 1702 .
- selected portion 1702 is selected in response to detecting contact 1720 .
- contact 1710 detected in FIG. 17A represents a first type of touch input (e.g., a swipe gesture) and contact 1720 detected in FIG. 17B represents a second type of touch input (e.g., a tap gesture).
- FIGS. 18A-18E illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIG. 18A shows UI 1801 including a chart with selected portion 1802 at an initial magnification.
- FIG. 18A also shows contacts 1810 and 1820 detected at positions 1810 - a and 1820 - a respectively.
- FIG. 18B shows contacts 1810 and 1820 detected at positions 1810 - b and 1820 - b respectively and shows UI 1803 including the chart at a second magnification (e.g., zoomed in from the initial magnification).
- the relative positions of contacts 1810 and 1820 in FIG. 18B are further apart than the positions of contacts 1810 and 1820 in FIG. 18A and represent a de-pinch gesture on the touch screen.
- FIG. 18C shows contacts 1810 and 1820 detected at positions 1810 - c and 1820 - c respectively and shows UI 1805 including the chart at a third magnification (e.g., zoomed in from the second magnification).
- FIG. 18D shows UI 1807 including the chart at a fourth magnification.
- FIG. 18D also shows contacts 1830 and 1840 detected at positions 1830 - a and 1840 - a respectively.
- FIG. 18E shows contacts 1830 and 1840 detected at positions 1830 - b and 1840 - b respectively and shows UI 1809 including the chart at a fifth magnification (e.g., zoomed in from the fourth magnification).
- FIGS. 19A-19D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIG. 19A shows UI 1901 including a chart at an initial magnification.
- FIG. 19A also shows the chart including data marks 1902 (e.g., data marks 1902 - 1 through 1902 - 5 ).
- FIG. 19B shows UI 1903 including the chart at a second magnification (e.g., zoomed in from the initial magnification).
- FIG. 19B also shows the chart including data marks 1902 (e.g., a subset of data marks 1902 shown in FIG. 19A ) and data marks 1904 .
- FIG. 19C shows UI 1905 including the chart at a third magnification (e.g., zoomed in from the second magnification).
- FIG. 19C also shows the chart including data marks 1902 and data marks 1904 .
- data marks 1904 are initially placed on the line connecting data marks 1902 as shown in FIG. 19B and are animatedly moved (e.g., using continuous motion rather than a jump) to their respective ordinates as shown in FIG. 19C .
- FIG. 19D shows UI 1907 including the chart with data marks 1902 and data marks 1904 at a fourth magnification (e.g., zoomed in from the third magnification).
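The transition described for FIGS. 19B-19C, where newly revealed data marks first appear on the line connecting existing marks and then glide continuously to their true ordinates, is a linear interpolation over animation progress. A sketch, with the midpoint starting position and the `t` parameterization assumed for illustration:

```python
# Sketch: animate a newly revealed data mark from its on-line position to
# its true ordinate using continuous motion rather than a jump.
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation between a and b for progress t in [0, 1]."""
    return a + (b - a) * t


def animate_mark(y_on_line: float, y_true: float, t: float) -> float:
    """Position of a newly revealed data mark at animation progress t."""
    return lerp(y_on_line, y_true, t)


# A mark revealed between neighbors at y=10 and y=20 starts at y=15 on the
# connecting line and glides to its true ordinate y=22.
start = (10 + 20) / 2
assert animate_mark(start, 22.0, 0.0) == 15.0   # on the line (as in FIG. 19B)
assert animate_mark(start, 22.0, 1.0) == 22.0   # true ordinate (as in FIG. 19C)
assert animate_mark(start, 22.0, 0.5) == 18.5   # continuous, not a jump
```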
- FIGS. 19E-19L illustrate user interfaces for displaying information about a data mark, in accordance with some embodiments.
- FIG. 19E shows UI 1909 including a chart at an initial magnification.
- FIG. 19E also shows the chart including data marks 1908 (e.g., including data marks 1908 - 1 and 1908 - 2 ).
- FIG. 19E also shows contacts 1930 and 1940 detected at positions 1930 - a and 1940 - a respectively.
- FIGS. 19F-19I show an animated transition from data mark 1908 - 1 to record 1914 - 1 in concert with movement of contacts 1930 and 1940 (e.g., the movement of contacts 1930 and 1940 represent a de-pinch gesture).
- FIG. 19F shows UI 1911 including contacts 1930 and 1940 detected at positions 1930 - b and 1940 - b respectively and an animated transition from data mark 1908 - 1 to record 1914 - 1 in concert with movement of contacts 1930 and 1940 .
- FIG. 19G shows UI 1913 including contacts 1930 and 1940 detected at positions 1930 - c and 1940 - c respectively and a continued animated transition (e.g., using continuous movement) from data mark 1908 - 1 to record 1914 - 1 in concert with movement of contacts 1930 and 1940 .
- FIG. 19H shows UI 1915 including contacts 1930 and 1940 detected at positions 1930 - d and 1940 - d respectively and a continued animated transition from data mark 1908 - 1 to record 1914 - 1 in concert with movement of contacts 1930 and 1940 .
- FIG. 19I shows UI 1917 including record 1914 - 1 .
- UI 1917 shown in FIG. 19I is displayed in response to ceasing to detect contacts 1930 and 1940 after completion of a de-pinch gesture (e.g., detecting lift off of the contacts).
- FIG. 19J shows UI 1919 including a chart at an initial magnification.
- FIG. 19J also shows the chart including data marks 1908 (e.g., including data marks 1908 - 1 and 1908 - 2 ).
- FIGS. 19K and 19L show an animated transition from data mark 1908 - 2 to record 1914 - 2 .
- the animated transition from data mark 1908 - 2 to record 1914 - 2 is in concert with a touch input (e.g., a de-pinch gesture).
- FIGS. 20A-20D are flow diagrams illustrating method 2000 of data visualization, in accordance with some embodiments.
- Method 2000 is performed at an electronic device (e.g., portable multifunction device 100 , FIG. 1 , or device 200 , FIG. 2 ) with a display and a touch-sensitive surface.
- the display is a touch screen display and the touch-sensitive surface is on the display.
- the display is separate from the touch-sensitive surface.
- method 2000 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200 , as shown in FIGS. 3A-3B .
- Some operations in method 2000 are, optionally, combined and/or the order of some operations is, optionally, changed.
- method 2000 provides an intuitive way to change filtering. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen.
- the method reduces the cognitive burden on the user when applying and/or removing filters, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust filters faster and more efficiently conserves power and increases the time between battery charges.
- the device displays ( 2002 ) a first chart on the display.
- FIG. 5A shows UI 450 including a bar chart.
- the first chart concurrently displays ( 2004 ) a first set of categories.
- the bar chart in FIG. 5A includes categories 502 .
- Each respective category in the first set of categories has ( 2006 ) a corresponding visual mark (e.g., a picture, drawing, or other graphic) displayed in the first chart.
- For example, a respective category in a bar chart has a corresponding bar that represents a value for that category, a respective category in a pie chart has a corresponding slice that represents a value for that category, and so on.
- the bar chart in FIG. 5A includes categories 502 and a bar (e.g., a visual mark) corresponding to each category.
- The device detects (2008) a first touch input (e.g., a swipe gesture or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart.
- FIGS. 5A-5D show contact 510 detected at positions 510 - a , 510 - b , 510 - c , and 510 - d respectively.
- The first touch input is (2010) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture).
- The movement of contact 510 shown in FIGS. 5A-5D represents a swipe gesture toward the left side of the screen.
- In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device removes (2014) the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition.
- FIGS. 5A-5E show an animated transition where the device removes category 502-1 and the visual mark corresponding to category 502-1 in concert with movement of contact 510.
- In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device updates (2016) display of the first chart, for example by repositioning the remaining categories in the first set and their corresponding visual marks (e.g., graphics) in the first chart. Thus, data that corresponds to the first category is filtered out of the first chart. This process may be repeated to remove additional categories in the first set of categories from the first chart.
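The remove-by-swipe filtering described above can be sketched in Python. This is a hypothetical model, not the patent's implementation: the function name `remove_category` and the dict-based chart representation are illustrative assumptions.

```python
# Hypothetical model of the swipe-to-remove filter: a chart is an ordered
# mapping from category name to the value of its visual mark (e.g., a bar).

def remove_category(chart: dict, category: str) -> dict:
    """Filter one category out of the chart; the remaining categories keep
    their order, modeling the repositioning of the remaining visual marks."""
    if category not in chart:
        raise KeyError(f"no such category: {category}")
    return {name: value for name, value in chart.items() if name != category}

chart = {"East": 400, "West": 300, "North": 200, "South": 100}
chart = remove_category(chart, "East")   # leftward swipe on the "East" bar
chart = remove_category(chart, "North")  # the process can be repeated
```

Because Python dicts preserve insertion order, the surviving categories stay in their original display order after each removal.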
- In some embodiments, the contact is a stylus contact.
- FIGS. 5A-5E show the chart being updated in response to detecting contact 510 and the updating including repositioning the remaining categories.
- In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device ceases (2018) to display the first visual mark.
- In some embodiments, the first visual mark remains displayed while the finger contact in the first touch input remains in continuous contact with the touch-sensitive surface, and the first visual mark ceases to be displayed (e.g., fades out) in response to detecting lift off of the finger contact in the first touch input from the touch-sensitive surface.
- FIGS. 5A-5D show an animated transition where the visual mark corresponding to category 502-1 fades out and moves in concert with movement of contact 510, and FIG. 5E shows the first visual mark ceasing to be displayed.
- While displaying (2020) the first chart on the display, the device detects a fourth touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of a second visual mark for a second category in the first chart.
- FIG. 8A shows the chart as in FIG. 5A including categories 502 .
- FIG. 8A also shows the device detecting contact 810 at position 810 - a corresponding to the visual mark for category 502 - 2 .
- In response (2022) to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, the device: maintains (2024) display of the second category and the second visual mark in the first chart; removes display of all categories, other than the second category, in the first set of categories; and removes display of all visual marks, other than the second visual mark, that correspond to categories in the first set of categories.
- Thus, the device responds differently to different finger gestures made on the touch-sensitive surface at a location that corresponds to a respective graphic for a respective category in the chart.
- FIGS. 8B-8D show movement of contact 810 and an animated transition where the device maintains the visual mark for category 502-2 and removes all other categories 502 and the visual marks for all other categories 502.
- In response (2022) to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, the device displays (2026) an indicium that only the second category in the first set of categories remains displayed. For example, FIG. 8D shows indicium 802 indicating that only category 502-2 is displayed.
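The "keep only" response can be modeled the same way as the removal filter. This is a hypothetical sketch: the `keep_only` name and the indicium text are placeholders, not taken from the patent.

```python
# Hypothetical model of the "keep only" gesture: swiping a visual mark in
# the opposite direction maintains that category and removes all others.

def keep_only(chart: dict, category: str):
    """Return the chart reduced to one category, plus an indicium string
    (the indicium wording here is an assumed placeholder)."""
    filtered = {name: value for name, value in chart.items() if name == category}
    return filtered, f"Keep only: {category}"

chart = {"East": 400, "West": 300, "North": 200, "South": 100}
chart, indicium = keep_only(chart, "West")   # rightward swipe on "West"
```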
- The first touch input is (2028) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture), and the fourth touch input is a drag gesture or a swipe gesture that moves in a second predefined direction on the touch-sensitive surface that is distinct from the first predefined direction (e.g., a rightward drag gesture).
- In some embodiments, the second predefined direction is opposite the first predefined direction.
- In some embodiments, the second predefined direction is perpendicular to the first predefined direction.
- The movement of contact 510 shown in FIGS. 5A-5D represents a swipe gesture toward the left side of the screen, and the movement of contact 810 shown in FIGS. 8A-8C represents a swipe gesture toward the right side of the screen.
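Distinguishing the two predefined directions amounts to classifying a gesture by its predominant axis of movement. The sketch below is a simplified assumption: real gesture recognizers also apply distance and velocity thresholds.

```python
# Hypothetical direction classifier for a drag or swipe gesture, using only
# the start and end points of the contact.

def classify_direction(start: tuple, end: tuple) -> str:
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) >= abs(dy):                     # predominantly horizontal
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"          # screen y grows downward
```

A leftward result would trigger the remove filter and a rightward result the "keep only" filter; a perpendicular (up/down) direction could serve as the second predefined direction instead, as the description allows.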
- In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device displays (2030) an indicium that the first category has been removed.
- In some embodiments, an indicium is displayed that indicates that data corresponding to the first category has been filtered out of the data that is used to create various related charts, such as the first chart and the second chart. For example, FIG. 5C shows indicium 504 indicating that category 502-1 has been removed.
- The device changes (2032) from displaying the first chart with the first set of categories, other than the first category, to displaying a second chart.
- FIGS. 6A-6C show an animated transition from a first chart shown in FIG. 6A to a second chart shown in FIG. 6C while continuing to display indicium 504.
- The second chart concurrently displays (2034) a second set of categories that are distinct from the first set of categories.
- Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart.
- FIG. 6A shows a chart including categories 502.
- FIG. 6C shows a second chart including categories 604, distinct from categories 502, and bars for each of categories 604.
- While displaying the second chart with the second set of categories, the device detects (2036) a second touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the indicium that the first category has been removed.
- FIG. 7A shows a chart distinct from the chart shown in FIG. 6A and including categories 612 and indicium 504 .
- FIG. 7A also shows the device detecting contact 710 at position 710 - a corresponding to indicium 504 .
- In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the first category has been removed, the device updates (2038) display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart.
- Data that corresponds to the first category, which was filtered out of the first chart and remained filtered out when the second chart was initially displayed, is added to the second chart, and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the addition of the data that corresponds to the first category.
- FIGS. 7A-7C show the device detecting contact 710 moving from position 710 - a in FIG. 7A to position 710 - c in FIG. 7C .
- FIGS. 7A-7C also show an animated transition of the chart updating to reflect inclusion of the data from category 502 - 1 .
- Updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart includes reordering (2040) display of the second set of categories in the second chart. For example, if the second set of categories in the second chart is ordered largest to smallest, and adding in the data that corresponds to the first category in the first chart changes the order of the second set of categories, then the display of the second chart is updated to reflect the changed order, for example via an animated rearrangement of the second set of categories as shown in FIGS. 7A-7C.
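The reordering in step (2040) follows from re-aggregating and re-sorting once the filter is lifted. The Python below is a hypothetical sketch (function and data names are illustrative):

```python
# Hypothetical model of step (2040): adding filtered-out data back can
# change the largest-to-smallest ordering of the displayed categories.

def chart_totals(records, excluded=()):
    """Aggregate (category, value) records, skipping excluded categories,
    then sort the chart largest to smallest."""
    totals = {}
    for category, value in records:
        if category not in excluded:
            totals[category] = totals.get(category, 0) + value
    return dict(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))

records = [("West", 300), ("East", 250), ("West", 100), ("East", 250)]
filtered = chart_totals(records, excluded=("East",))   # East filtered out
restored = chart_totals(records)                       # filter removed
```

Removing the filter here moves "East" ahead of "West", the kind of order change that the animated rearrangement would display.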
- The device detects (2042) a third touch input, for example a tap gesture, a swipe gesture, or a drag gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a predefined area that displays one or more indicia of data filters, such as the area that displayed the indicium that the first category had been removed.
- FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504 .
- In response to detecting the third touch input, the device updates (2044) display of the second chart to reflect removal of data that corresponds to the first category in the first chart.
- Data that corresponds to the first category, which was added to the second chart in response to the second touch input (e.g., a rightward swipe or drag gesture), is removed in response to the third touch input (e.g., a leftward swipe or drag gesture), and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the removal of the data that corresponds to the first category.
- FIG. 7C shows a bar chart including categories 612.
- FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504 and an update to the bar chart to reflect exclusion of data corresponding to category 502 - 1 .
- FIGS. 21A-21F are flow diagrams illustrating method 2100 of data visualization, in accordance with some embodiments.
- Method 2100 is performed at an electronic device (e.g., portable multifunction device 100 , FIG. 1 , or device 200 , FIG. 2 ) with a display and a touch-sensitive surface.
- In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display.
- In some embodiments, the display is separate from the touch-sensitive surface.
- In some embodiments, method 2100 is governed by instructions that are stored in a non-transitory computer-readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B.
- Some operations in method 2100 are, optionally, combined and/or the order of some operations is, optionally, changed.
- Method 2100 provides an intuitive way to change chart categories. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when changing chart categories, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to switch categories faster and more efficiently conserves power and increases the time between battery charges.
- The device displays (2102) a first chart on the display.
- FIG. 6A shows UI 601 including a bar chart.
- The first chart is derived (2104) from a set of data.
- For example, the chart in FIG. 6A is derived from a set of data in file 402 shown in FIG. 4A.
- The first chart concurrently displays (2106) a first set of categories and a label for the first set of categories.
- FIG. 6A shows a chart with chart label 602-1 including categories 502, each with a corresponding category label 408.
- Each respective category in the first set of categories has ( 2108 ) a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories.
- For example, a respective category in the bar chart has a corresponding bar that represents a value for the sum of sales for that category.
- In this example, "sales" is the first field and the aggregation type is SUM.
- Each of the records in the underlying data set is included in one of the categories and is aggregated with other records from the same category.
- In the second chart, the same first field "sales" is used and the same aggregation type SUM is used, but now the underlying records are grouped according to a different set of categories.
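The aggregation just described can be sketched in Python. This is a hypothetical model under stated assumptions: the record layout, field names ("region", "quarter", "sales"), and the `aggregate` helper are illustrative, and the aggregation types are the ones named in the description (SUM, MAX, MIN, AVERAGE, COUNT).

```python
# Hypothetical sketch: the same first field ("sales") and aggregation type
# (SUM) grouped by different sets of categories yields different charts.

def aggregate(records, field, category_key, agg="SUM"):
    groups = {}
    for rec in records:
        groups.setdefault(rec[category_key], []).append(rec[field])
    fns = {"SUM": sum, "MAX": max, "MIN": min,
           "AVERAGE": lambda xs: sum(xs) / len(xs), "COUNT": len}
    return {cat: fns[agg](vals) for cat, vals in groups.items()}

records = [
    {"region": "East", "quarter": "Q1", "sales": 100},
    {"region": "East", "quarter": "Q2", "sales": 150},
    {"region": "West", "quarter": "Q1", "sales": 200},
]
by_region = aggregate(records, "sales", "region")    # first chart
by_quarter = aggregate(records, "sales", "quarter")  # second chart
```

Every record lands in exactly one group per category set, so swiping between category sets re-partitions the same underlying records.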
- The device detects (2110) a first touch input (e.g., a swipe gesture or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories.
- FIG. 6A shows the device detecting contact 610 at position 610 - a corresponding to chart label 602 - 1 .
- The first touch input is (2112) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture).
- The movement of contact 610 shown in FIGS. 6A-6C represents a swipe gesture toward the left side of the screen.
- In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, the device replaces (2114) display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition.
- FIGS. 6A-6C show an animated transition where the device replaces the first chart with chart label 602 - 1 with a second chart with chart label 602 - 2 in concert with movement of contact 610 .
- The second chart is derived (2116) from the set of data.
- For example, the chart in FIG. 6C is derived from a set of data in file 402 shown in FIG. 4A.
- The second chart concurrently displays (2118) a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories.
- In some embodiments, the label for the first set of categories remains displayed while the finger contact in the first touch input remains in continuous contact with the touch-sensitive surface, and the label for the first set of categories ceases to be displayed (e.g., fades out) in response to detecting lift off of the finger contact in the first touch input from the touch-sensitive surface.
- FIG. 6A shows a first chart with chart label 602-1 including categories 502, each with a corresponding category label 408.
- FIGS. 6B-6C show an animated transition where the first chart is replaced with a second chart with chart label 602-2 and including categories 604, where chart label 602-2 is distinct from chart label 602-1 and categories 604 are distinct from categories 502.
- Each respective category in the second set of categories has ( 2120 ) a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
- For example, a respective category in the bar chart has a corresponding bar that represents a value for the sum of sales for that category.
- In some embodiments, a label for the first field and aggregation type is displayed (2122) with the first chart, and the label for the first field and aggregation type continues to be displayed with the second chart.
- FIG. 6A shows a first chart including field label 606.
- FIG. 6C shows a second chart including field label 606 .
- In some embodiments, a label for the first field and aggregation type (e.g., SUM, MAX, MIN, AVERAGE, COUNT) is displayed (2124) with the first chart.
- FIG. 6A shows a first chart including field label 606 .
- In response to detecting the first touch input, the device: displays (2126) an animation of the second set of categories replacing the first set of categories; displays an animation of the label for the second set of categories replacing the label for the first set of categories; and maintains display of the label for the first field and aggregation type.
- FIGS. 6A-6C show an animated transition where the device replaces the first chart with chart label 602 - 1 with a second chart with chart label 602 - 2 in concert with movement of contact 610 while maintaining display of field label 606 .
- While displaying the second chart with the second set of categories, the device detects (2128) a second touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of an indicium that a predefined subset of data is not included in the aggregated values of the first field.
- FIG. 7A shows a chart distinct from the chart shown in FIG. 6A and including categories 612 and indicium 504 .
- FIG. 7A also shows the device detecting contact 710 at position 710 - a corresponding to indicium 504 .
- In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the predefined subset of data is not included in the aggregated values of the first field, the device updates (2130) display of the second chart to reflect inclusion of the predefined subset of data in the aggregated values.
- Data that was filtered out of the second set of data is added to the second set of data, and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the addition of the data that was previously filtered out.
- FIGS. 7A-7C show the device detecting contact 710 moving from position 710 - a in FIG. 7A to position 710 - c in FIG. 7C .
- FIGS. 7A-7C also show an animated transition of the chart updating to reflect inclusion of the data from category 502 - 1 .
- Updating display of the second chart to reflect inclusion of the predefined subset of data includes reordering (2132) display of the second set of categories in the second chart. For example, if the second set of categories in the second chart is ordered largest to smallest, and adding in the predefined subset of data changes the order of the second set of categories, then the display of the second chart is updated to reflect the changed order, for example via an animated rearrangement of the second set of categories as shown in FIGS. 7A-7C.
- The device detects (2134) a third touch input, for example a tap gesture, a swipe gesture, or a drag gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a predefined area that displays one or more indicia of data filters, such as the area that displayed the indicium that the predefined subset of data is not included in the second set of data.
- FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504 .
- In response to detecting the third touch input, the device updates (2136) display of the second chart to reflect removal of the predefined subset of data.
- The predefined subset of data, which was added to the second chart in response to the second touch input (e.g., a rightward swipe or drag gesture), is removed in response to the third touch input (e.g., a leftward swipe or drag gesture), and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the removal of the predefined subset of data.
- FIG. 7C shows a bar chart including categories 612.
- FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504 and an update to the bar chart to reflect exclusion of data corresponding to category 502 - 1 .
- Replacing display of the first chart with the second chart via the animated transition in response to detecting the first touch input occurs (2138) without displaying a selection menu.
- That is, the device does not display a selection menu that contains possible sets of categories to display in the second chart.
- FIGS. 6A-6C show an animated transition where the device replaces the first chart with chart label 602 - 1 with a second chart with chart label 602 - 2 without displaying a selection menu.
- The first touch input is (2140) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
- The movement of contact 610 shown in FIGS. 6D-6F represents a swipe gesture toward the left side of the screen.
- While displaying the second chart, the device detects (2142) a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories.
- FIG. 6G shows the chart as in FIG. 6F including categories 612 .
- FIG. 6G also shows the device detecting contact 630 at a position corresponding to chart label 602 - 3 , as shown in FIG. 6F .
- In response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, the device displays (2144) a selection menu with possible sets of categories to display in a third chart.
- FIG. 6G shows UI 613 including selection menu 616 in response to the device detecting contact 630 .
- The device detects (2146) selection of a respective set of categories in the selection menu, for example by detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the respective set of categories in the selection menu.
- FIG. 6H shows UI 615 including selection menu 616 , where selection menu 616 includes selection categories 618 .
- FIG. 6H also shows the device detecting contact 640 at a position corresponding to selection category 618 - 2 .
- In response to detecting selection of the respective set of categories in the selection menu, the device: replaces (2148) display of the second chart with a third chart that contains the selected respective set of categories; and ceases to display the selection menu.
- Thus, swipe or drag gestures on a chart label are used as a shortcut to quickly move between different chart types, whereas a tap gesture on the chart label is used to display a selection menu with available chart types and another tap gesture is used to select and display a particular chart type.
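The two-track interaction on the chart label (swipe as a shortcut, tap for a menu) can be sketched as a small dispatcher. This is a hypothetical model: the gesture names, chart names, and the cycling behavior of the swipe shortcut are illustrative assumptions.

```python
# Hypothetical dispatcher for input on a chart label: a swipe is a shortcut
# that moves to the next available chart, while a tap opens a selection menu.

def handle_label_gesture(charts, index, gesture):
    """Return (new chart index, menu or None). `charts` is an ordered list
    of available category sets."""
    if gesture == "swipe_left":
        return (index + 1) % len(charts), None    # shortcut: next chart
    if gesture == "tap":
        return index, list(charts)                # display selection menu
    return index, None

charts = ["by region", "by quarter", "by product"]
index, menu = handle_label_gesture(charts, 0, "swipe_left")   # next chart
index, menu = handle_label_gesture(charts, index, "tap")      # show menu
```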
- FIGS. 6G-6I show a transition between a first chart with categories 612 and a second chart with categories 622 .
- FIG. 6H shows UI 615 including the first chart and selection menu 616 , where selection menu 616 includes selection categories 618 .
- FIG. 6H also shows the device detecting contact 640 at a position corresponding to selection category 618 - 2 .
- FIG. 6I shows UI 617 including the second chart shown in response to the device detecting contact 640 .
- The first touch input is (2140) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
- The movement of contact 620 shown in FIGS. 6D-6F represents a swipe gesture toward the left side of the screen.
- While displaying the second chart, the device detects (2150) a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories.
- FIG. 6G shows the chart as in FIG. 6F including categories 612 .
- FIG. 6G also shows the device detecting contact 630 at a position corresponding to chart label 602 - 3 , as shown in FIG. 6F .
- In response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, the device displays (2152) a selection menu with possible sets of categories to display in a third chart.
- FIG. 6G shows UI 613 including selection menu 616 in response to the device detecting contact 630 .
- The device detects (2154) selection of a first set of categories in the selection menu and a second set of categories in the selection menu, for example by detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the first set of categories in the selection menu and detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the second set of categories in the selection menu.
- FIG. 6H shows UI 615 including selection menu 616 , where selection menu 616 includes selection categories 618 .
- Here, the device detects selection of a plurality of selection categories 618.
- In accordance with detecting selection of the first set of categories in the selection menu and the second set of categories in the selection menu, the device: replaces (2156) display of the second chart with a third chart that contains the first set of categories and the second set of categories; and ceases to display the selection menu.
- For example, the third chart contains categories 612 as shown in FIG. 6G and categories 622 as shown in FIG. 6I.
- FIGS. 22A-22B are flow diagrams illustrating method 2200 of data visualization, in accordance with some embodiments.
- Method 2200 is performed at an electronic device (e.g., portable multifunction device 100 , FIG. 1 , or device 200 , FIG. 2 ) with a display and a touch-sensitive surface.
- In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display.
- In some embodiments, the display is separate from the touch-sensitive surface.
- In some embodiments, method 2200 is governed by instructions that are stored in a non-transitory computer-readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B.
- Some operations in method 2200 are, optionally, combined and/or the order of some operations is, optionally, changed.
- Method 2200 provides an intuitive way to adjust chart magnification (e.g., zooming in and/or zooming out the chart view). This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when adjusting chart magnification, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust magnification faster and more efficiently conserves power and increases the time between battery charges.
- The device displays (2202) a first chart on the display.
- FIG. 11A shows UI 1101 including a chart.
- The chart has (2204) a horizontal axis and a vertical axis.
- For example, the chart in FIG. 11A has a vertical axis (Money) and a horizontal axis (Time).
- The horizontal axis includes (2206) first horizontal scale markers.
- For example, the chart in FIG. 11A has a horizontal axis (Time) with month markers (e.g., February and March).
- The vertical axis includes (2208) first vertical scale markers.
- For example, the chart in FIG. 11A has a vertical axis (Money) with thousand-dollar markers (e.g., $1,000 through $4,000).
- The device detects (2210) a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart.
- FIG. 11A shows the device detecting contact 1110 at position 1110 - a and contact 1120 at position 1120 - a.
- The first touch input is (2212) a de-pinch gesture.
- For example, the movement of contacts 1110 and 1120 shown in FIGS. 11A and 11B represents a de-pinch gesture.
- FIG. 11A shows UI 1101 including a chart with a vertical axis (Money) and a horizontal axis (Time).
- FIG. 11A also shows the device detecting contact 1110 at position 1110 - a and contact 1120 at position 1120 - a .
- FIG. 11B shows the device detecting contact 1110 at position 1110 - b and contact 1120 at position 1120 - b and also shows the distance between the horizontal markers (e.g., month markers) increasing while the distance between the vertical markers remains the same.
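The de-pinch response can be modeled as an axis-range transform: the horizontal range narrows about its midpoint while the vertical range is untouched, so only the horizontal marker spacing grows. This is a hypothetical sketch; zooming about the gesture's own centroid rather than the axis midpoint would be a straightforward variation.

```python
# Hypothetical model of the de-pinch response: zoom the horizontal axis
# about its midpoint while maintaining the vertical scale.

def de_pinch(x_range, y_range, scale):
    """scale > 1 zooms in horizontally (narrower data range on the same
    screen width); the vertical range is returned unchanged."""
    x_min, x_max = x_range
    mid = (x_min + x_max) / 2
    half = (x_max - x_min) / (2 * scale)
    return (mid - half, mid + half), y_range

new_x, new_y = de_pinch((0.0, 100.0), (0.0, 4000.0), scale=2.0)
```

A pinch gesture (the second touch input above) is the same transform with `scale < 1`, which widens the horizontal range and decreases the marker spacing.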
- In some embodiments, the method further includes detecting a second touch input; and, while detecting the second touch input: horizontally shrinking a portion of the chart such that a distance between the first horizontal scale markers decreases; and maintaining a vertical scale of the chart such that a distance between the first vertical scale markers remains the same.
- FIG. 12B shows UI 1207 including a chart with a vertical axis (Money) and a horizontal axis (Time).
- FIG. 12B also shows the device detecting contact 1210 at position 1210-b and contact 1220 at position 1220-b.
- FIGS. 12C and 12D show the device detecting contact 1210 at positions 1210-c and 1210-d and contact 1220 at positions 1220-c and 1220-d, respectively.
- FIGS. 12C and 12D also show the distance between the horizontal markers (e.g., hour markers) decreasing while the distance between the vertical markers remains the same.
- The second touch input is a pinch gesture.
- For example, the movement of contacts 1210 and 1220 shown in FIGS. 12B-12D represents a pinch gesture.
- In some embodiments, the method further includes detecting a third touch input; and, while detecting the third touch input, adjusting the chart view and the horizontal axis of the chart corresponding to the third touch input.
- In some embodiments, the third touch input is a drag gesture and the method includes: detecting movement of a finger contact in the drag gesture across the touch-sensitive surface and adjusting the chart view and horizontal axis of the chart accordingly.
- FIG. 15B shows UI 1503 including a chart with a first chart view.
- FIG. 15B also shows the device detecting contact 1510 at position 1510 - b .
- FIG. 15C shows the device detecting contact 1510 at position 1510 - c (to the left of position 1510 - b ).
- FIG. 15C also shows UI 1505 including a chart with a second chart view (e.g., shifted to the left compared to the first chart view).
- after horizontally expanding the portion of the chart and maintaining the vertical scale of the chart while detecting the first touch input, the device ceases (2216) to detect the first touch input.
- FIG. 11B shows UI 1103 with a horizontally expanded portion of the chart shown in FIG. 11A .
- FIG. 11B also shows the device detecting contacts 1110 and 1120 and the vertical scale of the chart remaining the same as the scale shown in FIG. 11A .
- in response to ceasing to detect the first touch input (e.g., detecting lift-off of the fingers in the first touch input), the device changes (2218) a vertical scale of the chart.
- the vertical scale is adjusted so that all of the data marks are visible within a predefined margin. For example, in FIGS. 11C and 11D , the device ceases to detect contacts 1110 and 1120 and the vertical scale of the chart is adjusted such that the distance between the vertical markers increases.
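The lift-off rescaling can be approximated by picking a vertical range that pads the extreme ordinates by a small margin; `margin_fraction` is an assumed tunable, since the text only says the margin is predefined.

```python
def fit_vertical_scale(ordinates, margin_fraction=0.05):
    """After the zoom gesture ends, choose a vertical range that keeps
    every visible data mark inside the plot area, with a small margin
    above and below (margin_fraction is an illustrative assumption)."""
    lo, hi = min(ordinates), max(ordinates)
    span = (hi - lo) or 1.0  # avoid a zero-height range for flat data
    pad = span * margin_fraction
    return lo - pad, hi + pad
```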
- after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases (2220), the device, while continuing to detect the first touch input: continues (2222) to horizontally expand a portion of the chart; displays second horizontal scale markers, the second horizontal scale markers being at a finer scale than the first horizontal scale markers; and continues to maintain the vertical scale of the chart.
- the horizontal scale markers change from years to months, months to weeks, weeks to days, or days to hours, as shown in FIGS. 11A-11J .
- the second scale markers are displayed in addition to the first scale markers.
- the original first scale markers may be years, but when the horizontal scale is expanded, month scale markers are shown as well.
- FIG. 11A shows a chart with horizontal markers denoting months.
- FIG. 11B shows a chart with horizontal markers denoting months and horizontal markers denoting days.
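The years-to-months-to-weeks-to-days-to-hours progression amounts to choosing the finest scale whose markers remain legibly spaced. A minimal sketch, assuming a pixel-spacing threshold (the text does not specify how the finer scale is chosen):

```python
# Choose the finest horizontal scale whose markers stay at least
# MIN_SPACING pixels apart; the threshold is an illustrative assumption.
MIN_SPACING = 40  # minimum pixels between adjacent markers

# (unit name, approximate unit width in days), coarse to fine
UNITS = [("year", 365.0), ("month", 30.0), ("week", 7.0),
         ("day", 1.0), ("hour", 1.0 / 24.0)]

def marker_unit(pixels_per_day):
    """Return the finest unit (years, months, weeks, days, or hours)
    whose markers are still readable at the current zoom level."""
    chosen = "year"
    for name, days in UNITS:
        if pixels_per_day * days >= MIN_SPACING:
            chosen = name  # a finer unit still fits; keep refining
    return chosen
```

As the de-pinch gesture increases pixels-per-day, the markers progress from years toward hours, matching FIGS. 11A-11J.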
- after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases (2224), the device, while continuing to detect the first touch input: continues (2226) to horizontally expand a portion of the chart; replaces a first set of displayed data marks with a second set of displayed data marks, where for at least some of the data marks in the first set of data marks, an individual data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks; and continues to maintain the vertical scale of the chart.
- FIG. 11A shows a chart including data marks 1102 (e.g., data mark 1102 - 1 and data mark 1102 - 3 ).
- FIG. 11B shows a chart including data marks 1102 (e.g., data mark 1102-1 and data mark 1102-3) and data marks 1104 (e.g., data mark 1104-1).
- a single data mark (e.g., a circle, square, triangle, bar, or other representation of data points) that corresponds to multiple data points is replaced by a plurality of data marks that correspond to the multiple data points.
- the first set of data marks is replaced with the second set of data marks at the same time that the first horizontal scale marks are replaced with the second, finer horizontal scale marks.
- the data in a portion of the chart can be displayed at successively finer levels of granularity as a portion of the chart expands horizontally.
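One way to realize this replacement, assuming yearly aggregates backed by per-month values (a hypothetical data model; the disclosure does not specify one):

```python
def expand_marks(yearly_marks, monthly_data):
    """Replace each aggregated yearly data mark with its twelve monthly
    marks once horizontal expansion crosses the finer-granularity
    threshold. yearly_marks is a list of (year, aggregate) pairs and
    monthly_data maps a year to its twelve monthly values (both shapes
    are illustrative assumptions)."""
    finer = []
    for year, _aggregate in yearly_marks:
        for month, value in enumerate(monthly_data[year], start=1):
            finer.append(((year, month), value))
    return finer
```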
- FIGS. 23A-23B are flow diagrams illustrating method 2300 of data visualization, in accordance with some embodiments.
- Method 2300 is performed at an electronic device (e.g., portable multifunction device 100 , FIG. 1 , or device 200 , FIG. 2 ) with a display and a touch-sensitive surface.
- the display is a touch screen display and the touch-sensitive surface is on the display.
- the display is separate from the touch-sensitive surface.
- method 2300 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200 , as shown in FIGS. 3A-3B .
- Some operations in method 2300 are, optionally, combined and/or the order of some operations is, optionally, changed.
- method 2300 provides an intuitive way to display information about a data mark. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when accessing information about a data mark, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to access data mark information faster and more efficiently conserves power and increases the time between battery charges.
- the device displays ( 2302 ) at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks (e.g., circles, squares, triangles, bars, or other representations of data points).
- FIG. 19D shows UI 1907 including a chart with data marks 1902 , 1904 , and 1906 .
- the device detects ( 2304 ) a first touch input (e.g., a de-pinch gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart.
- FIG. 19D shows the device detecting contact 1920 and contact 1922 .
- the device zooms ( 2306 ) in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks.
- FIG. 19E shows a zoomed-in view of the chart shown in FIG. 19D in response to the device detecting contacts 1920 and 1922, and both FIG. 19D and FIG. 19E show data mark 1906-1.
- While displaying the second portion of the chart at the second magnification, the device detects (2308) a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart.
- FIG. 19E shows the device detecting contact 1930 at position 1930 - a and contact 1940 at position 1940 - a.
- the second touch input is ( 2310 ) a same type of touch input as the first touch input (e.g., both the first touch input and the second touch input are de-pinch gestures).
- contacts 1920 and 1922 shown in FIG. 19D and contacts 1930 and 1940 shown in FIGS. 19E-19H each represent a de-pinch gesture.
- In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart (2312), in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, the device zooms (2314) in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks.
- In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart (2312), in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, the device displays (2316) information about the first data mark. In some embodiments, while displaying information about the first data mark, the device detects a third touch input on the touch-sensitive surface; and in response to detecting the third touch input, ceases to display the information about the first data mark and displays a fourth portion of the chart. In some embodiments, the fourth portion of the chart is the second portion of the chart.
- FIGS. 19E-19I show movement of contacts 1930 and 1940 representing a de-pinch gesture.
- FIGS. 19E-19I also show an animated transition from UI 1909 including data mark 1908 - 1 to UI 1917 including record 1914 - 1 (e.g., information about data mark 1908 - 1 ).
- the information about the first data mark comprises ( 2318 ) a data record that corresponds to the first data mark.
- FIGS. 19E-19I also show an animated transition from UI 1909 including data mark 1908 - 1 to UI 1917 including record 1914 - 1 .
- the data-mark-information-display criteria include ( 2320 ) the second magnification being a predefined magnification. For example, if the first touch input zooms in the chart to a predefined maximum magnification, then the second touch input causes display of the information about the first data mark, instead of (or in addition to) causing continued zooming in of the chart.
- the data-mark-information-display criteria include ( 2322 ) the first data mark in the plurality of data marks being the only data mark displayed at the second magnification after the first touch input. For example, if the first touch input zooms in the chart so that only the first data mark is displayed, then the second touch input causes display of the information about the first data mark, instead of (or in addition to) causing continued zooming in of the chart.
- the data-mark-information-display criteria include ( 2324 ) the first data mark reaching a predefined magnification during the second touch input. In some embodiments, if the first data mark reaches a predefined magnification during the second touch input (e.g., during a de-pinch gesture), then the device zooms in during the second touch input prior to reaching the predefined magnification, and the device displays the information about the first data mark after reaching the predefined magnification (with or without continuing to zoom in the chart during the remainder of the second touch input).
- the data-mark-information-display criteria include ( 2326 ) the device zooming in to display only the first data mark in the plurality of data marks during the second touch input. In some embodiments, if during the second touch input (e.g., a de-pinch gesture), the device zooms in such that the first data mark is the only data mark that is displayed, the device displays the information about the first data mark after the first data mark is the only data mark that is displayed (with or without continuing to zoom in the chart during the remainder of the second touch input).
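A plausible combined reading of criteria 2320-2326 is a simple disjunction; the parameter names are illustrative assumptions, not terms from the disclosure:

```python
def should_show_info(magnification, max_magnification, visible_marks):
    """Decide whether a de-pinch should display the data record instead
    of (or in addition to) zooming further: the criteria are met when
    the chart has reached a predefined (maximum) magnification, or when
    the first data mark is the only data mark displayed."""
    return magnification >= max_magnification or visible_marks == 1
```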
- in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, the device ceases (2328) to display the first data mark.
- display of the first data mark is replaced by display of a data record that corresponds to the first data mark when the one or more predefined data-mark-information-display criteria are met (e.g., via an animated transition).
- FIGS. 19E-19I also show an animated transition where display of record 1914 - 1 replaces display of data mark 1908 - 1 .
- FIGS. 24A-24E are flow diagrams illustrating method 2400 of data visualization, in accordance with some embodiments.
- Method 2400 is performed at an electronic device (e.g., portable multifunction device 100 , FIG. 1 , or device 200 , FIG. 2 ) with a display and a touch-sensitive surface.
- the display is a touch screen display and the touch-sensitive surface is on the display.
- the display is separate from the touch-sensitive surface.
- method 2400 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200 , as shown in FIGS. 3A-3B .
- Some operations in method 2400 are, optionally, combined and/or the order of some operations is, optionally, changed.
- method 2400 provides an intuitive way to select portions of a chart and/or display information about the underlying data. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when selecting chart areas, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to select portions of a chart and view information about the underlying data faster and more efficiently conserves power and increases the time between battery charges.
- the device displays ( 2402 ) a chart on the display, the chart including a plurality of data marks.
- FIG. 13A shows UI 1301 including a chart with data marks 1312 (e.g., data mark 1312-1 through data mark 1312-10).
- data marks in the plurality of data marks are displayed ( 2404 ) in corresponding columns in the chart, with a single data mark per column.
- data marks 1312 in FIG. 13A are displayed such that each data mark is in a separate column of the chart.
- data marks in the plurality of data marks are separated ( 2406 ) horizontally from one another.
- data marks 1312 in FIG. 13A are displayed such that the data marks are separated horizontally from one another.
- the device detects ( 2408 ) a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart (e.g., a bar in a bar chart), the first predefined area having a corresponding first value.
- FIG. 13A shows the device having detected contact 1309 at a position corresponding to selected portion 1302 .
- the first touch input is ( 2410 ) a tap gesture.
- contact 1309 shown in FIG. 13A represents a tap gesture.
- the first predefined area includes ( 2412 ) a column in the chart.
- FIG. 13A shows UI 1301 including a chart, and selected portion 1302 shown in FIG. 13A includes a column of the chart.
- the first predefined area includes ( 2414 ) a single data mark in the plurality of data marks.
- selected portion 1302 shown in FIG. 13A includes only data mark 1312 - 1 .
- In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart (2416), the device: selects (2418) the first predefined area; and visually distinguishes the first predefined area.
- FIG. 13A shows the device having detected contact 1309 at a position corresponding to selected portion 1302 .
- FIG. 13A also shows selected portion 1302 visually distinguished from the remainder of the chart.
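Hit-testing the tap to a column (predefined area) and marking it selected might look like this sketch; the geometry parameters are assumptions, not from the disclosure:

```python
def column_at(tap_x, chart_left, column_width):
    """Map a tap's x coordinate to the index of the column (predefined
    area) that contains it."""
    return int((tap_x - chart_left) // column_width)

def select_column(selected, index):
    """Add a column to the selection set so it can be drawn visually
    distinguished (e.g., highlighted) from the rest of the chart."""
    selected.add(index)
    return selected
```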
- the device detects ( 2420 ) a second touch input on the touch-sensitive surface.
- FIG. 13A shows selected portion 1302 and the device detecting contact 1310 .
- the second touch input is initially detected ( 2422 ) at a location on the touch-sensitive surface that corresponds to a location on the display of the first predefined area.
- FIG. 13A shows the device detecting contact 1310 at position 1310 - a corresponding to a part of selected portion 1302 .
- the second touch input is initially detected ( 2424 ) at a location on the touch-sensitive surface that corresponds to a location on the display of an edge of the first predefined area. For example, contact 1310 in FIG. 13A is detected on the edge of selected portion 1302 .
- the second touch input is initially detected ( 2426 ) at a location on the touch-sensitive surface that corresponds to a location on the display of a selection handle in or next to the first predefined area. For example, contact 1310 in FIG. 13A is detected at a position corresponding to the location of a handle for selected portion 1302 .
- the second touch input is ( 2428 ) a drag gesture.
- the movement of contact 1310 shown in FIGS. 13A-13D represents a drag gesture toward the right side of the screen.
- the device detects ( 2430 ) movement of a finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values.
- FIGS. 13B-13D show movement of contact 1310 across multiple columns (e.g., a particular column associated with each respective data mark in data marks 1312 ) within the chart.
- FIGS. 13B-13D also show the columns being added to the selected portion in accordance with the movement of contact 1310 .
- in response to detecting movement of the finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values, the device displays (2432) a series of changes between the first value in the first predefined area and the corresponding values of the sequence of predefined areas.
- FIGS. 13B-13D show columns being added to the selected portion in accordance with the movement of contact 1310 .
- FIGS. 13B-13D also show a change value denoting the change between the value of data mark 1312 - 1 and the value of the last selected data mark.
- FIG. 13D shows selection of data mark 1312 - 10 and a change value denoting the change between the value of data mark 1312 - 1 and the value of data mark 1312 - 10 .
- FIGS. 13B-13D show columns being added to the selected portion in accordance with the movement of contact 1310 .
- FIG. 13B shows contact 1310 at position 1310 - b and corresponding selected portion 1304 , where selected portion 1304 includes the columns in selected portion 1302 from FIG. 13A .
- FIG. 13C shows contact 1310 at position 1310 - c and corresponding selected portion 1306 , where selected portion 1306 includes the columns in selected portion 1304 from FIG. 13B .
- In response to detecting the second touch input on the touch-sensitive surface (2434), the device displays (2438) a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
- FIG. 13D shows selection of data mark 1312 - 10 and a change value denoting the change between the value of data mark 1312 - 1 and the value of data mark 1312 - 10 .
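The series of change values shown during the drag (2432) and the final change (2438) can both be computed as differences against the first selected value; a sketch under the assumption that column values live in a list:

```python
def running_changes(values, first_index, indices):
    """Series of change values shown as the finger crosses each
    successive column: each entry is the difference between the value
    under the finger and the value of the first selected column."""
    return [values[i] - values[first_index] for i in indices]
```

The last entry of the series is the final displayed change between the first and last selected data marks.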
- a selected area in the chart comprises ( 2440 ) the first predefined area and the sequence of predefined areas.
- FIGS. 13B-13D show columns being added to the selected portion in accordance with the movement of contact 1310 .
- FIG. 13D shows contact 1310 at position 1310 - d and corresponding selected portion 1308 , where selected portion 1308 includes the columns from the selected portions shown in FIGS. 13A-13C .
- the device detects ( 2442 ) a third touch input, the third touch input including initial contact of a finger at a location on the touch-sensitive surface that corresponds to a location on the display within the selected area in the chart, and movement of the finger across the touch-sensitive surface.
- FIG. 14A shows selected portion 1308 including data for three months (February through April).
- FIG. 14A also shows the device detecting contact 1402 at a position corresponding to selected portion 1308 .
- in response to detecting the third touch input (2444), the device moves (2446) the selected area across the chart, in accordance with the movement of the finger across the touch-sensitive surface, while maintaining a number of predefined areas in the moved selected area equal to the number of predefined areas in the sequence of predefined areas plus one. For example, in some embodiments, in response to detecting movement of contact 1402 toward the left side of the screen, the device moves selected portion 1308 to include data for months January through March (i.e., data for three months).
- in response to detecting the third touch input (2444), the device displays (2448) a change between a value corresponding to a leftmost predefined area in the moved selected area and a value corresponding to a rightmost predefined area in the moved selected area. For example, in some embodiments, in response to detecting movement of contact 1402 toward the left side of the screen, the device moves selected portion 1308 to include data for months January through March and the change value updates to denote the change in value between the leftmost data mark in the selected portion and the rightmost data mark in the selected portion.
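Sliding the selection while keeping its width fixed reduces to clamped index arithmetic; this sketch assumes columns are indexed left to right (an illustrative model, not the disclosed implementation):

```python
def move_selection(start_index, length, delta, column_count):
    """Slide a fixed-width selected window across the chart's columns,
    clamping so the window stays inside the chart. Returns the new
    (leftmost, rightmost) column indices; the change value is then
    recomputed from the values at those endpoints."""
    new_start = max(0, min(start_index + delta, column_count - length))
    return new_start, new_start + length - 1
```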
- a selected area in the chart comprises ( 2450 ) the first predefined area and the sequence of predefined areas.
- selected portion 1308 in FIG. 13D includes the columns from each of selected portions 1302 , 1304 , and 1306 .
- the device detects ( 2452 ) a fourth touch input (e.g., a de-pinch gesture).
- FIGS. 18A-18C show contacts 1810 and 1820 and the movement of contacts 1810 and 1820 shown in FIGS. 18A-18C represents a de-pinch gesture.
- in response to detecting the fourth touch input (2454), the device zooms (2456) in on the selected area in the chart.
- FIGS. 18A-18C show a chart with selected portion 1802 and FIGS. 18A-18C show the device zooming in on selected portion 1802 in accordance with the movement of contacts 1810 and 1820 .
- in response to detecting the fourth touch input (2454), in accordance with a determination that areas in the chart outside the selected area are still displayed on the display, the device maintains (2458) selection of the selected area. In some embodiments, while zooming in, the device maintains selection of the selected area. In some embodiments, after zooming in, the device maintains selection of the selected area. For example, FIGS. 18A-18C show the device zooming in on selected portion 1802 in accordance with the movement of contacts 1810 and 1820 and FIG. 18D shows the device maintaining selection of selected portion 1802.
- in response to detecting the fourth touch input (2454), in accordance with a determination that only areas in the chart in the selected area are displayed on the display, the device ceases (2460) selection of the selected area.
- the selected area disappears when the device zooms in to the chart such that no area in the chart outside the selected area is displayed.
- while zooming in, the device ceases selection of the selected area.
- after zooming in, the device ceases selection of the selected area.
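The determination in 2458/2460 can be expressed as a visibility test between the viewport and the selected range; the coordinate names are illustrative assumptions:

```python
def keep_selection(view_min, view_max, sel_min, sel_max):
    """Maintain the selection only while some of the chart outside the
    selected area is still visible; once the viewport lies entirely
    within the selected range, selection ceases."""
    outside_visible = view_min < sel_min or view_max > sel_max
    return outside_visible
```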
- FIGS. 25A-25D are flow diagrams illustrating method 2500 of data visualization, in accordance with some embodiments.
- Method 2500 is performed at an electronic device (e.g., portable multifunction device 100 , FIG. 1 , or device 200 , FIG. 2 ) with a display and a touch-sensitive surface.
- the display is a touch screen display and the touch-sensitive surface is on the display.
- the display is separate from the touch-sensitive surface.
- method 2500 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200 , as shown in FIGS. 3A-3B .
- Some operations in method 2500 are, optionally, combined and/or the order of some operations is, optionally, changed.
- method 2500 provides an intuitive way to update chart views. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen.
- the method reduces the cognitive burden on the user when adjusting a chart view (e.g., adjusting chart magnification), thereby creating a more efficient human-machine interface.
- For battery-operated electronic devices, enabling a user to adjust chart views faster and more efficiently conserves power and increases the time between battery charges.
- the device displays ( 2502 ) a chart on the display.
- FIG. 19A shows UI 1901 including a chart.
- the chart has ( 2504 ) a horizontal axis with a first horizontal scale with first horizontal scale markers.
- the chart in FIG. 19A has a horizontal scale with horizontal scale markers denoting years.
- the chart has ( 2506 ) a vertical axis with a first vertical scale with first vertical scale markers.
- the chart in FIG. 19A has a vertical scale with vertical scale markers denoting hundreds of sunspots.
- the chart includes ( 2508 ) a first set of data marks.
- the chart in FIG. 19A includes data marks 1902 .
- adjacent data marks in the first set of data marks are separated (2510) by a first horizontal distance.
- the first horizontal distance corresponds to the first horizontal scale.
- the chart in FIG. 19A includes a respective data mark 1902 for each year on the horizontal axis.
- Each respective data mark in the first set of data marks has ( 2512 ) a respective abscissa and a respective ordinate.
- each data mark 1902 in FIG. 19A has a respective abscissa and a respective ordinate.
- the chart includes ( 2514 ) a line that connects adjacent data marks in the first set of data marks.
- the chart in FIG. 19A includes a line that connects data marks 1902 .
- the device detects ( 2516 ) a first touch input (e.g., a de-pinch gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the chart.
- the movement of contacts 1910 and 1912 shown in FIGS. 19A-19D represents a de-pinch gesture.
- FIG. 19B shows an expanded portion of the chart shown in FIG. 19A .
- FIG. 19B shows the distance between horizontal scale markers being greater than the distance between horizontal scale markers in FIG. 19A .
- FIG. 19B shows the expanded portion of the chart in response to contacts 1910 and 1912 moving from positions 1910 - a and 1912 - a to positions 1910 - b and 1912 - b.
- While detecting the first touch input (2518), the device expands (2522) at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input.
- the expanded portion of the chart shown in FIG. 19B includes expanded portions of the line connecting data marks 1902 .
- While detecting the first touch input (2518), the device adds (2524) a second set of second data marks, distinct from the first set of data marks, on the line. For example, FIG. 19B shows data marks 1904 added to the line connecting data marks 1902.
- Each respective data mark in the second set of data marks includes ( 2526 ) a respective abscissa and a respective ordinate.
- each data mark 1904 shown in FIG. 19B includes a respective abscissa and a respective ordinate.
- Each respective data mark in the second set of data marks is ( 2528 ) placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark.
- each data mark 1904 shown in FIG. 19B is placed on the line based on its respective abscissa without regard to its respective ordinate.
- adjacent data marks in the second set of data marks are separated ( 2530 ) by a second horizontal distance that corresponds to a second horizontal scale that is finer than the first horizontal scale.
- the chart in FIG. 19B includes a respective data mark 1904 for each month and a respective data mark 1902 for each year.
- each respective data mark in the second set of data marks is placed ( 2532 ) on the line based on the respective abscissa of the respective data mark and the ordinate of the line at the respective abscissa of the respective data mark.
- each data mark in data marks 1904 shown in FIG. 19B is placed on the line based on its respective abscissa and the ordinate of the line at its respective abscissa.
- a shape of the line is maintained ( 2534 ) when the second set of data marks is added to the line.
- the shape of the line in FIG. 19B is maintained when data marks 1904 are added to the line.
- a single data mark in the first set of data marks corresponds ( 2536 ) to a plurality of data marks in the second set of data marks.
- each data mark in data marks 1902 corresponds to twelve data marks in data marks 1904 (e.g., one for each month in the year).
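Placing each finer mark on the existing line at its abscissa amounts to linear interpolation between the surrounding coarse marks; a sketch with hypothetical (abscissa, ordinate) pairs:

```python
def place_on_line(coarse_marks, fine_abscissas):
    """Initial placement of the finer (e.g., monthly) marks: each new
    mark is put on the existing line at its abscissa, i.e., its starting
    ordinate is interpolated between the surrounding coarse (e.g.,
    yearly) marks; it is later animated to its true ordinate."""
    placed = []
    for x in fine_abscissas:
        # find the coarse segment that contains x
        for (x0, y0), (x1, y1) in zip(coarse_marks, coarse_marks[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                placed.append((x, y0 + t * (y1 - y0)))
                break
    return placed
```

This keeps the shape of the line unchanged when the second set of data marks first appears, matching FIG. 19B.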
- the device ceases ( 2538 ) to display the set of first data marks when the second set of data marks is added. For example, in some embodiments, the device ceases to display data marks 1902 when data marks 1904 are added to the line.
- After adding the second set of data marks on the line (2540), for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, the device animatedly moves (2542) the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis.
- data marks 1904 are animatedly moved from their initial positions shown in FIG. 19B to their respective ordinate as shown in FIG. 19C .
- animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs ( 2544 ) while detecting the first input.
- data marks 1904 are animatedly moved from their initial positions shown in FIG. 19B to their respective ordinate as shown in FIG. 19C while the device continues to detect contacts 1910 and 1912 .
- animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs ( 2546 ) after ceasing to detect the first input.
- the second vertical scale is ( 2548 ) the same as the first vertical scale.
- After adding the second set of data marks on the line (2540), the device animatedly adjusts (2550) the line so that the line connects the second set of data marks.
- the line connecting data marks 1904 is animatedly adjusted from its initial position shown in FIG. 19B to its final position shown in FIG. 19C.
- animatedly moving each respective data mark vertically and animatedly adjusting the line so that the line connects the set of second data marks occur ( 2552 ) concurrently.
- the device ceases ( 2554 ) to display the set of first data marks after the second set of data marks is added.
- FIGS. 26A-26F illustrate how some embodiments allow scrolling through filter selections, with the data visualization updated immediately as the filter changes.
- These figures provide bar charts showing total sales for a three month period in 2014, and the data is filtered by region. In this example, the four regions are Central, East, South, and West.
- Filter indicia 2608 include a scrollable region indicator that indicates Central selection 2612 -C.
- user interface 2601 displays visual graphic 2610 -C, which shows data for the Central region.
- In FIG. 26B, the user wants to compare the Central region to the other regions, and the device detects a contact at position 2614 corresponding to Central selection 2612-C. At this time user interface 2601 still displays visual graphic 2610-C for the Central region.
- In FIG. 26C, the user has started scrolling upwards (e.g., using a swipe gesture), so the contact is moving upwards to position 2616.
- the scrollable region indicator is transitioning from “Central” to “East,” so selection 2612 -C/E is in an interim state.
- Visual graphic 2610 -C is still the graphic for the Central region.
- the scrollable region indicator displays East selection 2612 -E, as illustrated in FIG. 26D
- visual graphic 2610 -E, including data for the East region, is displayed.
- the contact is still moving upwards to position 2618 .
- In FIG. 26E , the contact has moved upward to position 2620 and the indicator shows “South” region selection 2612 -S.
- user interface 2601 displays visual graphic 2610 -S including data for the South region.
- the contact is still moving upwards to position 2622 , so the scrollable region indicator advances to West region selection 2612 -W, as illustrated in FIG. 26F .
- user interface 2601 displays the data for the West region in visual graphic 2610 -W.
- a user can quickly scroll through filter values, and the visual graphic updates according to the filter as different filter values are selected.
- the updates to the display depend on the scroll speed. For example, if the scrolling is performed slowly, the visual graphic is updated for each filter value as illustrated in FIGS. 26A-26F .
- if the values are scrolled quickly, a user is probably not interested in the intermediate values, and thus the visual graphic is not updated until the scrolling slows down or stops.
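The speed-dependent update behavior described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the threshold value, function names, and units are assumptions.

```python
# Hypothetical sketch of speed-gated filter updates: the chart is redrawn
# for an intermediate filter value only when scrolling is slow, or once
# scrolling stops. The threshold and its units are assumed for illustration.
SLOW_SCROLL_THRESHOLD = 2.0  # filter values per second (assumed)

def should_update_chart(scroll_speed, scrolling_stopped):
    """Return True when the visual graphic should be redrawn."""
    return scrolling_stopped or scroll_speed < SLOW_SCROLL_THRESHOLD

def visible_filter_updates(events):
    """events: list of (filter_value, scroll_speed, stopped) tuples.

    Returns the filter values for which the chart was actually redrawn.
    """
    return [value for value, speed, stopped in events
            if should_update_chart(speed, stopped)]
```

With slow scrolling every intermediate region triggers a redraw (as in FIGS. 26A-26F); with fast scrolling only the final region does.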
- a method executes at an electronic device with a touch-sensitive surface and a display.
- the method displays a filter indicium on the display that specifies a first category of a first set of categories, each category corresponding to a respective value of a first field in a data set.
- FIG. 26A includes filter indicium 2608 , which specifies the Central category 2612 -C.
- the Central category is one category of a first set of categories that includes Central, East, South, and West. Each of these categories corresponds to a value of a “region” field in the data set.
- the field values are “Central,” “East,” “South,” and “West,” but in some embodiments the values are encoded differently (e.g., using numeric codes, alphanumeric codes, or just the first letter of each region).
- the method concurrently displays a first chart on the display, such as the visual graphic 2610 -C in FIG. 26A .
- the first chart includes a plurality of visual marks, such as the vertical bars 2650 , 2652 , and 2654 in FIG. 26A .
- Each visual mark corresponds to a respective aggregated value of a first measure in the data set, aggregated according to a second field in the data set and filtered to aggregate only values of the first measure that are associated with the first category.
- each of the vertical bars is computed as an aggregate of sales (here using the SUM aggregate function).
- the field “sales” in the data set is a measure.
- the data for the vertical bars is aggregated by month, which is a second field in the data set (typically computed from a date field).
- the February bar 2650 represents the aggregated total sales for February.
- the aggregated values for the three bars 2650 , 2652 , and 2654 are based on only data for the Central region (i.e., sales records for the Central region).
- the data set may consist of a table in which each row has a region field, a sales field, and a date field.
- the sales data for the bars 2650 , 2652 , and 2654 is obtained by summing the sales values of rows whose region is Central.
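The filtered aggregation described above can be sketched in a few lines. The row layout and field names (`region`, `month`, `sales`) are assumptions for illustration; in practice the month would be derived from a date field.

```python
# Minimal sketch of the described aggregation: each bar is SUM(sales),
# grouped by month (the second field) and filtered to one region
# (the selected filter category). Row layout is an assumption.
from collections import defaultdict

def bar_values(rows, region):
    """rows: iterable of dicts with 'region', 'month', and 'sales' keys.

    Returns {month: total_sales} for the selected region, giving the
    height of each vertical bar in the chart.
    """
    totals = defaultdict(float)
    for row in rows:
        if row["region"] == region:
            totals[row["month"]] += row["sales"]
    return dict(totals)
```

When the filter indicium changes (e.g., from Central to East), recomputing `bar_values` with the new region yields the updated plurality of visual marks.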
- While displaying the first chart, the method detects a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the filter indicium. This is illustrated by the contact points 2614 and 2616 in FIGS. 26B and 26C .
- the method updates the indicium to specify a second category of the first set of categories. This is illustrated in FIG. 26D , where the filter indicium has changed to show the East region.
- the method displays an updated first chart on the display. As illustrated in FIG. 26D , the updated chart 2610 -E includes different bars, which are based on data for the East region.
- the updated first chart includes an updated plurality of visual marks, such as the vertical bars 2660 , 2662 , and 2664 in FIG. 26D .
- Each updated visual mark corresponds to a respective aggregated value of the first measure in the data set, aggregated according to the second field in the data set and filtered to aggregate only values of the first measure that are associated with the second category.
- each of the bars corresponds to aggregated sales, which is aggregated by month, and filtered to include data for just the East region.
- a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without changing the meaning of the description, so long as all occurrences of the “first contact” are renamed consistently and all occurrences of the “second contact” are renamed consistently.
- the first contact and the second contact are both contacts, but they are not the same contact.
- the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting,” that a stated condition precedent is true, depending on the context.
- the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
Abstract
Description
- This application is a continuation of U.S. application Ser. No. 15/859,235, filed Dec. 29, 2017, entitled “Methods and Devices for Adjusting Chart Magnification,” which is a continuation of U.S. application Ser. No. 14/603,330, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Magnification,” now U.S. Pat. No. 9,857,952, which claims priority to U.S. Provisional Application Ser. No. 62/047,429, filed Sep. 8, 2014, entitled “Methods and Devices for Manipulating Graphical Views of Data,” each of which is incorporated herein by reference in its entirety.
- This application is related to U.S. patent application Ser. No. 14/603,302, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Filters,” U.S. patent application Ser. No. 14/603,312, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Magnification Asymmetrically,” and U.S. patent application Ser. No. 14/603,322, filed Jan. 22, 2015, entitled “Methods and Devices for Displaying Data Mark Information,” each of which is incorporated by reference herein in its entirety.
- This invention relates generally to devices and methods for displaying graphical views of data, and more specifically to devices and methods for manipulating user interfaces that display graphical views of data.
- Data sets with hundreds of variables or more arise today in many contexts, including, for example: gene expression data for uncovering the link between the genome and the various proteins for which it codes; demographic and consumer profiling data for capturing underlying sociological and economic trends; sales and marketing data for huge numbers of products in vast and ever-changing marketplaces; and environmental measurements for understanding phenomena such as pollution, meteorological changes, and resource impact issues.
- Data visualization is a powerful tool for exploring large data sets, both by itself and coupled with data mining algorithms. Graphical views provide user-friendly ways to visualize and interpret data. However, the task of effectively visualizing large databases imposes significant demands on the human-computer interface to the visualization system.
- In addition, as computing and networking speeds increase, data visualization that was traditionally performed on desktop computers can also be performed on portable electronic devices, such as smart phones, tablets, and laptop computers. These portable devices typically use touch-sensitive surfaces (e.g., touch screens and/or trackpads) as input devices. These portable devices typically have significantly smaller displays than desktop computers. Thus, additional challenges arise in using touch-sensitive surfaces to manipulate graphical views of data in a user-friendly manner on portable devices.
- Consequently, there is a need for faster, more efficient methods and interfaces for manipulating graphical views of data. Such methods and interfaces may complement or replace conventional methods for visualizing data. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
- The above deficiencies and other problems associated with visualizing data are reduced or eliminated by the disclosed methods, devices, and storage mediums. Various implementations of methods, devices, and storage mediums within the scope of the appended claims each have several aspects, no single one of which is solely responsible for the attributes described herein. Without limiting the scope of the appended claims, after considering this disclosure, one will understand how the aspects of various implementations are used to visualize data.
- In one aspect, some embodiments include methods for visualizing data.
- In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a first chart on the display. The first chart concurrently displays a first set of categories, and each respective category in the first set of categories has a corresponding visual mark displayed in the first chart. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart. The method further includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart: removing the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition; and updating display of the first chart.
- In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
- In some embodiments, the method includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, ceasing to display the first visual mark.
- In some embodiments, the method includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, displaying an indicium that the first category has been removed.
- In some embodiments, the method includes, while displaying the indicium that the first category has been removed, changing from displaying the first chart with the first set of categories, other than the first category, to displaying a second chart. The second chart concurrently displays a second set of categories that are distinct from the first set of categories, and each respective category in the second set of categories has a corresponding visual mark displayed in the second chart. The method also includes, while displaying the second chart with the second set of categories, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the indicium that the first category has been removed and, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the first category has been removed, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart.
- In some embodiments, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart includes reordering display of the second set of categories in the second chart.
- In some embodiments, the method includes, after updating display of the second chart to reflect inclusion of data that corresponds to the first category, detecting a third touch input, and, in response to detecting a third touch input, updating display of the second chart to reflect removal of data that corresponds to the first category in the first chart.
- In some embodiments, the method includes, while displaying the first chart on the display, detecting a fourth touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a second visual mark for a second category in the first chart. The method also includes, in response to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart: maintaining display of the second category and the second visual mark in the second chart; removing display of all categories, other than the second category, in the first set of categories; and removing display of all visual marks, other than the second visual mark, that correspond to categories in the first set of categories.
- In some embodiments, the method includes, in response to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, displaying an indicium that only the second category in the first set of categories remains displayed.
- In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface and the fourth touch input is a drag gesture or a swipe gesture that moves in a second predefined direction on the touch-sensitive surface that is distinct from the first predefined direction.
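The two gestures above are distinguished by direction: a swipe in the first predefined direction excludes the touched category, while a swipe in a distinct second direction keeps only that category. A hedged sketch of such dispatch, with the direction names chosen purely for illustration:

```python
# Hypothetical dispatch of a swipe on a visual mark: one predefined
# direction excludes the touched category; a distinct direction keeps
# only that category. The "up"/"down" names are assumptions.
def apply_swipe(categories, touched, direction):
    """categories: list of displayed category names.

    Returns the categories that remain in the chart after the gesture.
    """
    if direction == "up":        # first predefined direction: exclude
        return [c for c in categories if c != touched]
    if direction == "down":      # second predefined direction: keep only
        return [c for c in categories if c == touched]
    return categories            # unrecognized gesture: no change
```

Either result would then drive the animated transition and the indicium showing what was removed or kept.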
- In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a first chart on the display. The first chart is derived from a set of data. The first chart concurrently displays a first set of categories and a label for the first set of categories. Each respective category in the first set of categories has a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories. The method further includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, replacing display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. The second chart is derived from the set of data. The second chart concurrently displays a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories. Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
- In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface.
- In some embodiments, a label for the first field and aggregation type is displayed with the first chart, and the label for the first field and aggregation type continues to be displayed with the second chart.
- In some embodiments, a label for the first field and aggregation type is displayed with the first chart and the method includes, in response to detecting the first touch input: displaying an animation of the second set of categories replacing the first set of categories; displaying an animation of the label for the second set of categories replacing the label for the first set of categories; and maintaining display of the label for the first field and aggregation type.
- In some embodiments, the method includes, while displaying the second chart with the second set of categories, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of an indicium that a predefined subset of data is not included in the aggregated values of the first field. The method also includes, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the predefined subset of data is not included in the aggregated values of the first field, updating display of the second chart to reflect inclusion of the predefined subset of data in the aggregated values.
- In some embodiments, updating display of the second chart to reflect inclusion of the predefined subset of data includes reordering display of the second set of categories in the second chart.
- In some embodiments, the method includes, after updating display of the second chart to reflect inclusion of the predefined subset of data, detecting a third touch input. The method also includes, in response to detecting a third touch input, updating display of the second chart to reflect removal of the predefined subset of data.
- In some embodiments, replacing display of the first chart with the second chart via the animated transition in response to detecting the first touch input occurs without displaying a selection menu.
- In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface and the method includes, while displaying the second chart, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. The method also includes, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, displaying a selection menu with possible sets of categories to display in a third chart. The method further includes detecting selection of a respective set of categories in the selection menu; and, in response to detecting selection of the respective set of categories in the selection menu: replacing display of the second chart with a third chart that contains the selected respective set of categories; and ceasing to display the selection menu.
- In some embodiments, the first touch input is a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface; and the method includes, while displaying the second chart, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. The method also includes, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, displaying a selection menu with possible sets of categories to display in a third chart. The method further includes detecting selection of a first set of categories in the selection menu and a second set of categories in the selection menu; and, in accordance with detecting selection of the first set of categories in the selection menu and the second set of categories in the selection menu: replacing display of the second chart with a third chart that contains the first set of categories and the second set of categories; and ceasing to display the selection menu.
- In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a chart on the display. The chart has a horizontal axis and a vertical axis. The horizontal axis includes first horizontal scale markers. The vertical axis includes first vertical scale markers. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. The method further includes, while detecting the first touch input: horizontally expanding a portion of the chart such that a distance between first horizontal scale markers increases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same.
- In some embodiments, the first touch input is a de-pinch gesture.
- In some embodiments, the method includes, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases and while continuing to detect the first touch input: continuing to horizontally expand a portion of the chart; displaying second horizontal scale markers, the second horizontal scale markers being at a finer scale than the first horizontal scale markers; and continuing to maintain the vertical scale of the chart.
- In some embodiments, the method includes, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases and while continuing to detect the first touch input: continuing to horizontally expand a portion of the chart; replacing a first set of displayed data marks with a second set of displayed data marks, where for at least some of the data marks in the first set of data marks, an individual data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks; and continuing to maintain the vertical scale of the chart.
- In some embodiments, the method includes, after horizontally expanding the portion of the chart and maintaining the vertical scale of the chart while detecting the first touch input, ceasing to detect the first touch input. The method also includes, in response to ceasing to detect the first touch input, changing a vertical scale of the chart.
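The asymmetric zoom above can be sketched as two steps: while the gesture is active, only the horizontal mapping is stretched; once the input ends, the vertical scale is refit. The data-to-pixel mappings below are simplified assumptions, not the patent's implementation.

```python
# Sketch of asymmetric zoom: during the de-pinch only the horizontal
# scale changes; the vertical scale is recomputed once the touch input
# ceases. Scale representations are assumptions for illustration.
def horizontal_expand(x_scale, y_scale, pinch_factor):
    """During the gesture: stretch x, hold y fixed."""
    return x_scale * pinch_factor, y_scale

def on_touch_end(visible_values, axis_height):
    """After the gesture: refit the vertical scale (pixels per data
    unit) so the tallest visible data mark fills the axis."""
    peak = max(visible_values)
    return axis_height / peak
```

Holding `y_scale` fixed during the gesture keeps the first vertical scale markers stationary, matching the claimed behavior.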
- In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart and, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, zooming in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks. The method further includes, while displaying the second portion of the chart at the second magnification, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart. The method further includes, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart: in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooming in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks; and, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displaying information about the first data mark.
- In some embodiments, the second touch input is a same type of touch input as the first touch input.
- In some embodiments, the information about the first data mark comprises a data record that corresponds to the first data mark.
- In some embodiments, the data-mark-information-display criteria include the second magnification being a predefined magnification.
- In some embodiments, the data-mark-information-display criteria include the first data mark in the plurality of data marks being the only data mark displayed at the second magnification after the first touch input.
- In some embodiments, the data-mark-information-display criteria include the first data mark reaching a predefined magnification during the second touch input.
- In some embodiments, the data-mark-information-display criteria include the device zooming in to display only the first data mark in the plurality of data marks during the second touch input.
- In some embodiments, the method includes, in accordance with the determination that one or more predefined data-mark-information-display criteria are met, ceasing to display the first data mark.
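The branch taken on the second touch input can be sketched as a single predicate over the data-mark-information-display criteria. The specific criteria and threshold below are assumptions drawn from the embodiments listed above (a predefined magnification, or only one data mark remaining visible):

```python
# Sketch of the decision on a second zoom gesture: when the predefined
# data-mark-information-display criteria are met, show the mark's
# information; otherwise continue zooming. The threshold is assumed.
def handle_second_zoom(magnification, marks_visible, info_magnification=8.0):
    """Return the action taken in response to the second touch input."""
    criteria_met = (magnification >= info_magnification
                    or marks_visible == 1)
    return "show_info" if criteria_met else "zoom_in"
```

In the "show_info" branch, some embodiments also cease displaying the first data mark, replacing it with its data record.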
- In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a chart on the display, the chart including a plurality of data marks and detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart, the first predefined area having a corresponding first value. The method also includes, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart: selecting the first predefined area and visually distinguishing the first predefined area. The method further includes, while the first predefined area is selected, detecting a second touch input on the touch-sensitive surface and, in response to detecting the second touch input on the touch-sensitive surface: visually distinguishing a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area; and displaying a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
- In some embodiments, the first touch input is a tap gesture.
- In some embodiments, the first predefined area includes a column in the chart.
- In some embodiments, the first predefined area includes a single data mark in the plurality of data marks.
- In some embodiments, data marks in the plurality of data marks are displayed in corresponding columns in the chart, with a single data mark per column.
- In some embodiments, data marks in the plurality of data marks are separated horizontally from one another.
- In some embodiments, the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of the first predefined area.
- In some embodiments, the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of an edge of the first predefined area.
- In some embodiments, the second touch input is initially detected at a location on the touch-sensitive surface that corresponds to a location on the display of a selection handle in or next to the first predefined area.
- In some embodiments, the second touch input is a drag gesture, and the method includes detecting movement of a finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values. The method also includes, in response to detecting movement of the finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values, displaying a series of changes between the first value in the first predefined area and the corresponding values of the sequence of predefined areas.
- In some embodiments, after the second touch input, a selected area in the chart comprises the first predefined area and the sequence of predefined areas, and the method includes detecting a third touch input, the third touch input including initial contact of a finger at a location on the touch-sensitive surface that corresponds to a location on the display within the selected area in the chart, and movement of the finger across the touch-sensitive surface. The method also includes, in response to detecting the third touch input: moving the selected area across the chart, in accordance with the movement of the finger across the touch-sensitive surface, while maintaining a number of predefined areas in the moved selected area equal to the number of predefined areas in the sequence of predefined areas plus one; and displaying a change between a value corresponding to a leftmost predefined area in the moved selected area and a value corresponding to a rightmost predefined area in the moved selected area.
- In some embodiments, after the second touch input, a selected area in the chart comprises the first predefined area and the sequence of predefined areas, and the method includes detecting a fourth touch input. The method also includes, in response to detecting the fourth touch input: zooming in on the selected area in the chart; in accordance with a determination that areas in the chart outside the selected area are still displayed on the display, maintaining selection of the selected area; and in accordance with a determination that only areas in the chart in the selected area are displayed on the display, ceasing selection of the selected area.
- In some embodiments, a method is performed at an electronic device with a touch-sensitive surface and a display. The method includes displaying a chart on the display. The chart has a horizontal axis with a first horizontal scale with first horizontal scale markers. The chart has a vertical axis with a first vertical scale with first vertical scale markers. The chart includes a first set of data marks. Each respective data mark in the first set of data marks has a respective abscissa and a respective ordinate. The chart includes a line that connects adjacent data marks in the first set of data marks. The method also includes detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart and, while detecting the first touch input: expanding at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input; expanding at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input; adding a second set of second data marks, distinct from the first set of data marks, on the line. Each respective data mark in the second set of data marks includes a respective abscissa and a respective ordinate. Each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark. 
The method further includes, after adding the second set of data marks on the line: for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moving the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis; and animatedly adjusting the line so that the line connects the second set of data marks.
- In some embodiments, adjacent data marks in the first set of data marks are separated by a first horizontal distance.
- In some embodiments, adjacent data marks in the second set of data marks are separated by a second horizontal distance that corresponds to a second horizontal scale that is finer than the first horizontal scale.
- In some embodiments, each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark and the ordinate of the line at the respective abscissa of the respective data mark.
- In some embodiments, a shape of the line is maintained when the second set of data marks is added to the line.
- In some embodiments, a single data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks.
- In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs while detecting the first input.
- In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs after ceasing to detect the first input.
- In some embodiments, the second vertical scale is the same as the first vertical scale.
- In some embodiments, animatedly moving each respective data mark vertically and animatedly adjusting the line so that the line connects the second set of data marks occur concurrently.
- In some embodiments, the method includes ceasing to display the first set of data marks when the second set of data marks is added.
- In some embodiments, the method includes ceasing to display the first set of data marks after the second set of data marks is added.
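The two-stage behavior described in the preceding embodiments — first placing finer-grained marks on the existing line at the line's ordinate for each abscissa, then animatedly moving each mark to its true ordinate — can be sketched with linear interpolation. This is a minimal illustrative sketch; the function names and the per-frame interpolation factor are assumptions, not the disclosed implementation.

```python
def interpolate_on_line(first_marks, new_xs):
    """Place new marks on the line connecting first_marks: each new mark
    initially gets the line's ordinate at its abscissa (its own ordinate
    is ignored at this stage), so the shape of the line is unchanged."""
    placed = []
    pts = sorted(first_marks)  # marks as (x, y), ordered by abscissa
    for x in new_xs:
        # Find the segment [x0, x1] containing x and interpolate linearly.
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                placed.append((x, y0 + t * (y1 - y0)))
                break
    return placed

def animate_to_ordinates(placed, true_marks, t):
    """One animation frame: move each placed mark a fraction t (0..1) of
    the way from its on-line position toward its true ordinate."""
    return [
        (x, y + t * (ty - y))
        for (x, y), (_tx, ty) in zip(placed, true_marks)
    ]
```

Stepping `t` from 0 to 1 over successive frames yields the animated vertical movement; redrawing the connecting line through the current positions each frame yields the animated line adjustment.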
- In another aspect, some embodiments include electronic devices for visualizing data. In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a first chart on the display. The first chart concurrently displays a first set of categories, and each respective category in the first set of categories has a corresponding visual mark displayed in the first chart. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart. The one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart: removing the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition; and updating display of the first chart.
- In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a first chart on the display. The first chart is derived from a set of data. The first chart concurrently displays a first set of categories and a label for the first set of categories. Each respective category in the first set of categories has a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories. The one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, replacing display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. The second chart is derived from the set of data. The second chart concurrently displays a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories. 
Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories.
- In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a chart on the display. The chart has a horizontal axis and a vertical axis. The horizontal axis includes first horizontal scale markers. The vertical axis includes first vertical scale markers. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. The one or more programs further include instructions for, while detecting the first touch input: horizontally expanding a portion of the chart such that a distance between first horizontal scale markers increases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same.
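The horizontal-only expansion described above (the distance between horizontal scale markers grows while the distance between vertical scale markers stays the same) amounts to scaling x-coordinates about a focal point while leaving y-coordinates untouched. A minimal illustrative sketch follows; anchoring the expansion at a focal x-position is an assumption, not taken from the embodiments.

```python
def expand_horizontally(points, focus_x, scale):
    """Scale x-coordinates about focus_x while leaving y untouched, so
    horizontal scale-marker spacing grows by `scale` and vertical
    spacing is maintained. Points are (x, y) pairs."""
    return [(focus_x + (x - focus_x) * scale, y) for x, y in points]
```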
- In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart and, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, zooming in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks. The one or more programs further include instructions for, while displaying the second portion of the chart at the second magnification, detecting a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart. 
The one or more programs further include instructions for, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart: in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooming in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks; and, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displaying information about the first data mark.
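The branch above — zoom further, or display information about the data mark — can be sketched as a small dispatcher. The claims leave the "predefined data-mark-information-display criteria" unspecified, so the maximum-magnification threshold used below is purely an illustrative assumption.

```python
def handle_second_touch(magnification, max_magnification, mark):
    """Dispatch a second touch input: if the (assumed) criteria are met,
    display information about the mark; otherwise zoom in further.
    Returns an (action, payload) pair for illustration."""
    criteria_met = magnification >= max_magnification  # illustrative criterion
    if criteria_met:
        return ("show_info", f"mark at {mark}")
    return ("zoom", magnification * 2)
```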
- In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a chart on the display, the chart including a plurality of data marks. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart, the first predefined area having a corresponding first value. The one or more programs further include instructions for, in response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart: selecting the first predefined area; and visually distinguishing the first predefined area. The one or more programs further include instructions for, while the first predefined area is selected, detecting a second touch input on the touch-sensitive surface. The one or more programs further include instructions for, in response to detecting the second touch input on the touch-sensitive surface: visually distinguishing a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area; and displaying a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas.
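Extending a selection across a sequence of adjacent predefined areas and displaying the change between the first value and the last selected value can be sketched as follows. This is a minimal illustrative sketch: areas are indexed bars with one value each, an assumption not drawn from the embodiments.

```python
def extend_selection(values, first_index, last_index):
    """Select the first predefined area plus the sequence of adjacent
    areas through last_index (in either direction), and report the
    change between the first area's value and the last selected value."""
    step = 1 if last_index >= first_index else -1
    selected = list(range(first_index, last_index + step, step))
    change = values[last_index] - values[first_index]
    return selected, change
```

Dragging left or right simply changes `last_index`, so the selected sequence and the displayed change update continuously during the second touch input.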
- In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for displaying a chart on the display. The chart has a horizontal axis with a first horizontal scale with first horizontal scale markers. The chart has a vertical axis with a first vertical scale with first vertical scale markers. The chart includes a first set of data marks. Each respective data mark in the first set of data marks has a respective abscissa and a respective ordinate. The chart includes a line that connects adjacent data marks in the first set of data marks. The one or more programs also include instructions for detecting a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart and, while detecting the first touch input: expanding at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input; expanding at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input; and adding a second set of data marks, distinct from the first set of data marks, on the line. Each respective data mark in the second set of data marks includes a respective abscissa and a respective ordinate. Each respective data mark in the second set of data marks is placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark.
The one or more programs further include instructions for, after adding the second set of data marks on the line: for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moving the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis; and animatedly adjusting the line so that the line connects the second set of data marks.
- In some embodiments, an electronic device for visualizing data includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for performing any of the methods described herein.
- In yet another aspect, some embodiments include a non-transitory computer readable storage medium, storing one or more programs for execution by one or more processors of an electronic device with a display and a touch-sensitive surface, the one or more programs including instructions for performing any of the methods described herein.
- In yet another aspect, some embodiments include a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory, the graphical user interface comprising user interfaces displayed in accordance with any of the methods described herein.
- Thus, electronic devices with displays and touch-sensitive surfaces are provided with faster, more efficient methods and interfaces for data visualization, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for data visualization.
- So that the present disclosure can be understood in greater detail, a more particular description may be had by reference to the features of various implementations, some of which are illustrated in the appended drawings. The appended drawings, however, merely illustrate the more pertinent features of the present disclosure and are therefore not to be considered limiting, for the description may admit to other effective features.
- FIG. 1 illustrates a portable multifunction device having a touch screen, in accordance with some embodiments.
- FIG. 2 illustrates a portable multifunction device having a touch-sensitive surface that is separate from the display, in accordance with some embodiments.
- FIG. 3A is a block diagram illustrating a portable multifunction device having a touch screen, in accordance with some embodiments.
- FIG. 3B is a block diagram illustrating a portable multifunction device having a touch-sensitive surface, in accordance with some embodiments.
- FIGS. 4A-4B illustrate user interfaces for initiating data visualization, in accordance with some embodiments.
- FIGS. 5A-5G illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIGS. 6A-6L illustrate user interfaces for changing chart categories, in accordance with some embodiments.
- FIGS. 7A-7D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIGS. 8A-8D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments.
- FIGS. 9A-9B illustrate user interfaces for changing chart views, in accordance with some embodiments.
- FIGS. 10A-10B illustrate user interfaces for adjusting a chart view, in accordance with some embodiments.
- FIGS. 11A-11J illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 12A-12D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 13A-13D illustrate user interfaces for selecting chart areas, in accordance with some embodiments.
- FIGS. 14A-14D illustrate user interfaces for exporting data visualizations, in accordance with some embodiments.
- FIGS. 15A-15C illustrate user interfaces for adjusting a chart view, in accordance with some embodiments.
- FIGS. 16A-16D illustrate user interfaces for changing chart categories, in accordance with some embodiments.
- FIGS. 17A-17B illustrate user interfaces for selecting chart areas, in accordance with some embodiments.
- FIGS. 18A-18E illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 19A-19D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments.
- FIGS. 19E-19L illustrate user interfaces for displaying information about a data mark, in accordance with some embodiments.
- FIGS. 20A-20D are flow diagrams illustrating a method of data visualization in accordance with some embodiments.
- FIGS. 21A-21F are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 22A-22B are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 23A-23B are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 24A-24E are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 25A-25D are flow diagrams illustrating another method of data visualization in accordance with some embodiments.
- FIGS. 26A-26F illustrate scrolling filters in accordance with some embodiments.
- In accordance with common practice, the various features illustrated in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may not depict all of the components of a given system or device. Finally, like reference numerals may be used to denote like features throughout the specification and figures.
- As portable electronic devices become more compact, and the number of functions performed by applications on any given device increases, it has become a significant challenge to design user interfaces that allow users to interact with the applications easily. This challenge is particularly significant for portable devices with smaller screens and/or limited input devices. In addition, data visualization applications need to provide user-friendly ways to explore data in order to enable a user to extract significant meaning from a particular data set. Some application designers have resorted to using complex menu systems to enable a user to perform desired functions. These conventional user interfaces often result in complicated key sequences and/or menu hierarchies that must be memorized by the user and/or that are otherwise cumbersome and/or not intuitive to use.
- The methods, devices, and GUIs described herein make manipulation of data sets and data visualizations more efficient and intuitive for a user. A number of different intuitive user interfaces for data visualizations are described below. For example, applying a filter to a data set can be accomplished by a simple touch input on a given portion of a displayed chart rather than via a nested menu system. Additionally, switching between chart categories can be accomplished by a simple touch input on a displayed chart label.
- FIGS. 20A-20D are flow diagrams illustrating a method of adjusting chart filters. FIGS. 5A-5G, 7A-7D, and 8A-8D illustrate user interfaces for adjusting chart filters. The user interfaces in FIGS. 5A-5G, 7A-7D, and 8A-8D are used to illustrate the processes in FIGS. 20A-20D.
- FIGS. 21A-21F are flow diagrams illustrating a method of changing chart categories. FIGS. 6A-6L illustrate user interfaces for changing chart categories. The user interfaces in FIGS. 6A-6L are used to illustrate the processes in FIGS. 21A-21F.
- FIGS. 22A-22B are flow diagrams illustrating a method of adjusting chart magnification. FIGS. 11A-11J and 12A-12D illustrate user interfaces for adjusting chart magnification. FIGS. 15A-15C illustrate user interfaces for adjusting chart views. The user interfaces in FIGS. 11A-11J, 12A-12D, and 15A-15C are used to illustrate the processes in FIGS. 22A-22B.
- FIGS. 23A-23B are flow diagrams illustrating a method of displaying information about a data mark. FIGS. 19E-19L illustrate user interfaces for displaying information about a data mark. The user interfaces in FIGS. 19E-19L are used to illustrate the processes in FIGS. 23A-23B.
- FIGS. 24A-24E are flow diagrams illustrating a method of chart selection. FIGS. 13A-13D illustrate user interfaces for selecting chart areas. FIGS. 14A-14D illustrate user interfaces for exporting data visualizations. FIGS. 18A-18E illustrate user interfaces for adjusting chart magnification. The user interfaces in FIGS. 13A-13D, 14A-14D, and 18A-18E are used to illustrate the processes in FIGS. 24A-24E.
- FIGS. 25A-25D are flow diagrams illustrating a method of updating chart views. FIGS. 19A-19D illustrate user interfaces for adjusting chart magnification. The user interfaces in FIGS. 19A-19D are used to illustrate the processes in FIGS. 25A-25D.
- Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the various described embodiments. However, it will be apparent to one of ordinary skill in the art that the various described embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
- Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Embodiments of electronic devices and user interfaces for such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), are, optionally, used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, a microphone, and/or a joystick.
-
FIG. 1 illustrates portable multifunction device 100 having touch screen 102, in accordance with some embodiments. In some embodiments, device 100 is a mobile phone, a laptop computer, a personal digital assistant (PDA), or a tablet computer. Touch screen 102 is also sometimes called a touch-sensitive display and/or a touch-sensitive display system. Touch screen 102 optionally displays one or more graphics within a user interface (UI). In some embodiments, a user is enabled to select one or more of the graphics by making a touch input (e.g., touch input 108) on the graphics. In some instances, the touch input is a contact on the touch screen. In some instances, the touch input is a gesture that includes a contact and movement of the contact on the touch screen. In some instances, the gesture optionally includes one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with device 100. For example, a touch input on the graphics is optionally made with one or more fingers 110 (not drawn to scale in the figure) or one or more styluses 112 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some circumstances, inadvertent contact with a graphic does not select the graphic. For example, a swipe gesture that sweeps over a visual mark optionally does not select the visual mark when the gesture corresponding to selection is a tap. Device 100 optionally also includes one or more physical buttons and/or other input/output devices, such as a microphone for verbal inputs. -
FIG. 2 illustrates multifunction device 200 in accordance with some embodiments. Device 200 need not be portable. In some embodiments, device 200 is a laptop computer, a desktop computer, a tablet computer, or an educational device. Device 200 includes screen 202 and touch-sensitive surface 204. Screen 202 optionally displays one or more graphics within a UI. In some embodiments, a user is enabled to select one or more of the graphics by making a touch input (e.g., touch input 210) on touch-sensitive surface 204 such that a corresponding cursor (e.g., cursor 212) on screen 202 selects the one or more graphics. For example, when an input is detected on touch-sensitive surface 204 while cursor 212 is over a particular user interface element (e.g., a button, window, slider or other user interface element), the particular user interface element is adjusted in accordance with the detected input. -
FIG. 3A is a block diagram illustrating portable multifunction device 100, in accordance with some embodiments. It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in FIG. 3A are implemented in hardware, software, firmware, or a combination of hardware, software, and/or firmware, including one or more signal processing and/or application specific integrated circuits. -
Device 100 includes one or more processing units (CPUs) 302, input/output (I/O) subsystem 306, memory 308 (which optionally includes one or more computer readable storage mediums), and network communications interface 310. These components optionally communicate over one or more communication buses or signal lines 304. Communication buses 304 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. -
Memory 308 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 308 optionally includes one or more storage devices remotely located from processor(s) 302. Memory 308, or alternately the non-volatile memory device(s) within memory 308, comprises a non-transitory computer readable storage medium. - In some embodiments, the software components stored in
memory 308 include operating system 318, communication module 320, input/output (I/O) module 322, and applications 328. In some embodiments, one or more of the various modules comprises a set of instructions in memory 308. In some embodiments, memory 308 stores one or more data sets in one or more database(s) 332. - Operating system 318 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware, software, and/or firmware components.
-
Communication module 320 facilitates communication with other devices over one or more external ports and also includes various software components for handling data received from other devices. - I/
O module 322 includes touch input sub-module 324 and graphics sub-module 326. Touch input sub-module 324 optionally detects touch inputs with touch screen 102 and other touch-sensitive devices (e.g., a touchpad or physical click wheel). Touch input sub-module 324 includes various software components for performing various operations related to detection of a touch input, such as determining if contact has occurred (e.g., detecting a finger-down event), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Touch input sub-module 324 receives contact data from the touch-sensitive surface (e.g., touch screen 102). These operations are, optionally, applied to single touch inputs (e.g., one finger contacts) or to multiple simultaneous touch inputs (e.g., “multitouch”/multiple finger contacts). In some embodiments, touch input sub-module 324 detects contact on a touchpad. - Touch input sub-module 324 optionally detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, a gesture is, optionally, detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of a data mark). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.
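The contact-pattern approach to gesture detection described above can be sketched as a small classifier over finger-down/finger-up events. This is a minimal illustrative sketch, not the disclosed touch input sub-module: the event-tuple format and the distance threshold for "substantially the same position" are assumptions.

```python
import math

def classify_gesture(events, tap_radius=10.0):
    """Classify a touch-event sequence as 'tap' or 'swipe' from its
    contact pattern: a finger-down followed by a finger-up at (nearly)
    the same position is a tap; lift-off beyond tap_radius from the
    finger-down position makes it a swipe. Events are (kind, x, y)."""
    down = next(e for e in events if e[0] == "down")
    up = next(e for e in events if e[0] == "up")
    dist = math.hypot(up[1] - down[1], up[2] - down[2])
    return "tap" if dist <= tap_radius else "swipe"
```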
- Graphics sub-module 326 includes various known software components for rendering and displaying graphics on
touch screen 102 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation data visualizations, icons (such as user-interface objects including soft keys), text, digital images, animations and the like. In some embodiments, graphics sub-module 326 stores data representing graphics to be used. Each graphic is, optionally, assigned a corresponding code. Graphics sub-module 326 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to the display or touch screen. -
Applications 328 optionally include data visualization module 330 for displaying graphical views of data and one or more other applications. Examples of other applications that are, optionally, stored in memory 308 include word processing applications, email applications, and presentation applications. - In conjunction with I/
O interface 306, including touch screen 102, CPU(s) 302, and/or database(s) 332, data visualization module 330 includes executable instructions for displaying and manipulating various graphical views of data. - Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments,
memory 308 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 308 optionally stores additional modules and data structures not described above. -
FIG. 3B is a block diagram illustrating multifunction device 200, in accordance with some embodiments. It should be appreciated that device 200 is only one example of a multifunction device, and that device 200 optionally has more or fewer components than shown, optionally combines two or more components, or optionally has a different configuration or arrangement of the components. The various components shown in FIG. 3B are implemented in hardware, software, firmware, or a combination of hardware, software, and/or firmware, including one or more signal processing and/or application specific integrated circuits. -
Device 200 typically includes one or more processing units/cores (CPUs) 352, one or more network or other communications interfaces 362, memory 350, I/O interface 356, and one or more communication buses 354 for interconnecting these components. Communication buses 354 optionally include circuitry (sometimes called a chipset) that interconnects and controls communications between system components. - I/
O interface 356 comprises screen 202 (also sometimes called a display), touch-sensitive surface 204, and one or more sensor(s) 360 (e.g., optical, acceleration, proximity, and/or touch-sensitive sensors). I/O interface 356 optionally includes a keyboard and/or mouse (or other pointing device) 358. I/O interface 356 couples input/output peripherals on device 200, such as screen 202, touch-sensitive surface 204, other input devices 358, and one or more sensor(s) 360, to CPU(s) 352 and/or memory 350. -
Screen 202 provides an output interface between the device and a user. Screen 202 displays visual output to the user. The visual output optionally includes graphics, text, icons, data marks, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output corresponds to user-interface objects. Screen 202 optionally uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other embodiments. - In addition to the touch screen,
device 200 includes touch-sensitive surface 204 (e.g., a touchpad) for detecting touch inputs. Touch-sensitive surface 204 accepts input from the user via touch inputs (e.g., touch input 210 in FIG. 2). Touch-sensitive surface 204 (along with any associated modules and/or sets of instructions in memory 350) detects touch inputs and converts the detected inputs into interaction with user-interface objects (e.g., one or more icons, data marks, or images) that are displayed on screen 202. In an exemplary embodiment, a point of contact between touch-sensitive surface 204 and the user corresponds to a finger of the user. -
Memory 350 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and optionally includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 350 optionally includes one or more storage devices remotely located from CPU(s) 352. In some embodiments, the software components stored in memory 350 include operating system 364, communication module 366, input/output (I/O) module 368, and applications 374. In some embodiments, one or more of the various modules comprises a set of instructions in memory 350. In some embodiments, memory 350 stores one or more data sets in one or more database(s) 378. In some embodiments, I/O module 368 includes touch input sub-module 370 and graphics sub-module 372. In some embodiments, applications 374 include data visualization module 376. - In some embodiments,
memory 350 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 308 of portable multifunction device 100 (FIG. 3A), or a subset thereof. Furthermore, memory 350 optionally stores additional programs, modules, and data structures not present in memory 308 of portable multifunction device 100. For example, memory 350 of device 200 optionally stores drawing, presentation, and word processing applications, while memory 308 of portable multifunction device 100 (FIG. 3A) optionally does not store these modules. -
Device 200 also includes a power system for powering the various components. The power system optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management, and distribution of power in portable devices. - Each of the above identified elements in
FIG. 3B is, optionally, stored in one or more of the previously mentioned memory devices. Each of the above identified modules corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various embodiments. In some embodiments, memory 350 optionally stores a subset of the modules and data structures identified above. Furthermore, memory 350 optionally stores additional modules and data structures not described above. - Attention is now directed towards embodiments of user interfaces (“UI”) that are, optionally, implemented on
portable multifunction device 100 or device 200. The following examples are shown utilizing a touch screen (e.g., touch screen 102 in FIG. 1). However, it should be understood that, in some embodiments, the inputs (e.g., finger contacts) are detected on a touch-sensitive surface on a device that is distinct from a display on the device. In addition, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some embodiments, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously. -
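The touch-to-mouse substitutions described above amount to a simple mapping from gesture types to mouse event sequences. The sketch below is illustrative only; the event names and the function are assumptions for exposition, not part of the disclosed embodiments:

```python
# Illustrative sketch: substituting mouse input for finger input, as
# described above (a click stands in for a contact, cursor movement for
# contact movement). Event names are hypothetical.
def mouse_equivalent(touch_gesture: str) -> list[str]:
    """Return the sequence of mouse events that substitutes for a touch gesture."""
    table = {
        # swipe: click (contact) + cursor movement along the swipe path + release
        "swipe": ["button_down", "cursor_move_along_path", "button_up"],
        # tap: click at the tap location (contact, then ceasing to detect it)
        "tap": ["button_down", "button_up"],
    }
    if touch_gesture not in table:
        raise ValueError(f"no mouse substitute defined for {touch_gesture!r}")
    return table[touch_gesture]
```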
FIGS. 4A-4B illustrate user interfaces for initiating data visualization, in accordance with some embodiments. FIG. 4A shows UI 402 including an email application. The email application contains an email including attached file 402. FIG. 4A also shows contact 410 over the icon corresponding to file 402. FIG. 4B shows UI 450 including a data visualization application. The data visualization application includes a graphical view of data from file 402. The graphical view includes chart 404 (e.g., a bar chart) with chart label 406, a plurality of categories, and a plurality of category labels 408. In some embodiments, file 402 has a file type associated with the data visualization application and, in response to detecting contact 410, the data visualization application is initialized and the data from file 402 is displayed in a graphical view. -
FIGS. 5A-5G illustrate user interfaces for adjusting chart filters, in accordance with some embodiments. FIG. 5A shows UI 450 including category 502-1 and category label 408-1. FIG. 5A also shows contact 510 detected at position 510-a corresponding to the visual mark (e.g., a bar corresponding to category 502-1) for category 502-1. FIG. 5B shows UI 520 including contact 510 detected at position 510-b and the visual mark for category 502-1 moving in concert with movement of contact 510 via an animated transition. FIG. 5C shows UI 522 including contact 510 detected at position 510-c and the visual mark for category 502-1 continuing to move in concert with movement of contact 510 via an animated transition. FIG. 5C also shows indicium 504 indicating that category 502-1 (Catering) is being filtered out of the data as a result of the current action (e.g., the movement of contact 510). FIG. 5D shows UI 524 including indicium 504, contact 510 detected at position 510-d, and the visual mark for category 502-1 continuing to move in concert with movement of contact 510 via an animated transition. FIG. 5E shows UI 526 including indicium 504 and the removal of the visual mark for category 502-1 from the chart. -
FIG. 5F shows UI 528 including indicium 504, contact 510 detected at position 510-e, and the visual mark for category 502-1 continuing to move in concert with movement of contact 510 via an animated transition. FIG. 5G shows UI 530 including indicium 506, contact 510 detected at position 510-f, and the visual mark for category 502-1 continuing to move in concert with movement of contact 510 via an animated transition. In some embodiments, as shown in FIGS. 5A-5E, the first category and the first visual mark are removed from the chart via an animated transition in response to contact 510 moving to a pre-defined location. In some embodiments, as shown in FIGS. 5F-5G, the first category and the first visual mark are added back to the chart via an animated transition in response to contact 510 moving away from the pre-defined location. -
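The drag-to-filter behavior of FIGS. 5A-5G can be modeled as a function of whether the contact has reached the pre-defined location. The following Python sketch is an illustrative assumption (the threshold geometry and all names are hypothetical, not the patent's implementation):

```python
# Hypothetical sketch of FIGS. 5A-5G: dragging a category's visual mark to a
# pre-defined location filters the category out; moving away restores it.
def apply_drag_filter(categories, dragged, drag_x, predefined_x=100):
    """Return (visible_categories, filtered_out) after a horizontal drag.

    categories: ordered list of category names currently in the chart.
    dragged: the category whose visual mark is being dragged.
    drag_x: final x coordinate of the contact.
    predefined_x: x threshold of the pre-defined location (assumed left edge).
    """
    if drag_x <= predefined_x:  # contact reached the pre-defined location
        visible = [c for c in categories if c != dragged]
        return visible, [dragged]
    return list(categories), []  # moved away: category stays in the chart
```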
FIGS. 6A-6L illustrate user interfaces for changing chart categories, in accordance with some embodiments. FIG. 6A shows UI 601 including a chart with chart label 602-1 (Menu Item) and categories 502 (including categories 502-1 through 502-13) with category labels 408. FIG. 6A also shows contact 610 detected at position 610-a corresponding to chart label 602-1. FIGS. 6B and 6C show contact 610 moving to positions 610-b and 610-c respectively and the first chart with chart label 602-1 (Menu Item) being replaced by a second chart with chart label 602-2 (Menu Group) via an animated transition. FIGS. 6B and 6C also show chart categories 502 being replaced by categories 604 via an animated transition, and category labels 408 being replaced by category labels 606 via an animated transition. -
FIG. 6D shows UI 607 including the second chart with chart label 602-2 (Menu Group) and categories 604 with category labels 606. FIG. 6D also shows contact 620 detected at position 620-a corresponding to chart label 602-2. FIGS. 6E and 6F show contact 620 moving to positions 620-b and 620-c respectively and the second chart with chart label 602-2 (Menu Group) being replaced by a third chart with chart label 602-3 (Day) via an animated transition. FIGS. 6E and 6F also show chart categories 604 being replaced by categories 612 via an animated transition, and category labels 606 being replaced by category labels 614 via an animated transition. -
FIG. 6G shows UI 613 including the third chart with chart label 602-3 (Day) and categories 612 with category labels 614. FIG. 6G also shows contact 630 detected at a position corresponding to chart label 602-3 and selection menu 616 displayed. In some embodiments, contact 630 is detected and identified as a tap input and selection menu 616 is displayed in response. FIG. 6H shows UI 615 with selection menu 616 including selection categories 618. FIG. 6H also shows contact 640 detected at a position corresponding to selection category 618-2. FIG. 6I shows UI 617 including a fourth chart with chart label 602-4 (Hour) and categories 622 with category labels 624. In some embodiments, the chart shown in FIG. 6I replaces the chart shown in FIG. 6H in response to the detection of contact 640 at a position corresponding to selection category 618-2. -
FIG. 6J shows UI 619 including the fourth chart with chart label 602-4 (Hour) and categories 622 with category labels 624. FIG. 6J also shows contact 650 detected at position 650-a corresponding to chart label 602-4. FIGS. 6K and 6L show contact 650 moving to positions 650-b and 650-c respectively and the fourth chart with chart label 602-4 (Hour) being replaced by the first chart with chart label 602-1 (Menu Item) via an animated transition. FIGS. 6K and 6L also show chart categories 622 being replaced by categories 502 via an animated transition, and category labels 624 being replaced by category labels 408 via an animated transition. -
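The label-drag behavior of FIGS. 6A-6L steps through an ordered list of category fields and wraps around (Hour back to Menu Item). A minimal sketch follows, with the field order taken from the figures and everything else assumed:

```python
# Illustrative sketch of FIGS. 6A-6L: dragging the chart label steps through
# an ordered list of category fields, wrapping at the end. The field order is
# taken from the figures; the function is an assumption.
FIELDS = ["Menu Item", "Menu Group", "Day", "Hour"]

def next_field(current: str, step: int = 1) -> str:
    """Return the field shown after dragging the chart label `step` positions."""
    i = FIELDS.index(current)
    return FIELDS[(i + step) % len(FIELDS)]
```

A tap on the label could instead open a selection menu (as in FIGS. 6G-6H), letting the user jump directly to any field rather than stepping through them.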
FIGS. 7A-7D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments. FIG. 7A shows UI 701 including a chart with categories 612 (including category 612-4) and corresponding category labels 614. FIG. 7A also shows indicium 504 indicating that data corresponding to category 502-1 has been filtered out of the chart. FIG. 7A further shows contact 710 detected at position 710-a corresponding to indicium 504. FIGS. 7B and 7C show contact 710 moving to positions 710-b and 710-c respectively and the removal of indicium 504 along with the chart updating to reflect inclusion of data that corresponds to category 502-1. FIGS. 7B and 7C also show categories 612 reordered to reflect inclusion of the data corresponding to category 502-1. -
FIG. 7D shows UI 707 including indicium 504, contact 720 detected at a position corresponding to indicium 504, and categories 612. In some embodiments, as shown in FIGS. 7A-7C, the chart is updated to reflect inclusion of data that corresponds to category 502-1 in response to contact 710 moving from a pre-defined location or area on the UI. In some embodiments, as shown in FIG. 7D, the chart is updated to reflect exclusion of data that corresponds to category 502-1 in response to contact 720. -
FIGS. 8A-8D illustrate user interfaces for adjusting chart filters, in accordance with some embodiments. FIG. 8A shows UI 801 including category 502-2 and category label 408-2. FIG. 8A also shows contact 810 detected at position 810-a corresponding to the visual mark for category 502-2 (e.g., a bar corresponding to category 502-2). FIGS. 8B and 8C show contact 810 moving to positions 810-b and 810-c respectively and the visual mark for category 502-2 moving in concert with movement of contact 810 via an animated transition. FIGS. 8B and 8C also show indicium 802 indicating that only the data corresponding to category 502-2 is being included as a result of the current action (e.g., the movement of contact 810). FIG. 8D shows UI 805 including indicium 802 and the removal of the visual marks for all categories except for category 502-2. -
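The gesture in FIGS. 8A-8D acts as a "keep only" filter: every category other than the dragged one is removed from the chart. An illustrative sketch (function and names are assumptions, not the disclosed implementation):

```python
# Hypothetical sketch of FIGS. 8A-8D: a drag on one category's visual mark
# keeps only that category and removes the visual marks of all others.
def keep_only(categories, kept):
    """Return the categories remaining after a keep-only gesture on `kept`."""
    if kept not in categories:
        raise ValueError(f"{kept!r} is not in the chart")
    return [c for c in categories if c == kept]
```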
FIGS. 9A-9B illustrate user interfaces for changing chart views, in accordance with some embodiments. FIG. 9A shows UI 901 including indicium 802 and a bar chart with category 502-2. FIG. 9A also shows contact 910 detected at a position on UI 901 that corresponds to a line chart graphical view. FIG. 9B shows UI 903 including indicium 802 and a line chart. In some embodiments, the bar chart shown in FIG. 9A is replaced by the line chart shown in FIG. 9B in response to detection of contact 910 at a position on UI 901 that corresponds to a line chart graphical view. -
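Switching graphical views in FIGS. 9A-9B changes the chart type while leaving the active filter (indicium 802) in place. A hypothetical sketch of that state change (the state dictionary and view names are assumptions):

```python
# Illustrative sketch of FIGS. 9A-9B: changing the graphical view swaps the
# chart type and preserves the rest of the chart state, such as filters.
def switch_view(chart_state, new_view):
    """Return a new chart state with the view changed and filters preserved."""
    allowed = {"bar", "line"}
    if new_view not in allowed:
        raise ValueError(f"unsupported view: {new_view!r}")
    return {**chart_state, "view": new_view}
```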
FIGS. 10A-10B illustrate user interfaces for adjusting a chart view, in accordance with some embodiments. FIG. 10A shows UI 1001 including a chart. FIG. 10A also shows contact 1010 detected at position 1010-a on UI 1001. FIG. 10B shows contact 1010 at position 1010-b and movement of the chart in concert with movement of contact 1010. -
FIGS. 11A-11J illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments. FIG. 11A shows UI 1101 including a chart at a first magnification (e.g., a first zoom level). FIG. 11A also shows contacts 1110 and 1120 detected on the touch screen. FIG. 11B shows contacts 1110 and 1120 and the chart at a second magnification. The positions of contacts 1110 and 1120 in FIG. 11B are further apart than the positions of contacts 1110 and 1120 in FIG. 11A and represent a de-pinch gesture on the touch screen. The second magnification of the chart shown in FIG. 11B includes the same vertical scale as the first magnification of the chart shown in FIG. 11A. FIGS. 11C and 11D show an animated transition of the chart to a third magnification. The animated transition shown in FIGS. 11C and 11D includes an increase in the vertical scale of the chart. In some embodiments, the animated transition shown in FIGS. 11C and 11D is in response to ceasing to detect contacts 1110 and 1120 (e.g., detecting lift off of the contacts). -
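The rescaling in FIGS. 11C-11D can be thought of as fitting the vertical scale to the data still visible after the zoom. The sketch below assumes a linear value-to-pixel mapping and is illustrative only:

```python
# Illustrative sketch of the rescale step in FIGS. 11C-11D: after the
# contacts lift off, the vertical scale is recomputed so the largest value
# still visible fills the plot height. A linear mapping is assumed.
def fit_vertical_scale(visible_values, plot_height_px):
    """Return pixels-per-unit so the largest visible value spans the plot."""
    top = max(visible_values)
    if top <= 0:
        return 1.0  # degenerate data: keep a neutral scale
    return plot_height_px / top
```

An animated transition would then interpolate from the old pixels-per-unit value to this new one over several frames rather than jumping directly.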
FIG. 11E shows UI 1109 including the chart at a fourth magnification. FIG. 11E also shows contacts detected on the touch screen. FIG. 11F shows the contacts at updated positions. -
FIG. 11G shows UI 1113 including the chart at a sixth magnification. FIG. 11G also shows contacts detected on the touch screen. FIG. 11H shows the contacts at updated positions. -
FIG. 11I shows UI 1117 including the chart at an eighth magnification. FIG. 11I also shows contacts detected on the touch screen. FIG. 11J shows the contacts at updated positions. -
FIGS. 12A-12D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments. FIG. 12A shows UI 1201 including a chart at an initial magnification. FIG. 12A also shows two contacts detected on the touch screen. FIG. 12B shows the contacts at updated positions. The positions of the contacts in FIG. 12B are further apart than the positions of the contacts in FIG. 12A and represent a de-pinch gesture on the touch screen. FIG. 12C shows the contacts at updated positions. The positions of the contacts in FIG. 12C are closer together than the positions of the contacts in FIG. 12B and represent a pinch gesture on the touch screen. FIG. 12D shows the contacts at updated positions. The positions of the contacts in FIG. 12D are closer together than the positions of the contacts in FIG. 12A and represent a pinch gesture on the touch screen. -
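A pinch or de-pinch can be quantified as the ratio of the distance between the two contacts now to their distance at the start of the gesture (greater than 1 for a de-pinch/zoom in, less than 1 for a pinch/zoom out). An illustrative sketch, not taken from the disclosure:

```python
# Illustrative sketch: the zoom factor for a two-contact gesture is the
# ratio of current to initial distance between the contacts.
import math

def zoom_factor(p1_start, p2_start, p1_now, p2_now):
    """Return the magnification change implied by two moving contacts."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_now, p2_now)
    if d0 == 0:
        raise ValueError("contacts cannot start at the same point")
    return d1 / d0  # > 1: de-pinch (zoom in); < 1: pinch (zoom out)
```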
FIGS. 13A-13D illustrate user interfaces for selecting chart areas, in accordance with some embodiments. FIG. 13A shows UI 1301 including a chart with selected portion 1302 and information regarding selected portion 1302. For example, FIG. 13A shows information regarding the number of records in selected portion 1302. FIG. 13A also shows contact 1310 detected at position 1310-a corresponding to selected portion 1302. FIG. 13B shows UI 1303 and contact 1310 at position 1310-b and the chart with selected portion 1304 corresponding to the movement of contact 1310. FIG. 13B also shows the chart including information regarding selected portion 1304 (e.g., information showing a difference between selected portion 1302 and selected portion 1304). FIG. 13C shows UI 1305 and contact 1310 at position 1310-c and the chart with selected portion 1306 corresponding to the continued movement of contact 1310. FIG. 13C also shows the chart including information regarding selected portion 1306. FIG. 13D shows UI 1307 and contact 1310 at position 1310-d and the chart with selected portion 1308 corresponding to the continued movement of contact 1310. FIG. 13D also shows the chart including information regarding selected portion 1308. -
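The readout in FIGS. 13A-13D reports the number of records inside the selected range and how it changes as the contact moves. A hypothetical sketch of that computation (all names are assumptions):

```python
# Illustrative sketch of FIGS. 13A-13D: as the contact drags the selection
# boundary, report the record count in the selected x-range and the change
# from the previous selection.
def selection_info(record_xs, lo, hi, prev_count=None):
    """Return (count, delta) of records with lo <= x <= hi."""
    count = sum(1 for x in record_xs if lo <= x <= hi)
    delta = None if prev_count is None else count - prev_count
    return count, delta
```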
FIGS. 14A-14D illustrate user interfaces for exporting data visualizations, in accordance with some embodiments. FIG. 14A shows UI 1401 including a chart with selected portion 1308. FIG. 14B shows UI 1403 including the chart with selected portion 1308 and selection menu 1402. FIG. 14B also shows contact 1410 detected at a position corresponding to an icon for selection menu 1402. In some embodiments, selection menu 1402 is shown in response to contact 1410 being detected over the icon for selection menu 1402. FIG. 14C shows UI 1405 including the chart with selected portion 1308 and selection menu 1402. FIG. 14C also shows contact 1420 detected at a position corresponding to a menu option (Email Image) in selection menu 1402. FIG. 14D shows UI 1407 with an email that includes information from the chart. In some embodiments, UI 1407 in FIG. 14D is shown in response to detecting contact 1420 at a position corresponding to the Email Image menu option in selection menu 1402. -
FIGS. 15A-15C illustrate user interfaces for adjusting a chart view, in accordance with some embodiments. FIG. 15A shows UI 1501 including a chart. FIG. 15A also shows contact 1510 detected at position 1510-a on UI 1501. FIG. 15B shows UI 1503 and contact 1510 at position 1510-b. FIG. 15B also shows movement of the chart in concert with movement of contact 1510. For example, FIG. 15B shows both contact 1510 and the chart moving to the right from their respective positions in FIG. 15A. FIG. 15C shows UI 1505 and contact 1510 at position 1510-c. FIG. 15C also shows movement of the chart in concert with movement of contact 1510. For example, FIG. 15C shows both contact 1510 and the chart moving to the left from their respective positions in FIG. 15B. -
FIGS. 16A-16D illustrate user interfaces for changing chart categories, in accordance with some embodiments. FIG. 16A shows UI 1601 including a chart with chart label 1602-1 (Average). FIG. 16A also shows contact 1610 detected at a position corresponding to chart label 1602-1. FIG. 16B shows UI 1603 including a chart with chart label 1602-2 (Percentile Bands). In some embodiments, the chart shown in FIG. 16B replaces the chart shown in FIG. 16A in response to the detection of contact 1610 at a position on the chart label. -
FIG. 16C shows UI 1605 including a chart with chart label 1602-2 (Percentile Bands). FIG. 16C also shows contact 1620 detected at a position corresponding to chart label 1602-2. FIG. 16D shows UI 1607 including a chart with chart label 1602-3 (Summary). In some embodiments, the chart shown in FIG. 16D replaces the chart shown in FIG. 16C in response to the detection of contact 1620 at a position on the chart label. -
FIGS. 17A-17B illustrate user interfaces for selecting chart areas, in accordance with some embodiments. FIG. 17A shows UI 1701 including a chart. FIG. 17A also shows contact 1710 detected at a position corresponding to a portion of the chart. FIG. 17B shows UI 1703 including a chart with selected portion 1702 and information regarding selected portion 1702. For example, FIG. 17B shows information regarding the number of records in selected portion 1702. FIG. 17B also shows contact 1720 detected at a position corresponding to selected portion 1702. In some embodiments, selected portion 1702 is selected in response to detecting contact 1720. In some embodiments, contact 1710 detected in FIG. 17A represents a first type of touch input (e.g., a swipe gesture) and contact 1720 detected in FIG. 17B represents a second type of touch input (e.g., a tap gesture). -
FIGS. 18A-18E illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments. FIG. 18A shows UI 1801 including a chart with selected portion 1802 at an initial magnification. FIG. 18A also shows two contacts detected on the touch screen. FIG. 18B shows the contacts at updated positions. The positions of the contacts in FIG. 18B are further apart than the positions of the contacts in FIG. 18A and represent a de-pinch gesture on the touch screen. FIG. 18C shows the contacts at updated positions. -
FIG. 18D shows UI 1807 including the chart at a fourth magnification. FIG. 18D also shows contacts detected on the touch screen. FIG. 18E shows the contacts at updated positions. -
FIGS. 19A-19D illustrate user interfaces for adjusting chart magnification, in accordance with some embodiments. FIG. 19A shows UI 1901 including a chart at an initial magnification. FIG. 19A also shows the chart including data marks 1902 (e.g., data marks 1902-1 through 1902-5). FIG. 19B shows UI 1903 including the chart at a second magnification (e.g., zoomed in from the initial magnification). FIG. 19B also shows the chart including data marks 1902 (e.g., a subset of data marks 1902 shown in FIG. 19A) and data marks 1904. FIG. 19C shows UI 1905 including the chart at a third magnification (e.g., zoomed in from the second magnification). FIG. 19C also shows the chart including data marks 1902 and data marks 1904. In some embodiments, data marks 1904 are initially placed on the line connecting data marks 1902 as shown in FIG. 19B and are animatedly moved (e.g., using continuous motion rather than a jump) to their respective ordinates as shown in FIG. 19C. FIG. 19D shows UI 1907 including the chart with data marks 1902 and data marks 1904 at a fourth magnification (e.g., zoomed in from the third magnification). -
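In FIGS. 19B-19C the newly revealed data marks 1904 first appear on the line connecting the existing marks 1902, which corresponds to linear interpolation between neighboring marks. A sketch of that initial placement, under the assumption of straight-line segments (names are illustrative):

```python
# Illustrative sketch of FIGS. 19B-19C: a newly revealed data mark starts on
# the straight line joining its two neighboring existing marks, and is then
# animated to its true ordinate.
def initial_position(x, left_mark, right_mark):
    """Interpolate a new mark's starting y on the line joining two marks.

    left_mark and right_mark are (x, y) tuples with left_mark[0] < right_mark[0].
    """
    (x0, y0), (x1, y1) = left_mark, right_mark
    t = (x - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```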
FIGS. 19E-19L illustrate user interfaces for displaying information about a data mark, in accordance with some embodiments. FIG. 19E shows UI 1909 including a chart at an initial magnification. FIG. 19E also shows the chart including data marks 1908 (e.g., including data marks 1908-1 and 1908-2). FIG. 19E also shows contacts 1930 and 1940 detected on the touch screen. FIGS. 19F-19I show an animated transition from data mark 1908-1 to record 1914-1 in concert with movement of contacts 1930 and 1940. FIG. 19F shows UI 1911 including contacts 1930 and 1940 and the animated transition in progress. FIG. 19G shows UI 1913 including contacts 1930 and 1940 and the animated transition continuing. FIG. 19H shows UI 1915 including contacts 1930 and 1940 and the animated transition continuing. FIG. 19I shows UI 1917 including record 1914-1. In some embodiments, UI 1917 shown in FIG. 19I is displayed in response to ceasing to detect contacts 1930 and 1940. -
FIG. 19J shows UI 1919 including a chart at an initial magnification. FIG. 19J also shows the chart including data marks 1908 (e.g., including data marks 1908-1 and 1908-2). FIGS. 19K and 19L show an animated transition from data mark 1908-2 to record 1914-2. In some embodiments, the animated transition from data mark 1908-2 to record 1914-2 is in concert with a touch input (e.g., a de-pinch gesture). - Attention is now directed towards methods that are, optionally, implemented on
portable multifunction device 100 or device 200. -
FIGS. 20A-20D are flow diagrams illustrating method 2000 of data visualization, in accordance with some embodiments. Method 2000 is performed at an electronic device (e.g., portable multifunction device 100, FIG. 1, or device 200, FIG. 2) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. In some embodiments, method 2000 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B. Some operations in method 2000 are, optionally, combined and/or the order of some operations is, optionally, changed. - As described below,
method 2000 provides an intuitive way to change filtering. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when applying and/or removing filters, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust filters faster and more efficiently conserves power and increases the time between battery charges. - The device displays (2002) a first chart on the display. For example,
FIG. 5A shows UI 450 including a bar chart. - The first chart concurrently displays (2004) a first set of categories. For example, the bar chart in
FIG. 5A includes categories 502. - Each respective category in the first set of categories has (2006) a corresponding visual mark (e.g., a picture, drawing, or other graphic) displayed in the first chart. For example, a respective category in a bar chart has a corresponding bar that represents a value for that respective category, a respective category in a pie chart has a corresponding slice of the pie chart that represents a value for that respective category, etcetera. For example, the bar chart in
FIG. 5A includes categories 502 and a bar (e.g., a visual mark) corresponding to each category. - The device detects (2008) a first touch input (e.g., a swipe gesture or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of a first visual mark for a first category in the first chart. For example,
FIGS. 5A-5D show contact 510 detected at positions 510-a, 510-b, 510-c, and 510-d respectively. - In some embodiments, the first touch input is (2010) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture). For example, the movement of
contact 510 shown in FIGS. 5A-5D represents a swipe gesture toward the left side of the screen. - In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device removes (2014) the first category and the first visual mark from the first chart via an animated transition, where the first visual mark moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. For example,
FIGS. 5A-5E show an animated transition where the device removes category 502-1 and the visual mark corresponding to category 502-1 in concert with movement of contact 510. - In response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device updates (2016) display of the first chart. For example, repositioning the remaining categories in the first set and their corresponding visual marks (e.g., graphics) in the first chart. Thus, data that corresponds to the first category is filtered out of the first chart. This process may be repeated to remove additional categories in the first set of categories from the first chart. In some embodiments, the contact is a stylus contact. For example,
FIGS. 5A-5E show the chart being updated in response to detecting contact 510 and the updating including repositioning the remaining categories. - In some embodiments, in response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device ceases (2018) to display the first visual mark. In some embodiments, the first visual mark remains displayed while the finger contact in the first touch input remains in continuous contact with the touch-sensitive surface, and the first visual mark ceases to be displayed (e.g., fades out) in response to detecting lift off of the finger contact in the first touch input from the touch-sensitive surface. For example,
FIGS. 5A-5D show an animated transition where the visual mark corresponding to category 502-1 fades out and moves in concert with movement of contact 510, and FIG. 5E shows the first visual mark ceasing to be displayed. - In some embodiments, while displaying (2020) the first chart on the display, the device detects a fourth touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of a second visual mark for a second category in the first chart. For example,
FIG. 8A shows the chart as in FIG. 5A including categories 502. FIG. 8A also shows the device detecting contact 810 at position 810-a corresponding to the visual mark for category 502-2. -
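Steps 2010 and 2028 of method 2000 distinguish gestures by their predefined direction: a drag on a visual mark in the first predefined direction (leftward) filters the category out, while a drag in the second predefined direction (rightward) keeps only that category. An illustrative sketch; the direction convention matches FIGS. 5A-5D and 8A-8C, but the function itself is an assumption:

```python
# Illustrative sketch of the direction test in steps 2010 and 2028: map the
# horizontal displacement of the contact to a filter operation.
def classify_drag(dx):
    """Classify a horizontal drag on a visual mark by its displacement dx."""
    if dx < 0:
        return "remove-category"  # first predefined direction (leftward)
    if dx > 0:
        return "keep-only"        # second predefined direction (rightward)
    return "none"                 # no horizontal movement
```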
FIGS. 8B-8D show movement of contact 810 and an animated transition where the device maintains the visual mark for category 502-2 and removes all other categories 502 and the visual marks for all other categories 502. - In some embodiments, in response (2022) to detecting the fourth touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second visual mark for the second category in the first chart, the device displays (2026) an indicium that only the second category in the first set of categories remains displayed. For example,
FIG. 8D shows indicium 802 indicating that only category 502-2 is displayed. - In some embodiments, the first touch input is (2028) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture) and the fourth touch input is a drag gesture or a swipe gesture that moves in a second predefined direction on the touch-sensitive surface that is distinct from the first predefined direction (e.g., a rightward drag gesture). In some embodiments, the second predefined direction is opposite the first predefined direction. In some embodiments, the second predefined direction is perpendicular to the first predefined direction. For example, the movement of
contact 510 shown in FIGS. 5A-5D represents a swipe gesture toward the left side of the screen, and the movement of contact 810 shown in FIGS. 8A-8C represents a swipe gesture toward the right side of the screen. - In some embodiments, in response (2012) to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first visual mark for the first category in the first chart, the device displays (2030) an indicium that the first category has been removed. In some embodiments, an indicium is displayed that indicates that data corresponding to the first category has been filtered out of the data that is used to create various related charts, such as the first chart and the second chart. For example,
FIG. 5C shows indicium 504 indicating that category 502-1 has been removed. - In some embodiments, while displaying the indicium that the first category has been removed, the device changes (2032) from displaying the first chart with the first set of categories, other than the first category, to displaying a second chart. For example,
FIGS. 6A-6C show an animated transition from a first chart shown in FIG. 6A to a second chart shown in FIG. 6C while continuing to display indicium 504. - In some embodiments, the second chart concurrently displays (2034) a second set of categories that are distinct from the first set of categories. Each respective category in the second set of categories has a corresponding visual mark displayed in the second chart. For example,
FIG. 6A shows a chart including categories 502, and FIG. 6C shows a second chart including categories 604, distinct from categories 502, and bars for each of categories 604. - In some embodiments, while displaying the second chart with the second set of categories, the device detects (2036) a second touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the indicium that the first category has been removed. For example,
FIG. 7A shows a chart distinct from the chart shown in FIG. 6A and including categories 612 and indicium 504. FIG. 7A also shows the device detecting contact 710 at position 710-a corresponding to indicium 504. - In some embodiments, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the first category has been removed, the device updates (2038) display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart. Thus, data that corresponds to the first category, which was filtered out of the first chart and remained filtered out when the second chart was initially displayed, is added to the second chart and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the addition of the data that corresponds to the first category. For example,
FIGS. 7A-7C show the device detecting contact 710 moving from position 710-a in FIG. 7A to position 710-c in FIG. 7C. FIGS. 7A-7C also show an animated transition of the chart updating to reflect inclusion of the data from category 502-1. - In some embodiments, updating display of the second chart to reflect inclusion of data that corresponds to the first category in the first chart includes reordering (2040) display of the second set of categories in the second chart. For example, if the second set of categories in the second chart are ordered largest to smallest, and adding in the data that corresponds to the first category in the first chart changes the order of the second set of categories, then the display of the second chart is updated to reflect the changed order of the second set of categories. For example, via an animated rearrangement of the second set of categories as shown in
FIGS. 7A-7C. - In some embodiments, after updating display of the second chart to reflect inclusion of data that corresponds to the first category, the device detects (2042) a third touch input. For example, a tap gesture, a swipe gesture, or a drag gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a predefined area that displays one or more indicia of data filters, such as the area that displayed the indicium that the first category had been removed. For example,
FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504. - In some embodiments, in response to detecting a third touch input, the device updates (2044) display of the second chart to reflect removal of data that corresponds to the first category in the first chart. Thus, data that corresponds to the first category, which was added to the second chart in response to the second touch input (e.g., a rightward swipe or drag gesture), is removed in response to the third touch input (e.g., a leftward swipe or drag gesture) and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the removal of the data that corresponds to the first category. For example,
FIG. 7C shows a bar chart including categories 612, and FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504 and an update to the bar chart to reflect exclusion of data corresponding to category 502-1. -
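The gesture-to-filter mapping described above — a leftward swipe on a visual mark excludes its category, a rightward swipe keeps only that category, and swipes on the filter indicium add the excluded data back or remove it again — can be sketched as a small state model. This is only an illustrative sketch; the names (`CategoryFilter`, `swipe_on_mark`, `swipe_on_indicium`) are assumptions, not identifiers from the disclosed embodiments.

```python
class CategoryFilter:
    """Illustrative model of the gesture-driven category filtering
    described above; names and structure are assumptions."""

    def __init__(self, categories):
        self.shown = list(categories)   # categories currently displayed in the chart
        self.excluded = []              # filtered-out categories, each shown as an indicium

    def swipe_on_mark(self, category, direction):
        if direction == "left":
            # leftward swipe or drag on a visual mark: remove that category
            self.shown.remove(category)
            self.excluded.append(category)
        elif direction == "right":
            # rightward swipe or drag: keep only that category
            self.excluded.extend(c for c in self.shown if c != category)
            self.shown = [category]

    def swipe_on_indicium(self, category, direction):
        if direction == "right":
            # re-include the previously filtered-out data (cf. operation 2038)
            self.excluded.remove(category)
            self.shown.append(category)
        elif direction == "left":
            # filter the data out again (cf. operation 2044)
            self.shown.remove(category)
            self.excluded.append(category)

f = CategoryFilter(["Furniture", "Office Supplies", "Technology"])
f.swipe_on_mark("Furniture", "left")        # exclude one category
f.swipe_on_indicium("Furniture", "right")   # rightward swipe on indicium re-includes it
```

In a full implementation, each state change would also drive the animated transition and the re-aggregation of the visual marks.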
FIGS. 21A-21F are flow diagrams illustrating method 2100 of data visualization, in accordance with some embodiments. Method 2100 is performed at an electronic device (e.g., portable multifunction device 100, FIG. 1, or device 200, FIG. 2) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. In some embodiments, method 2100 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B. Some operations in method 2100 are, optionally, combined and/or the order of some operations is, optionally, changed. - As described below,
method 2100 provides an intuitive way to change chart categories. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when changing chart categories, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to switch categories faster and more efficiently conserves power and increases the time between battery charges. - The device displays (2102) a first chart on the display. For example,
FIG. 6A shows UI 601 including a bar chart. - The first chart is derived (2104) from a set of data. For example, the chart in
FIG. 6A is derived from a set of data in file 402 shown in FIG. 4A. - The first chart concurrently displays (2106) a first set of categories and a label for the first set of categories. For example,
FIG. 6A shows a chart with chart label 602-1 including categories 502, each with a corresponding category label 408. - Each respective category in the first set of categories has (2108) a corresponding visual mark displayed in the first chart, the corresponding visual mark representing an aggregate value of a first field in the set of data, aggregated according to the first set of categories. For example, in
FIG. 6A, a respective category in the bar chart has a corresponding bar that represents a value for the sum of sales for that category. In this example, "sales" is the first field, and the aggregation type is SUM. Each of the records in the underlying data set is included in one of the categories and is aggregated with other records from the same category. When switched to a different set of categories, the same first field "sales" is used and the same aggregation type SUM is used, but now the underlying records are grouped according to a different set of categories. - The device detects (2110) a first touch input (e.g., a swipe gesture or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the label for the first set of categories. For example,
FIG. 6A shows the device detecting contact 610 at position 610-a corresponding to chart label 602-1. - In some embodiments, the first touch input is (2112) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface (e.g., a leftward drag gesture). For example, the movement of
contact 610 shown in FIGS. 6A-6C represents a swipe gesture toward the left side of the screen. - In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the first set of categories, the device replaces (2114) display of the first chart with a second chart via an animated transition, where the label for the first set of categories moves in concert with movement of a finger contact in the first touch input during at least a portion of the animated transition. For example,
FIGS. 6A-6C show an animated transition where the device replaces the first chart, which has chart label 602-1, with a second chart, which has chart label 602-2, in concert with movement of contact 610. - The second chart is derived (2116) from the set of data. For example, the chart in
FIG. 6C is derived from a set of data in file 402 shown in FIG. 4A. - The second chart concurrently displays (2118) a second set of categories, which replaces display of the first set of categories, and a label for the second set of categories, which replaces display of the label for the first set of categories. In some embodiments, the label for the first set of categories remains displayed while the finger contact in the first touch input remains in continuous contact with the touch-sensitive surface, and the label for the first set of categories ceases to be displayed (e.g., fades out) in response to detecting lift off of the finger contact in the first touch input from the touch-sensitive surface. For example,
FIG. 6A shows a first chart with chart label 602-1 including categories 502, each with a corresponding category label 408. FIGS. 6B-6C show an animated transition where the first chart is replaced with a second chart with chart label 602-2 and including categories 604, where chart label 602-2 is distinct from chart label 602-1 and categories 604 are distinct from categories 502. - Each respective category in the second set of categories has (2120) a corresponding visual mark displayed in the second chart, the corresponding visual mark representing an aggregate value of the first field in the set of data, aggregated according to the second set of categories. For example, in
FIG. 6C, a respective category in the bar chart has a corresponding bar that represents a value for the sum of sales for that category. - In some embodiments, a label for the first field and aggregation type is displayed (2122) with the first chart, and the label for the first field and aggregation type continues to be displayed with the second chart. For example,
FIG. 6A shows a first chart including field label 606, and FIG. 6C shows a second chart including field label 606. - In some embodiments, a label for the first field and aggregation type (e.g., SUM, MAX, MIN, AVERAGE, COUNT) is displayed (2124) with the first chart. For example,
FIG. 6A shows a first chart including field label 606. - In some embodiments, in response to detecting the first touch input, the device: displays (2126) an animation of the second set of categories replacing the first set of categories; displays an animation of the label for the second set of categories replacing the label for the first set of categories; and maintains display of the label for the first field and aggregation type. For example,
FIGS. 6A-6C show an animated transition where the device replaces the first chart, which has chart label 602-1, with a second chart, which has chart label 602-2, in concert with movement of contact 610 while maintaining display of field label 606. - In some embodiments, while displaying the second chart with the second set of categories, the device detects (2128) a second touch input (e.g., a tap gesture, a swipe gesture, or a drag gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of an indicium that a predefined subset of data is not included in the aggregated values of the first field. For example,
FIG. 7A shows a chart distinct from the chart shown in FIG. 6A and including categories 612 and indicium 504. FIG. 7A also shows the device detecting contact 710 at position 710-a corresponding to indicium 504. - In some embodiments, in response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the indicium that the predefined subset of data is not included in the aggregated values of the first field, the device updates display (2130) of the second chart to reflect inclusion of the predefined subset of data in the aggregated values. In some embodiments, data that was filtered out of the second set of data is added to the second set of data and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the addition of the data that was previously filtered out. For example,
FIGS. 7A-7C show the device detecting contact 710 moving from position 710-a in FIG. 7A to position 710-c in FIG. 7C. FIGS. 7A-7C also show an animated transition of the chart updating to reflect inclusion of the data from category 502-1. - In some embodiments, updating display of the second chart to reflect inclusion of the predefined subset of data includes reordering (2132) display of the second set of categories in the second chart. For example, if the second set of categories in the second chart are ordered largest to smallest, and adding in the predefined subset of data changes the order of the second set of categories, then the display of the second chart is updated to reflect the changed order of the second set of categories. For example, via an animated rearrangement of the second set of categories as shown in
FIGS. 7A-7C. - In some embodiments, after updating display of the second chart to reflect inclusion of the predefined subset of data, the device detects (2134) a third touch input. For example, a tap gesture, a swipe gesture, or a drag gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a predefined area that displays one or more indicia of data filters, such as the area that displayed the indicium that the predefined subset of data is not included in the second set of data. For example,
FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504. - In some embodiments, in response to detecting a third touch input, the device updates (2136) display of the second chart to reflect removal of the predefined subset of data. Thus, the predefined subset of data, which was added to the second chart in response to the second touch input (e.g., a rightward swipe or drag gesture), is removed in response to the third touch input (e.g., a leftward swipe or drag gesture) and the visual marks (e.g., graphics) that correspond to the second set of categories in the second chart are automatically updated accordingly to reflect the removal of the predefined subset of data. For example,
FIG. 7C shows a bar chart including categories 612, and FIG. 7D shows the device detecting contact 720 at a position corresponding to indicium 504 and an update to the bar chart to reflect exclusion of data corresponding to category 502-1. - In some embodiments, replacing display of the first chart with the second chart via the animated transition in response to detecting the first touch input occurs (2138) without displaying a selection menu. For example, between displaying the first chart and displaying the second chart, the device does not display a selection menu that contains possible sets of categories to display in the second chart. For example,
FIGS. 6A-6C show an animated transition where the device replaces the first chart, which has chart label 602-1, with a second chart, which has chart label 602-2, without displaying a selection menu. - In some embodiments, the first touch input is (2140) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface. For example, the movement of
contact 620 shown in FIGS. 6D-6F represents a swipe gesture toward the left side of the screen. - In some embodiments, while displaying the second chart, the device detects (2142) a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. For example,
FIG. 6G shows the chart as in FIG. 6F including categories 612. FIG. 6G also shows the device detecting contact 630 at a position corresponding to chart label 602-3, as shown in FIG. 6F. - In some embodiments, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, the device displays (2144) a selection menu with possible sets of categories to display in a third chart. For example,
FIG. 6G shows UI 613 including selection menu 616 in response to the device detecting contact 630. - In some embodiments, the device detects (2146) selection of a respective set of categories in the selection menu. For example, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the respective set of categories in the selection menu. For example,
FIG. 6H shows UI 615 including selection menu 616, where selection menu 616 includes selection categories 618. FIG. 6H also shows the device detecting contact 640 at a position corresponding to selection category 618-2. - In some embodiments, in response to detecting selection of the respective set of categories in the selection menu, the device: replaces (2148) display of the second chart with a third chart that contains the selected respective set of categories; and ceases to display the selection menu. Thus, in some embodiments, swipe or drag gestures on a chart label are used as a shortcut to quickly move between different chart types, whereas a tap gesture on the chart label is used to display a selection menu with available chart types and another tap gesture is used to select and display a particular chart type. For example,
FIGS. 6G-6I show a transition between a first chart with categories 612 and a second chart with categories 622. Specifically, FIG. 6H shows UI 615 including the first chart and selection menu 616, where selection menu 616 includes selection categories 618. FIG. 6H also shows the device detecting contact 640 at a position corresponding to selection category 618-2. FIG. 6I shows UI 617 including the second chart shown in response to the device detecting contact 640. - In some embodiments, the first touch input is (2140) a drag gesture or a swipe gesture that moves in a first predefined direction on the touch-sensitive surface. For example, the movement of
contact 620 shown in FIGS. 6D-6F represents a swipe gesture toward the left side of the screen. - In some embodiments, while displaying the second chart, the device detects (2150) a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of a label for the second set of categories. For example,
FIG. 6G shows the chart as in FIG. 6F including categories 612. FIG. 6G also shows the device detecting contact 630 at a position corresponding to chart label 602-3, as shown in FIG. 6F. - In some embodiments, in response to detecting the tap gesture at the location on the touch-sensitive surface that corresponds to the location on the display of the label for the second set of categories, the device displays (2152) a selection menu with possible sets of categories to display in a third chart. For example,
FIG. 6G shows UI 613 including selection menu 616 in response to the device detecting contact 630. - In some embodiments, the device detects (2154) selection of a first set of categories in the selection menu and a second set of categories in the selection menu. For example, detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the first set of categories in the selection menu and detecting a tap gesture at a location on the touch-sensitive surface that corresponds to a location on the display of the second set of categories in the selection menu. For example,
FIG. 6H shows UI 615 including selection menu 616, where selection menu 616 includes selection categories 618. In some embodiments, the device detects selection of a plurality of selection categories 618. - In some embodiments, in accordance with detecting selection of the first set of categories in the selection menu and the second set of categories in the selection menu, the device: replaces (2156) display of the second chart with a third chart that contains the first set of categories and the second set of categories; and ceases to display the selection menu. For example, in some embodiments, the third chart contains categories 612 as shown in
FIG. 6G and categories 622 as shown in FIG. 6I. -
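The two interaction paths above — a swipe on the chart label as a shortcut that moves directly to another set of categories, and a tap on the label that opens a selection menu from which one or more category sets can be chosen — can be read as a small dispatch routine. The following is a hedged sketch only; the function names and the state structure are hypothetical, not from the disclosed embodiments.

```python
def handle_label_gesture(gesture_type, state, category_sets):
    """Hypothetical dispatch for gestures on a chart label; names are
    illustrative assumptions."""
    if gesture_type == "swipe":
        # swipe shortcut: move directly to the next set of categories
        # without displaying a selection menu
        i = category_sets.index(state["categories"])
        state["categories"] = category_sets[(i + 1) % len(category_sets)]
    elif gesture_type == "tap":
        # tap: display a selection menu of the possible category sets
        state["menu"] = list(category_sets)

def select_from_menu(state, chosen_sets):
    # one or more sets may be selected; the replacement chart contains
    # all of them, and the selection menu is then dismissed
    state["categories"] = [c for s in chosen_sets for c in s]
    state["menu"] = None
```

The split mirrors the passage above: the swipe is a quick path between adjacent chart types, while the menu supports combining multiple category sets in a single third chart.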
FIGS. 22A-22B are flow diagrams illustrating method 2200 of data visualization, in accordance with some embodiments. Method 2200 is performed at an electronic device (e.g., portable multifunction device 100, FIG. 1, or device 200, FIG. 2) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. In some embodiments, method 2200 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B. Some operations in method 2200 are, optionally, combined and/or the order of some operations is, optionally, changed. - As described below,
method 2200 provides an intuitive way to adjust chart magnification (e.g., zooming in and/or zooming out the chart view). This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when adjusting chart magnification, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust magnification faster and more efficiently conserves power and increases the time between battery charges. - The device displays (2202) a first chart on the display. For example,
FIG. 11A shows UI 1101 including a chart. - The chart has (2204) a horizontal axis and a vertical axis. For example, the chart in
FIG. 11A has a vertical axis (Money) and a horizontal axis (Time). - The horizontal axis includes (2206) first horizontal scale markers. For example, the chart in
FIG. 11A has a horizontal axis (Time) with Month markers (e.g., February and March). - The vertical axis includes (2208) first vertical scale markers. For example, the chart in
FIG. 11A has a vertical axis (Money) with thousand dollar markers (e.g., $1,000 through $4,000). - The device detects (2210) a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. For example,
FIG. 11A shows the device detecting contact 1110 at position 1110-a and contact 1120 at position 1120-a. - In some embodiments, the first touch input is (2212) a de-pinch gesture. For example, the movement of
contacts 1110 and 1120 shown in FIGS. 11A and 11B represents a de-pinch gesture. - While detecting the first touch input, the device: horizontally expands (2214) a portion of the chart such that a distance between first horizontal scale markers increases; and maintains a vertical scale of the chart such that a distance between first vertical scale markers remains the same. For example,
FIG. 11A shows UI 1101 including a chart with a vertical axis (Money) and a horizontal axis (Time). FIG. 11A also shows the device detecting contact 1110 at position 1110-a and contact 1120 at position 1120-a. FIG. 11B shows the device detecting contact 1110 at position 1110-b and contact 1120 at position 1120-b and also shows the distance between the horizontal markers (e.g., month markers) increasing while the distance between the vertical markers remains the same. In some embodiments, the method further includes detecting a second touch input; and, while detecting the second touch input: horizontally shrinking a portion of the chart such that a distance between first horizontal scale markers decreases; and maintaining a vertical scale of the chart such that a distance between first vertical scale markers remains the same. For example, FIG. 12B shows UI 1207 including a chart with a vertical axis (Money) and a horizontal axis (Time). FIG. 12B also shows the device detecting contact 1210 at position 1210-b and contact 1220 at position 1220-b. FIGS. 12C and 12D show the device detecting contact 1210 at positions 1210-c and 1210-d and contact 1220 at positions 1220-c and 1220-d, respectively. FIGS. 12C and 12D also show the distance between the horizontal markers (e.g., hour markers) decreasing while the distance between the vertical markers remains the same. In some embodiments, the second touch input is a pinch gesture. For example, the movement of contacts 1210 and 1220 shown in FIGS. 12B-12D represents a pinch gesture. In some embodiments, the method further includes detecting a third touch input; and, while detecting the third touch input, adjusting the chart view and the horizontal axis of the chart corresponding to the third touch input. In some embodiments, the third touch input is a drag gesture and the method includes: detecting movement of a finger contact in the drag gesture across the touch-sensitive surface and adjusting the chart view and horizontal axis of the chart accordingly.
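The asymmetric zoom described above — a de-pinch expands only the horizontal scale while the vertical scale is held fixed, a pinch shrinks it, and a drag pans the view — can be sketched as operations on a visible x-range. This is an illustrative sketch only; the `view` dictionary and function names are assumptions, not from the disclosed embodiments.

```python
def depinch_horizontal(view, scale):
    """Expand the chart horizontally by `scale` (> 1 zooms in) while the
    vertical scale is left untouched; an illustrative sketch only."""
    center = (view["x_min"] + view["x_max"]) / 2
    half = (view["x_max"] - view["x_min"]) / (2 * scale)
    view["x_min"], view["x_max"] = center - half, center + half
    # view["y_min"] / view["y_max"] deliberately unchanged: the distance
    # between vertical scale markers remains the same during the gesture

def drag_horizontal(view, dx):
    # a drag gesture shifts the visible window along the horizontal axis
    view["x_min"] += dx
    view["x_max"] += dx
```

A pinch gesture is the same operation with `scale` < 1, which widens the visible x-range and brings the horizontal scale markers closer together; rescaling the vertical axis to fit the visible marks would happen only after lift-off, as described below.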
For example, FIG. 15B shows UI 1503 including a chart with a first chart view. FIG. 15B also shows the device detecting contact 1510 at position 1510-b. FIG. 15C shows the device detecting contact 1510 at position 1510-c (to the left of position 1510-b). FIG. 15C also shows UI 1505 including a chart with a second chart view (e.g., shifted to the left compared to the first chart view). - In some embodiments, after horizontally expanding the portion of the chart and maintaining the vertical scale of the chart while detecting the first touch input, the device ceases (2216) to detect the first touch input. For example,
FIG. 11B shows UI 1103 with a horizontally expanded portion of the chart shown in FIG. 11A. FIG. 11B also shows the device detecting contacts 1110 and 1120, as in FIG. 11A. - In some embodiments, in response to ceasing to detect the first touch input (e.g., detecting lift off of the fingers in the first touch input), the device changes (2218) a vertical scale of the chart. In some embodiments, the vertical scale is adjusted so that all of the data marks are visible within a predefined margin. For example, in
FIGS. 11C and 11D, the device ceases to detect contacts 1110 and 1120. - In some embodiments, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases (2220), the device, while continuing to detect the first touch input: continues (2222) to horizontally expand a portion of the chart; displays second horizontal scale markers, the second horizontal scale markers being at a finer scale than the first horizontal scale markers; and continues to maintain the vertical scale of the chart. In some embodiments, the horizontal scale markers change from years to months, months to weeks, weeks to days, or days to hours, as shown in
FIGS. 11A-11J. In other embodiments, the second scale markers are displayed in addition to the first scale markers. For example, the original first scale markers may be years, but when the horizontal scale is expanded, month scale markers are shown as well. For example, FIG. 11A shows a chart with horizontal markers denoting months. FIG. 11B shows a chart with horizontal markers denoting months and horizontal markers denoting days. - In some embodiments, after horizontally expanding the portion of the chart such that the distance between first horizontal scale markers increases (2224), the device, while continuing to detect the first touch input: continues (2226) to horizontally expand a portion of the chart; replaces a first set of displayed data marks with a second set of displayed data marks, where for at least some of the data marks in the first set of data marks, an individual data mark in the first set of data marks corresponds to a plurality of data marks in the second set of data marks; and continues to maintain the vertical scale of the chart. For example,
FIG. 11A shows a chart including data marks 1102 (e.g., data mark 1102-1 and data mark 1102-3). FIG. 11B shows a chart including data marks 1102 (e.g., data mark 1102-1 and data mark 1102-3) and data marks 1104 (e.g., data mark 1104-1). In some embodiments, a single data mark (e.g., a circle, square, triangle, bar, or other representation of data points), displayed in the chart prior to horizontally expanding the chart, actually corresponds to a plurality of data points. In some embodiments, during horizontal expansion, a single data mark that corresponds to multiple data points is replaced by a plurality of data marks that correspond to the multiple data points. In some embodiments, the first set of data marks is replaced with the second set of data marks at the same time that the first horizontal scale marks are replaced with the second, finer horizontal scale marks. Thus, the data in a portion of the chart can be displayed at successively finer levels of granularity as a portion of the chart expands horizontally. -
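The progressive refinement described above — coarse aggregated marks being replaced by a larger number of finer marks as the chart expands horizontally — amounts to picking an aggregation bucket from the current zoom level. A rough sketch follows; the thresholds and names are illustrative assumptions, and SUM is used as the aggregation type per the earlier sales example.

```python
from collections import defaultdict

def marks_for_zoom(points, days_per_pixel):
    """Return the data marks to display for (day, value) points at the
    given horizontal scale; an illustrative sketch, not the disclosed
    implementation."""
    if days_per_pixel < 0.1:
        return sorted(points)  # fine enough: one mark per data point
    # otherwise aggregate into month or year buckets (SUM per bucket)
    bucket_days = 30 if days_per_pixel < 1 else 365
    buckets = defaultdict(float)
    for day, value in points:
        buckets[day // bucket_days] += value
    return sorted(buckets.items())
```

As `days_per_pixel` falls during a de-pinch, one yearly mark is replaced by several monthly marks covering the same span, and eventually by the individual data points, while the vertical scale is maintained until the gesture ends.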
FIGS. 23A-23B are flow diagrams illustrating method 2300 of data visualization, in accordance with some embodiments. Method 2300 is performed at an electronic device (e.g., portable multifunction device 100, FIG. 1, or device 200, FIG. 2) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. In some embodiments, method 2300 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B. Some operations in method 2300 are, optionally, combined and/or the order of some operations is, optionally, changed. - As described below,
method 2300 provides an intuitive way to display information about a data mark. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when accessing information about a data mark, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to access data mark information faster and more efficiently conserves power and increases the time between battery charges. - The device displays (2302) at least a first portion of a chart on the display at a first magnification, the first portion of the chart containing a plurality of data marks (e.g., circles, squares, triangles, bars, or other representations of data points). For example,
FIG. 19D shows UI 1907 including a chart with data marks 1902, 1904, and 1906. - The device detects (2304) a first touch input (e.g., a de-pinch gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the first portion of the chart. For example,
FIG. 19D shows the device detecting contact 1920 and contact 1922. - In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first portion of the chart, the device zooms (2306) in to display a second portion of the chart at a second magnification, the second portion of the chart including a first data mark in the plurality of data marks. For example,
FIG. 19E shows a zoomed-in view of the chart shown in FIG. 19D in response to the device detecting contacts 1920 and 1922. FIG. 19D and FIG. 19E show data mark 1906-1. - While displaying the second portion of the chart at the second magnification, the device detects (2308) a second touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the second portion of the chart. For example,
FIG. 19E shows the device detecting contact 1930 at position 1930-a and contact 1940 at position 1940-a. - In some embodiments, the second touch input is (2310) a same type of touch input as the first touch input (e.g., both the first touch input and the second touch input are de-pinch gestures). For example,
contacts 1920 and 1922 shown in FIG. 19D and contacts 1930 and 1940 shown in FIGS. 19E-19H each represent a de-pinch gesture. - In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart (2312), the device, in accordance with a determination that one or more predefined data-mark-information-display criteria are not met, zooms (2314) in to display a third portion of the chart at a third magnification, the third portion of the chart including the first data mark in the plurality of data marks.
- In response to detecting the second touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the second portion of the chart (2312), the device, in accordance with a determination that the one or more predefined data-mark-information-display criteria are met, displays (2316) information about the first data mark. In some embodiments, while displaying information about the first data mark, the device detects a third touch input on the touch-sensitive surface; and in response to detecting the third touch input, ceases to display the information about the first data mark and displays a fourth portion of the chart. In some embodiments, the fourth portion of the chart is the second portion of the chart. For example,
FIGS. 19E-19I show movement of contacts 1930 and 1940. FIGS. 19E-19I also show an animated transition from UI 1909 including data mark 1908-1 to UI 1917 including record 1914-1 (e.g., information about data mark 1908-1). - In some embodiments, the information about the first data mark comprises (2318) a data record that corresponds to the first data mark. For example,
FIGS. 19E-19I also show an animated transition from UI 1909 including data mark 1908-1 to UI 1917 including record 1914-1. - In some embodiments, the data-mark-information-display criteria include (2320) the second magnification being a predefined magnification. For example, if the first touch input zooms in the chart to a predefined maximum magnification, then the second touch input causes display of the information about the first data mark, instead of (or in addition to) causing continued zooming in of the chart.
- In some embodiments, the data-mark-information-display criteria include (2322) the first data mark in the plurality of data marks being the only data mark displayed at the second magnification after the first touch input. For example, if the first touch input zooms in the chart so that only the first data mark is displayed, then the second touch input causes display of the information about the first data mark, instead of (or in addition to) causing continued zooming in of the chart.
- In some embodiments, the data-mark-information-display criteria include (2324) the first data mark reaching a predefined magnification during the second touch input. In some embodiments, if the first data mark reaches a predefined magnification during the second touch input (e.g., during a de-pinch gesture), then the device zooms in during the second touch input prior to reaching the predefined magnification, and the device displays the information about the first data mark after reaching the predefined magnification (with or without continuing to zoom in the chart during the remainder of the second touch input).
- In some embodiments, the data-mark-information-display criteria include (2326) the device zooming in to display only the first data mark in the plurality of data marks during the second touch input. In some embodiments, if during the second touch input (e.g., a de-pinch gesture), the device zooms in such that the first data mark is the only data mark that is displayed, the device displays the information about the first data mark after the first data mark is the only data mark that is displayed (with or without continuing to zoom in the chart during the remainder of the second touch input).
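Taken together, the criteria above amount to a branch between continued zooming and record display. A minimal sketch of that branch follows; the function name and the magnification threshold are assumptions for illustration (the patent leaves the predefined magnification open):

```python
MAX_MAGNIFICATION = 8.0  # assumed predefined magnification, not from the patent

def handle_second_touch_input(magnification, visible_mark_count):
    """Branch at step 2312: return 'show_info' when a data-mark-information-
    display criterion is met (predefined magnification reached, or only one
    data mark left on screen), otherwise keep zooming (step 2314)."""
    if magnification >= MAX_MAGNIFICATION or visible_mark_count == 1:
        return 'show_info'
    return 'zoom'

print(handle_second_touch_input(2.0, 5))  # zoom
print(handle_second_touch_input(8.0, 5))  # show_info
print(handle_second_touch_input(2.0, 1))  # show_info
```

Either criterion alone suffices in this sketch, mirroring the "one or more predefined criteria" language of steps 2320-2326.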
- In some embodiments, in accordance with the determination that one or more predefined data-mark-information-display criteria are met, the device ceases (2328) to display the first data mark. In some embodiments, display of the first data mark is replaced by display of a data record that corresponds to the first data mark when the one or more predefined data-mark-information-display criteria are met (e.g., via an animated transition). For example,
FIGS. 19E-19I also show an animated transition where display of record 1914-1 replaces display of data mark 1908-1. -
FIGS. 24A-24E are flow diagrams illustrating method 2400 of data visualization, in accordance with some embodiments. Method 2400 is performed at an electronic device (e.g., portable multifunction device 100, FIG. 1, or device 200, FIG. 2) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. In some embodiments, method 2400 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B. Some operations in method 2400 are, optionally, combined and/or the order of some operations is, optionally, changed. - As described below,
method 2400 provides an intuitive way to select portions of a chart and/or display information about the underlying data. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when selecting chart areas, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to select portions of a chart and view information about the underlying data faster and more efficiently conserves power and increases the time between battery charges. - The device displays (2402) a chart on the display, the chart including a plurality of data marks. For example,
FIG. 13A shows UI 1301 including a chart with data marks 1312 (e.g., data mark 1312-1 through data mark 1312-10). - In some embodiments, data marks in the plurality of data marks are displayed (2404) in corresponding columns in the chart, with a single data mark per column. For example, data marks 1312 in
FIG. 13A are displayed such that each data mark is in a separate column of the chart. - In some embodiments, data marks in the plurality of data marks are separated (2406) horizontally from one another. For example, data marks 1312 in
FIG. 13A are displayed such that adjacent data marks are separated horizontally from one another. - The device detects (2408) a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of a first predefined area in the chart (e.g., a bar in a bar chart), the first predefined area having a corresponding first value. For example,
FIG. 13A shows the device having detected contact 1309 at a position corresponding to selected portion 1302. - In some embodiments, the first touch input is (2410) a tap gesture. For example, in some embodiments,
contact 1309 shown in FIG. 13A represents a tap gesture. - In some embodiments, the first predefined area includes (2412) a column in the chart. For example,
FIG. 13A shows UI 1301 including a chart, and selected portion 1302 shown in FIG. 13A includes a column of the chart. - In some embodiments, the first predefined area includes (2414) a single data mark in the plurality of data marks. For example, selected
portion 1302 shown in FIG. 13A includes only data mark 1312-1. - In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the first predefined area in the chart (2416), the device: selects (2418) the first predefined area; and visually distinguishes the first predefined area. For example,
FIG. 13A shows the device having detected contact 1309 at a position corresponding to selected portion 1302. FIG. 13A also shows selected portion 1302 visually distinguished from the remainder of the chart. - While the first predefined area is selected, the device detects (2420) a second touch input on the touch-sensitive surface. For example,
FIG. 13A shows selected portion 1302 and the device detecting contact 1310. - In some embodiments, the second touch input is initially detected (2422) at a location on the touch-sensitive surface that corresponds to a location on the display of the first predefined area. For example,
FIG. 13A shows the device detecting contact 1310 at position 1310-a corresponding to a part of selected portion 1302. - In some embodiments, the second touch input is initially detected (2424) at a location on the touch-sensitive surface that corresponds to a location on the display of an edge of the first predefined area. For example,
contact 1310 in FIG. 13A is detected on the edge of selected portion 1302. - In some embodiments, the second touch input is initially detected (2426) at a location on the touch-sensitive surface that corresponds to a location on the display of a selection handle in or next to the first predefined area. For example,
contact 1310 in FIG. 13A is detected at a position corresponding to the location of a handle for selected portion 1302. - In some embodiments, the second touch input is (2428) a drag gesture. For example, the movement of
contact 1310 shown in FIGS. 13A-13D represents a drag gesture toward the right side of the screen. - In some embodiments, the device detects (2430) movement of a finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values. For example,
FIGS. 13B-13D show movement of contact 1310 across multiple columns (e.g., a particular column associated with each respective data mark in data marks 1312) within the chart. FIGS. 13B-13D also show the columns being added to the selected portion in accordance with the movement of contact 1310. - In some embodiments, in response to detecting movement of the finger contact in the drag gesture across locations on the touch-sensitive surface that correspond to locations on the display of the sequence of predefined areas in the chart that have corresponding values, the device displays (2432) a series of changes between the first value in the first predefined area and the corresponding values of the sequence of predefined areas. For example,
FIGS. 13B-13D show columns being added to the selected portion in accordance with the movement of contact 1310. FIGS. 13B-13D also show a change value denoting the change between the value of data mark 1312-1 and the value of the last selected data mark. Specifically, FIG. 13D shows selection of data mark 1312-10 and a change value denoting the change between the value of data mark 1312-1 and the value of data mark 1312-10. - In response to detecting the second touch input on the touch-sensitive surface (2434), the device visually distinguishes (2436) a sequence of predefined areas in the chart, where the sequence of predefined areas is adjacent to the first predefined area. For example,
FIGS. 13B-13D show columns being added to the selected portion in accordance with the movement of contact 1310. Specifically, FIG. 13B shows contact 1310 at position 1310-b and corresponding selected portion 1304, where selected portion 1304 includes the columns in selected portion 1302 from FIG. 13A. FIG. 13C shows contact 1310 at position 1310-c and corresponding selected portion 1306, where selected portion 1306 includes the columns in selected portion 1304 from FIG. 13B. - In response to detecting the second touch input on the touch-sensitive surface (2434), the device displays (2438) a change between the first value for the first predefined area and a value for a last predefined area in the sequence of predefined areas. For example,
FIG. 13D shows selection of data mark 1312-10 and a change value denoting the change between the value of data mark 1312-1 and the value of data mark 1312-10. - In some embodiments, after the second touch input, a selected area in the chart comprises (2440) the first predefined area and the sequence of predefined areas. For example,
FIGS. 13B-13D show columns being added to the selected portion in accordance with the movement of contact 1310. Specifically, FIG. 13D shows contact 1310 at position 1310-d and corresponding selected portion 1308, where selected portion 1308 includes the columns from the selected portions shown in FIGS. 13A-13C. - In some embodiments, the device detects (2442) a third touch input, the third touch input including initial contact of a finger at a location on the touch-sensitive surface that corresponds to a location on the display within the selected area in the chart, and movement of the finger across the touch-sensitive surface. For example,
FIG. 14A shows selected portion 1308 including data for three months (February through April). FIG. 14A also shows the device detecting contact 1402 at a position corresponding to selected portion 1308. - In some embodiments, in response to detecting the third touch input (2444), the device moves (2446) the selected area across the chart, in accordance with the movement of the finger across the touch-sensitive surface, while maintaining a number of predefined areas in the moved selected area equal to the number of predefined areas in the sequence of predefined areas plus one. For example, in some embodiments, in response to detecting movement of
contact 1402 toward the left side of the screen, the device moves selected portion 1308 to include data for months January through March (i.e., data for three months). - In some embodiments, in response to detecting the third touch input (2444), the device displays (2448) a change between a value corresponding to a leftmost predefined area in the moved selected area and a value corresponding to a rightmost predefined area in the moved selected area. For example, in some embodiments, in response to detecting movement of
contact 1402 toward the left side of the screen, the device moves selected portion 1308 to include data for months January through March and the change value updates to denote the change in value between the leftmost data mark in the selected portion and the rightmost data mark in the selected portion. - In some embodiments, after the second touch input, a selected area in the chart comprises (2450) the first predefined area and the sequence of predefined areas. For example, selected
portion 1308 in FIG. 13D includes the columns from each of selected portions 1302, 1304, and 1306 shown in FIGS. 13A-13C. - In some embodiments, the device detects (2452) a fourth touch input (e.g., a de-pinch gesture). For example,
FIGS. 18A-18C show two contacts, and the movement of the contacts shown in FIGS. 18A-18C represents a de-pinch gesture. - In some embodiments, in response to detecting the fourth touch input (2454), the device zooms (2456) in on the selected area in the chart. For example,
FIGS. 18A-18C show a chart with selected portion 1802, and FIGS. 18A-18C show the device zooming in on selected portion 1802 in accordance with the movement of the contacts. - In some embodiments, in response to detecting the fourth touch input (2454), the device, in accordance with a determination that areas in the chart outside the selected area are still displayed on the display, maintains (2458) selection of the selected area. In some embodiments, while zooming in, the device maintains selection of the selected area. In some embodiments, after zooming in, the device maintains selection of the selected area. For example,
FIGS. 18A-18C show the device zooming in on selected portion 1802 in accordance with the movement of the contacts. FIG. 18D shows the device maintaining selection of selected portion 1802. - In some embodiments, in response to detecting the fourth touch input (2454), the device, in accordance with a determination that only areas in the chart in the selected area are displayed on the display, ceases (2460) selection of the selected area. In some embodiments, the selected area disappears when the device zooms in to the chart such that no area in the chart outside the selected area is displayed. In some embodiments, while zooming in, the device ceases selection of the selected area. In some embodiments, after zooming in, the device ceases selection of the selected area.
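The selection arithmetic of method 2400 — the change value of step 2438 and the width-preserving move of steps 2446-2448 — reduces to a few lines. The following is an illustrative sketch with assumed column data and function names, not the disclosed implementation:

```python
def selection_change(values, start, end):
    """Change between the first selected column's value and the last
    selected column's value (step 2438)."""
    return values[end] - values[start]

def move_selection(start, end, delta, n_columns):
    """Shift the selected [start, end] column range by delta columns,
    clamping to the chart while preserving the selection width
    (steps 2446-2448)."""
    width = end - start
    new_start = max(0, min(start + delta, n_columns - 1 - width))
    return new_start, new_start + width

monthly = [120.0, 90.0, 150.0, 200.0, 170.0]  # illustrative column values
print(selection_change(monthly, 0, 3))  # 80.0
# A three-column selection (e.g., February-April) dragged one column left:
print(move_selection(1, 3, -1, len(monthly)))  # (0, 2)
```

After a move, the displayed change value would be recomputed from the new leftmost and rightmost columns, as step 2448 describes.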
-
FIGS. 25A-25D are flow diagrams illustrating method 2500 of data visualization, in accordance with some embodiments. Method 2500 is performed at an electronic device (e.g., portable multifunction device 100, FIG. 1, or device 200, FIG. 2) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. In some embodiments, method 2500 is governed by instructions that are stored in a non-transitory computer readable storage medium and that are executed by one or more processors of a device, such as the one or more processors 302 of portable multifunction device 100 and/or the one or more processors 352 of multifunction device 200, as shown in FIGS. 3A-3B. Some operations in method 2500 are, optionally, combined and/or the order of some operations is, optionally, changed. - As described below,
method 2500 provides an intuitive way to update chart views. This method is particularly useful when the user is interacting with a portable device and/or a compact device with a smaller screen. The method reduces the cognitive burden on the user when adjusting a chart view (e.g., adjusting chart magnification), thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to adjust chart views faster and more efficiently conserves power and increases the time between battery charges. - The device displays (2502) a chart on the display. For example,
FIG. 19A shows UI 1901 including a chart. - The chart has (2504) a horizontal axis with a first horizontal scale with first horizontal scale markers. For example, the chart in
FIG. 19A has a horizontal scale with horizontal scale markers denoting years. - The chart has (2506) a vertical axis with a first vertical scale with first vertical scale markers. For example, the chart in
FIG. 19A has a vertical scale with vertical scale markers denoting hundreds of sunspots. - The chart includes (2508) a first set of data marks. For example, the chart in
FIG. 19A includes data marks 1902. - In some embodiments, adjacent data marks in the first set of first data marks are separated (2510) by a first horizontal distance. In some embodiments, the first horizontal distance corresponds to the first horizontal scale. For example, the chart in
FIG. 19A includes a respective data mark 1902 for each year on the horizontal axis. - Each respective data mark in the first set of data marks has (2512) a respective abscissa and a respective ordinate. For example, in some embodiments, each data mark 1902 in
FIG. 19A has a respective abscissa and a respective ordinate. - The chart includes (2514) a line that connects adjacent data marks in the first set of data marks. For example, the chart in
FIG. 19A includes a line that connects data marks 1902. - The device detects (2516) a first touch input (e.g., a de-pinch gesture) at a location on the touch-sensitive surface that corresponds to a location on the display of the chart. For example, the movement of
contacts 1920 and 1922 shown in FIGS. 19A-19D represents a de-pinch gesture. - While detecting the first touch input (2518), the device expands (2520) at least a portion of the chart such that a distance between adjacent first horizontal scale markers increases in accordance with the first touch input. For example,
FIG. 19B shows an expanded portion of the chart shown in FIG. 19A. FIG. 19B shows the distance between horizontal scale markers being greater than the distance between horizontal scale markers in FIG. 19A. In some embodiments, FIG. 19B shows the expanded portion of the chart in response to contacts 1920 and 1922. - While detecting the first touch input (2518), the device expands (2522) at least a portion of the line that connects adjacent data marks in the first set of data marks in accordance with the first touch input. For example, the expanded portion of the chart shown in
FIG. 19B includes expanded portions of the line connecting data marks 1902. - While detecting the first touch input (2518), the device adds (2524) a second set of second data marks, distinct from the first set of data marks, on the line. For example,
FIG. 19B shows data marks 1904 added to the line connecting data marks 1902. - Each respective data mark in the second set of data marks includes (2526) a respective abscissa and a respective ordinate. For example, in some embodiments, each data mark 1904 shown in
FIG. 19B includes a respective abscissa and a respective ordinate. - Each respective data mark in the second set of data marks is (2528) placed on the line based on the respective abscissa of the respective data mark, independent of the respective ordinate of the respective data mark. For example, in some embodiments, each data mark 1904 shown in
FIG. 19B is placed on the line based on its respective abscissa without regard to its respective ordinate. - In some embodiments, adjacent data marks in the second set of data marks are separated (2530) by a second horizontal distance that corresponds to a second horizontal scale that is finer than the first horizontal scale. For example, the chart in
FIG. 19B includes a respective data mark 1904 for each month and a respective data mark 1902 for each year. - In some embodiments, each respective data mark in the second set of data marks is placed (2532) on the line based on the respective abscissa of the respective data mark and the ordinate of the line at the respective abscissa of the respective data mark. For example, in some embodiments, each data mark in data marks 1904 shown in
FIG. 19B is placed on the line based on its respective abscissa and the ordinate of the line at its respective abscissa. - In some embodiments, a shape of the line is maintained (2534) when the second set of data marks is added to the line. For example, in some embodiments, the shape of the line in
FIG. 19B is maintained when data marks 1904 are added to the line. - In some embodiments, a single data mark in the first set of data marks corresponds (2536) to a plurality of data marks in the second set of data marks. For example, in some embodiments, each data mark in data marks 1902 corresponds to twelve data marks in data marks 1904 (e.g., one for each month in the year).
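The placement rule of steps 2528-2532 — a new mark lands on the line at its abscissa, taking the line's ordinate there — is ordinary linear interpolation. A minimal sketch follows; the helper name and the sample line are illustrative assumptions:

```python
def ordinate_on_line(line_points, x):
    """Linearly interpolate the ordinate of the chart line at abscissa x,
    giving the initial on-line position of a newly added data mark
    (steps 2528 and 2532); line_points must be sorted by abscissa."""
    for (x0, y0), (x1, y1) in zip(line_points, line_points[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("abscissa outside the line")

# Yearly marks at 2014 and 2015; a mid-year monthly mark starts on the line:
line = [(2014.0, 100.0), (2015.0, 200.0)]
print(ordinate_on_line(line, 2014.5))  # 150.0
```

Because the interpolated position depends only on the abscissa and the line, the shape of the line is unchanged when the second set of marks is added, consistent with step 2534.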
- In some embodiments, the device ceases (2538) to display the set of first data marks when the second set of data marks is added. For example, in some embodiments, the device ceases to display data marks 1902 when data marks 1904 are added to the line.
- After adding the second set of data marks on the line (2540), the device, for each respective data mark in the second set of data marks placed on the line at a vertical position distinct from its respective ordinate, animatedly moves (2542) the respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis. For example, in some embodiments, data marks 1904 are animatedly moved from their initial positions shown in
FIG. 19B to their respective ordinates as shown in FIG. 19C. - In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs (2544) while detecting the first input. For example, data marks 1904 are animatedly moved from their initial positions shown in
FIG. 19B to their respective ordinates as shown in FIG. 19C while the device continues to detect contacts 1920 and 1922. - In some embodiments, animatedly moving each respective data mark vertically in accordance with the respective ordinate for the respective data mark and a second vertical scale for the vertical axis occurs (2546) after ceasing to detect the first input.
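The vertical animation of steps 2542-2546 then carries each mark from its on-line position to its own ordinate. A sketch of one interpolation step follows; the linear easing and the function name are assumptions for illustration:

```python
def animate_ordinate(y_on_line, y_target, t):
    """Vertical position of a data mark partway through the animation of
    step 2542; t runs from 0.0 (on the line) to 1.0 (at its ordinate)."""
    return y_on_line + t * (y_target - y_on_line)

print(animate_ordinate(150.0, 110.0, 0.0))  # 150.0 (start, on the line)
print(animate_ordinate(150.0, 110.0, 0.5))  # 130.0 (halfway)
print(animate_ordinate(150.0, 110.0, 1.0))  # 110.0 (at its own ordinate)
```

Running the same interpolation for the line's vertices lets the line adjustment of step 2550 proceed concurrently with the marks' movement, as step 2552 describes.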
- In some embodiments, the second vertical scale is (2548) the same as the first vertical scale.
- After adding the second set of data marks on the line (2540), the device animatedly adjusts (2550) the line so that the line connects the second set of data marks. For example, in some embodiments, the line connecting data marks 1904 is animatedly adjusted from its initial position shown in
FIG. 19B to its final position shown in FIG. 19C. - In some embodiments, animatedly moving each respective data mark vertically and animatedly adjusting the line so that the line connects the set of second data marks occur (2552) concurrently.
- In some embodiments, the device ceases (2554) to display the set of first data marks after the second set of data marks is added.
-
FIGS. 26A-26F illustrate how some embodiments allow scrolling through filter selections, with the data visualization updated immediately as the filter changes. These figures provide bar charts showing total sales for a three-month period in 2014, and the data is filtered by region. In this example, the four regions are Central, East, South, and West. - Initially, the user has filtered the data to display sales data for just the Central region, as shown in
FIG. 26A. Filter indicia 2608 include a scrollable region indicator that indicates Central selection 2612-C. Corresponding to this filter selection, user interface 2601 displays visual graphic 2610-C, which shows data for the Central region. In FIG. 26B, the user wants to compare the Central region to the other regions, and the device detects a contact at position 2614 corresponding to Central selection 2612-C. At this time user interface 2601 still displays visual graphic 2610-C for the Central region. - In
FIG. 26C, the user has started scrolling upwards (e.g., using a swipe gesture), so the contact is moving upwards to position 2616. At this time, the scrollable region indicator is transitioning from “Central” to “East,” so selection 2612-C/E is in an interim state. Visual graphic 2610-C is still the graphic for the Central region. When the scrollable region indicator displays East selection 2612-E, as illustrated in FIG. 26D, visual graphic 2610-E, including data for the East region, is displayed. At this time, the contact is still moving upwards to position 2618. As illustrated in FIG. 26E, the contact has moved upward to position 2620 and the indicator shows “South” region selection 2612-S. When the scrollable region indicator shows the South region, user interface 2601 displays visual graphic 2610-S including data for the South region. - As illustrated in
FIG. 26E, the contact is still moving upwards to position 2622, so the scrollable region indicator comes to West region selection 2612-W, as illustrated in FIG. 26F. When this occurs, user interface 2601 displays the data for the West region in visual graphic 2610-W. - As illustrated by
FIGS. 26A-26F , a user can quickly scroll through filter values, and the visual graphic updates according to the filter as different filter values are selected. In some embodiments, the updates to the display depend on the scroll speed. For example, if the scrolling is performed slowly, the visual graphic is updated for each filter value as illustrated inFIGS. 26A-26F . On the other hand, if the values are scrolled quickly, a user is probably not interested in the intermediate values, and thus the visual graphic is not updated until the scrolling slows down or stops. - As illustrated by
FIGS. 26A-26F, in some embodiments a method executes at an electronic device with a touch-sensitive surface and a display. The method displays a filter indicium on the display that specifies a first category of a first set of categories, each category corresponding to a respective value of a first field in a data set. For example, FIG. 26A includes filter indicium 2608, which specifies the Central category 2612-C. The Central category is one category of a first set of categories that includes Central, East, South, and West. Each of these categories corresponds to a value of a “region” field in the data set. In some implementations, the field values are “Central,” “East,” “South,” and “West,” but in some embodiments the values are encoded differently (e.g., using numeric codes, alphanumeric codes, or just the first letter of each region). - The method concurrently displays a first chart on the display, such as the visual graphic 2610-C in
FIG. 26A. The first chart includes a plurality of visual marks, such as the vertical bars in FIG. 26A. Each visual mark corresponds to a respective aggregated value of a first measure in the data set, aggregated according to a second field in the data set and filtered to aggregate only values of the first measure that are associated with the first category. For example, in FIG. 26A, each of the vertical bars is computed as an aggregate of sales (here using the SUM aggregate function). The field “sales” in the data set is a measure. The data for the vertical bars is aggregated by month, which is a second field in the data set (typically computed from a date field). For example, the February bar 2650 represents the aggregated total sales for February. In addition, the aggregated values for the three bars are filtered to include only data associated with the Central region. - While displaying the first chart, the method detects a first touch input at a location on the touch-sensitive surface that corresponds to a location on the display of the filter indicium. This is illustrated by the contact points 2614 and 2616 in
FIGS. 26B and 26C. In response to detecting the first touch input at the location on the touch-sensitive surface that corresponds to the location on the display of the filter indicium, the method updates the indicium to specify a second category of the first set of categories. This is illustrated in FIG. 26D, where the filter indicium has changed to show the East region. In addition, the method displays an updated first chart on the display. As illustrated in FIG. 26D, the updated chart 2610-E includes different bars, which are based on data for the East region. The updated first chart includes an updated plurality of visual marks, such as the vertical bars in FIG. 26D. Each updated visual mark corresponds to a respective aggregated value of the first measure in the data set, aggregated according to the second field in the data set and filtered to aggregate only values of the first measure that are associated with the second category. In FIG. 26D, each of the bars corresponds to sales aggregated by month and filtered to include data for just the East region. - It will be understood that, although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without changing the meaning of the description, so long as all occurrences of the “first contact” are renamed consistently and all occurrences of the second contact are renamed consistently. The first contact and the second contact are both contacts, but they are not the same contact.
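The interaction just described combines a categorical filter with per-category aggregation: a touch on the filter indicium selects the next category, and each bar of the updated chart is SUM of the measure grouped by the second field, restricted to that category. Purely as an illustrative sketch (the patent specifies behavior, not code; the names `Row`, `aggregate_bars`, and `on_filter_touch` and the sample data are hypothetical), the update might be modeled as:

```python
from collections import defaultdict
from typing import NamedTuple

class Row(NamedTuple):
    region: str   # first field: "Central", "East", "South", "West"
    month: str    # second field, typically derived from a date field
    sales: float  # the measure being aggregated

CATEGORIES = ["Central", "East", "South", "West"]

def aggregate_bars(data, category):
    """SUM the 'sales' measure by month, keeping only rows in `category`."""
    totals = defaultdict(float)
    for row in data:
        if row.region == category:          # filter to the selected category
            totals[row.month] += row.sales  # aggregate with SUM
    return dict(totals)

def on_filter_touch(current):
    """Advance the filter indicium to the next category, wrapping around."""
    return CATEGORIES[(CATEGORIES.index(current) + 1) % len(CATEGORIES)]

# Hypothetical sample data, not from the disclosure.
data = [Row("Central", "Jan", 100.0), Row("Central", "Feb", 250.0),
        Row("East", "Jan", 80.0), Row("East", "Feb", 120.0)]
print(aggregate_bars(data, "Central"))  # bars for the Central chart
print(on_filter_touch("Central"))       # indicium after a touch: "East"
```

On a touch, the chart would then be redrawn from `aggregate_bars(data, on_filter_touch(current))`, mirroring the transition from chart 2610-C to 2610-E.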
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the claims. As used in the description of the embodiments and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
- As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting,” that a stated condition precedent is true, depending on the context. Similarly, the phrase “if it is determined [that a stated condition precedent is true]” or “if [a stated condition precedent is true]” or “when [a stated condition precedent is true]” may be construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
- The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the claims to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain principles of operation and practical applications, thereby enabling others skilled in the art to make use of the disclosed embodiments.
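As a closing illustration of the speed-dependent updating described earlier with reference to FIGS. 26A-26F (slow scrolling redraws the chart for every filter value; fast scrolling defers redraw until scrolling slows or stops), the policy can be sketched as follows. This is a hypothetical model, not part of the disclosure; the threshold value and function names are assumptions:

```python
# Illustrative threshold: filter values traversed per second below which
# scrolling counts as "slow". The disclosure does not specify a number.
SLOW_SCROLL_THRESHOLD = 3.0

def should_update_chart(scroll_speed):
    """Redraw for an intermediate filter value only when scrolling slowly."""
    return scroll_speed <= SLOW_SCROLL_THRESHOLD

def values_to_render(traversed, scroll_speed):
    """Return which traversed filter values get a chart redraw.

    Slow scrolling renders each value in turn (as in FIGS. 26A-26F);
    fast scrolling skips the intermediate values and renders only the
    final one, once the gesture settles.
    """
    if should_update_chart(scroll_speed):
        return traversed        # e.g. Central -> East -> South, each drawn
    return traversed[-1:]       # only the last value is drawn

print(values_to_render(["Central", "East", "South"], 1.0))
print(values_to_render(["Central", "East", "South"], 10.0))
```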
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/877,373 US20200279419A1 (en) | 2014-09-08 | 2020-05-18 | Methods and devices for manipulating graphical views of data |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462047429P | 2014-09-08 | 2014-09-08 | |
US14/603,330 US9857952B2 (en) | 2014-09-08 | 2015-01-22 | Methods and devices for adjusting chart magnification |
US15/859,235 US10657685B2 (en) | 2014-09-08 | 2017-12-29 | Methods and devices for adjusting chart magnification |
US16/877,373 US20200279419A1 (en) | 2014-09-08 | 2020-05-18 | Methods and devices for manipulating graphical views of data |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/859,235 Continuation US10657685B2 (en) | 2014-09-08 | 2017-12-29 | Methods and devices for adjusting chart magnification |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200279419A1 (en) | 2020-09-03
Family
ID=57730120
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/603,330 Active 2036-01-12 US9857952B2 (en) | 2014-09-08 | 2015-01-22 | Methods and devices for adjusting chart magnification |
US14/603,302 Active 2035-03-20 US10706597B2 (en) | 2014-09-08 | 2015-01-22 | Methods and devices for adjusting chart filters |
US14/603,322 Active 2037-03-06 US11017569B2 (en) | 2014-09-08 | 2015-01-22 | Methods and devices for displaying data mark information |
US14/603,312 Active 2035-03-29 US10521092B2 (en) | 2014-09-08 | 2015-01-22 | Methods and devices for adjusting chart magnification asymmetrically |
US15/859,235 Active 2036-01-11 US10657685B2 (en) | 2014-09-08 | 2017-12-29 | Methods and devices for adjusting chart magnification |
US16/877,373 Pending US20200279419A1 (en) | 2014-09-08 | 2020-05-18 | Methods and devices for manipulating graphical views of data |
Country Status (1)
Country | Link |
---|---|
US (6) | US9857952B2 (en) |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11424040B2 (en) | 2013-01-03 | 2022-08-23 | Aetna Inc. | System and method for pharmacovigilance |
US10489717B2 (en) | 2013-01-03 | 2019-11-26 | Aetna, Inc. | System and method for pharmacovigilance |
US10831356B2 (en) | 2014-02-10 | 2020-11-10 | International Business Machines Corporation | Controlling visualization of data by a dashboard widget |
USD819656S1 (en) * | 2014-05-28 | 2018-06-05 | Jan Magnus Edman | Display screen with graphical user interface |
US20170068416A1 (en) * | 2015-09-08 | 2017-03-09 | Chian Chiu Li | Systems And Methods for Gesture Input |
USD789978S1 (en) * | 2015-12-11 | 2017-06-20 | Adp, Llc | Display screen with graphical user interface |
USD789977S1 (en) * | 2015-12-11 | 2017-06-20 | Adp, Llc | Display screen with graphical user interface |
US10748312B2 (en) * | 2016-02-12 | 2020-08-18 | Microsoft Technology Licensing, Llc | Tagging utilizations for selectively preserving chart elements during visualization optimizations |
US10347017B2 (en) * | 2016-02-12 | 2019-07-09 | Microsoft Technology Licensing, Llc | Interactive controls that are collapsible and expandable and sequences for chart visualization optimizations |
USD811432S1 (en) | 2016-04-18 | 2018-02-27 | Aetna Inc. | Computer display with graphical user interface for a pharmacovigilance tool |
US11287945B2 (en) | 2016-09-08 | 2022-03-29 | Chian Chiu Li | Systems and methods for gesture input |
FR3061598B1 (en) * | 2016-12-29 | 2020-10-16 | Thales Sa | PILOTAGE INFORMATION CALCULATION AND DISPLAY PROCESS INCLUDING A "RELIEF FACTOR" |
CN108319577B (en) * | 2017-01-18 | 2021-09-28 | 阿里巴巴集团控股有限公司 | Chart processing method and device and electronic equipment |
USD843414S1 (en) * | 2017-11-14 | 2019-03-19 | Intuit Inc. | Display screen or portion thereof with transitional icon |
USD834061S1 (en) * | 2017-11-14 | 2018-11-20 | Intuit Inc. | Display screen or portion thereof with transitional icon |
USD834062S1 (en) * | 2017-11-14 | 2018-11-20 | Intuit Inc. | Display screen or portion thereof with transitional icon |
USD843416S1 (en) * | 2017-11-14 | 2019-03-19 | Intuit Inc. | Display screen or portion thereof with transitional icon |
USD834066S1 (en) * | 2017-11-14 | 2018-11-20 | Intuit Inc. | Display screen or portion thereof with transitional icon |
CA3042310C (en) | 2018-05-04 | 2021-06-08 | Abl Ip Holding Llc | Optics for aisle lighting |
US10732828B2 (en) | 2018-06-28 | 2020-08-04 | Sap Se | Gestures used in a user interface for navigating analytic data |
US10552736B1 (en) * | 2019-03-06 | 2020-02-04 | Capital One Services, Llc | Counter data generation for data profiling using only true samples |
US11341125B2 (en) | 2019-06-01 | 2022-05-24 | Apple Inc. | Methods and system for collection view update handling using a diffable data source |
USD931313S1 (en) * | 2019-07-19 | 2021-09-21 | Aristocrat Technologies Australia Pty Limited | Display screen or portion thereof with transitional graphical user interface |
US11488450B2 (en) | 2019-10-02 | 2022-11-01 | Aristocrat Technologies Australia Pty Limited | Gaming device for awarding additional feature game instances with controlled oversized symbols |
CN111651115A (en) * | 2020-06-03 | 2020-09-11 | 北京小米移动软件有限公司 | Numerical value selection method and device, terminal device and storage medium |
WO2023234785A1 (en) * | 2022-05-31 | 2023-12-07 | Xero Limited | Graphical user interface |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100214300A1 (en) * | 2009-02-25 | 2010-08-26 | Quinton Alsbury | Displaying Bar Charts With A Fish-Eye Distortion Effect |
Family Cites Families (126)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5848187A (en) * | 1991-11-18 | 1998-12-08 | Compaq Computer Corporation | Method and apparatus for entering and manipulating spreadsheet cell data |
US5414809A (en) | 1993-04-30 | 1995-05-09 | Texas Instruments Incorporated | Graphical display of data |
US5806078A (en) * | 1994-06-09 | 1998-09-08 | Softool Corporation | Version management system |
JPH09106336A (en) * | 1995-10-11 | 1997-04-22 | Sharp Corp | Method for displaying plural display images within display window of information processor |
US6480194B1 (en) | 1996-11-12 | 2002-11-12 | Silicon Graphics, Inc. | Computer-related method, system, and program product for controlling data visualization in external dimension(s) |
US6400366B1 (en) | 1998-09-14 | 2002-06-04 | Visual Insights, Inc. | Method and system for the interactive visualization and examination of data |
US7469381B2 (en) | 2007-01-07 | 2008-12-23 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US6529217B1 (en) * | 1999-06-15 | 2003-03-04 | Microsoft Corporation | System and method for graphically displaying a set of data fields |
EP1182822B1 (en) | 2000-02-21 | 2013-02-13 | Kabushiki Kaisha Toshiba | Network Management Equipment |
US8205149B2 (en) * | 2001-01-05 | 2012-06-19 | Microsoft Corporation | Enhanced find and replace for electronic documents |
US6906717B2 (en) * | 2001-02-27 | 2005-06-14 | Microsoft Corporation | Multiple chart user interface |
AU2002953555A0 (en) | 2002-12-23 | 2003-01-16 | Canon Kabushiki Kaisha | Method for presenting hierarchical data |
US8745483B2 (en) | 2004-10-07 | 2014-06-03 | International Business Machines Corporation | Methods, systems and computer program products for facilitating visualization of interrelationships in a spreadsheet |
US7345688B2 (en) | 2004-10-18 | 2008-03-18 | Microsoft Corporation | Semantic thumbnails |
EP1812946A1 (en) | 2004-11-15 | 2007-08-01 | Credence Systems Corporation | System and method for focused ion beam data analysis |
JP4695462B2 (en) | 2005-08-29 | 2011-06-08 | 株式会社アイ・エヌ情報センター | Graph display device and program |
US20110145689A1 (en) * | 2005-09-09 | 2011-06-16 | Microsoft Corporation | Named object view over multiple files |
US20070061699A1 (en) * | 2005-09-09 | 2007-03-15 | Microsoft Corporation | Named object view of electronic data report |
US7783977B2 (en) * | 2005-09-29 | 2010-08-24 | Datanab, Llc | System and method for balancing of ventilation systems |
US8024650B2 (en) | 2006-03-31 | 2011-09-20 | Microsoft Corporation | Drilling on elements in arbitrary ad-hoc reports |
US20070285426A1 (en) * | 2006-06-08 | 2007-12-13 | Matina Nicholas A | Graph with zoom operated clustering functions |
US8654125B2 (en) | 2006-06-22 | 2014-02-18 | International Business Machines Corporation | System and method of chart data layout |
US7844892B2 (en) * | 2006-08-17 | 2010-11-30 | International Business Machines Corporation | Method and system for display of business intelligence data |
US7864163B2 (en) | 2006-09-06 | 2011-01-04 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying structured electronic documents |
US8106856B2 (en) | 2006-09-06 | 2012-01-31 | Apple Inc. | Portable electronic device for photo management |
US7877707B2 (en) | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US8321781B2 (en) * | 2007-02-08 | 2012-11-27 | Microsoft Corporation | User interface pane for an interactive chart |
US7737979B2 (en) | 2007-02-12 | 2010-06-15 | Microsoft Corporation | Animated transitions for data visualization |
US20120117500A1 (en) | 2007-02-23 | 2012-05-10 | Enrico Maim | Method for the extraction, combination, synthesis and visualisation of multi-dimensional data from different sources |
US8161416B2 (en) | 2007-03-16 | 2012-04-17 | Sap Ag | Navigator for displays |
US8910084B2 (en) | 2007-05-07 | 2014-12-09 | Oracle International Corporation | Aggregate layout for data visualization techniques |
US7903439B2 (en) * | 2007-05-18 | 2011-03-08 | Texas Instruments Incorporated | Methods and apparatus to control a digital power supply |
US8171432B2 (en) | 2008-01-06 | 2012-05-01 | Apple Inc. | Touch screen device, method, and graphical user interface for displaying and selecting application options |
US20090171606A1 (en) | 2007-12-31 | 2009-07-02 | Takahiro Murata | Semiconductor manufacture performance analysis |
US9477776B2 (en) | 2008-04-02 | 2016-10-25 | Paypal, Inc. | System and method for visualization of data |
US8525837B2 (en) * | 2008-04-29 | 2013-09-03 | Teledyne Lecroy, Inc. | Method and apparatus for data preview |
US20090313537A1 (en) * | 2008-06-17 | 2009-12-17 | Microsoft Corporation | Micro browser spreadsheet viewer |
US20100238176A1 (en) | 2008-09-08 | 2010-09-23 | Apple Inc. | Systems, methods, and devices for flash exposure control using preflash statistics |
US9158453B2 (en) | 2008-09-30 | 2015-10-13 | Rockwell Automation Technologies, Inc. | Human-machine interface having multiple touch trend manipulation capabilities |
US9229922B2 (en) * | 2008-09-30 | 2016-01-05 | Apple Inc. | Token representation of references and function arguments |
US10685177B2 (en) | 2009-01-07 | 2020-06-16 | Litera Corporation | System and method for comparing digital data in spreadsheets or database tables |
US8689095B2 (en) | 2009-02-09 | 2014-04-01 | Microsoft Corporation | Grid presentation in web-based spreadsheet services |
US8319801B2 (en) * | 2009-05-08 | 2012-11-27 | International Business Machines Corporation | Magnifying content on a graphical display |
US8766928B2 (en) * | 2009-09-25 | 2014-07-01 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US20110115814A1 (en) | 2009-11-16 | 2011-05-19 | Microsoft Corporation | Gesture-controlled data visualization |
US8836726B1 (en) | 2009-12-04 | 2014-09-16 | The Mathworks, Inc. | Automatic scaling of axes for displaying moving data |
US8786639B2 (en) | 2010-01-06 | 2014-07-22 | Apple Inc. | Device, method, and graphical user interface for manipulating a collection of objects |
US8786559B2 (en) * | 2010-01-06 | 2014-07-22 | Apple Inc. | Device, method, and graphical user interface for manipulating tables using multi-contact gestures |
US8996978B2 (en) * | 2010-05-14 | 2015-03-31 | Sap Se | Methods and systems for performing analytical procedures by interactions with visual representations of datasets |
US20120005045A1 (en) * | 2010-07-01 | 2012-01-05 | Baker Scott T | Comparing items using a displayed diagram |
US8773370B2 (en) * | 2010-07-13 | 2014-07-08 | Apple Inc. | Table editing systems with gesture-based insertion and deletion of columns and rows |
US8423909B2 (en) | 2010-07-26 | 2013-04-16 | International Business Machines Corporation | System and method for an interactive filter |
US8972467B2 (en) * | 2010-08-31 | 2015-03-03 | Sovanta Ag | Method for selecting a data set from a plurality of data sets by means of an input device |
US9747270B2 (en) * | 2011-01-07 | 2017-08-29 | Microsoft Technology Licensing, Llc | Natural input for spreadsheet actions |
US9244606B2 (en) * | 2010-12-20 | 2016-01-26 | Apple Inc. | Device, method, and graphical user interface for navigation of concurrently open software applications |
US20120158623A1 (en) | 2010-12-21 | 2012-06-21 | Microsoft Corporation | Visualizing machine learning accuracy |
US9058365B2 (en) | 2010-12-22 | 2015-06-16 | Sap Se | Systems and methods providing touchscreen report navigation |
US8615511B2 (en) | 2011-01-22 | 2013-12-24 | Operational Transparency LLC | Data visualization interface |
US9075493B2 (en) | 2011-03-07 | 2015-07-07 | Sas Institute, Inc. | Techniques to present hierarchical information using orthographic projections |
US9021397B2 (en) * | 2011-03-15 | 2015-04-28 | Oracle International Corporation | Visualization and interaction with financial data using sunburst visualization |
US8863019B2 (en) * | 2011-03-29 | 2014-10-14 | International Business Machines Corporation | Modifying numeric data presentation on a display |
US8762867B1 (en) * | 2011-05-16 | 2014-06-24 | Mellmo Inc. | Presentation of multi-category graphical reports |
US20120313957A1 (en) * | 2011-06-09 | 2012-12-13 | Microsoft Corporation | Staged Animated Transitions for Aggregation Charts |
US9946429B2 (en) * | 2011-06-17 | 2018-04-17 | Microsoft Technology Licensing, Llc | Hierarchical, zoomable presentations of media sets |
US20120324388A1 (en) | 2011-06-17 | 2012-12-20 | Business Objects Software Limited | Pie chart graphical user interface |
US8832588B1 (en) * | 2011-06-30 | 2014-09-09 | Microstrategy Incorporated | Context-inclusive magnifying area |
US20130009963A1 (en) * | 2011-07-07 | 2013-01-10 | Microsoft Corporation | Graphical display of data with animation |
US9086794B2 (en) * | 2011-07-14 | 2015-07-21 | Microsoft Technology Licensing, Llc | Determining gestures on context based menus |
US8719725B2 (en) * | 2011-07-18 | 2014-05-06 | Oracle International Corporation | Touch optimized pivot table |
US9063637B2 (en) * | 2011-09-23 | 2015-06-23 | Microsoft Technology Licensing, Llc | Altering a view of a document on a display of a computing device |
US8839089B2 (en) * | 2011-11-01 | 2014-09-16 | Microsoft Corporation | Multi-dimensional data manipulation and presentation |
US8990686B2 (en) | 2011-11-02 | 2015-03-24 | Microsoft Technology Licensing, Llc | Visual navigation of documents by object |
US9652448B2 (en) | 2011-11-10 | 2017-05-16 | Blackberry Limited | Methods and systems for removing or replacing on-keyboard prediction candidates |
US9262849B2 (en) * | 2011-11-14 | 2016-02-16 | Microsoft Technology Licensing, Llc | Chart animation |
US9400600B2 (en) * | 2011-12-16 | 2016-07-26 | Samsung Electronics Co., Ltd. | Method, apparatus, and graphical user interface for providing visual effects on a touchscreen display |
US10191641B2 (en) | 2011-12-29 | 2019-01-29 | Apple Inc. | Device, method, and graphical user interface for navigation of information in a map-based interface |
US20130194272A1 (en) | 2012-02-01 | 2013-08-01 | Ming C. Hao | Placing pixels according to attribute values in positions in a graphical visualization that correspond to geographic locations |
US20130275904A1 (en) * | 2012-04-11 | 2013-10-17 | Secondprism Inc. | Interactive data visualization and manipulation |
US9323443B2 (en) * | 2012-05-02 | 2016-04-26 | International Business Machines Corporation | Drilling of displayed content in a touch screen device |
US8527909B1 (en) | 2012-05-29 | 2013-09-03 | Sencha, Inc. | Manipulating data visualizations on a touch screen |
US9164972B2 (en) | 2012-06-07 | 2015-10-20 | Microsoft Technology Licensing, Llc | Managing objects in panorama display to navigate spreadsheet |
US9563674B2 (en) * | 2012-08-20 | 2017-02-07 | Microsoft Technology Licensing, Llc | Data exploration user interface |
US10001897B2 (en) * | 2012-08-20 | 2018-06-19 | Microsoft Technology Licensing, Llc | User interface tools for exploring data visualizations |
US9152618B2 (en) | 2012-08-31 | 2015-10-06 | Microsoft Technology Licensing, Llc | Cell view mode for outsized cells |
US9110974B2 (en) * | 2012-09-10 | 2015-08-18 | Aradais Corporation | Display and navigation of structured electronic documents |
US9824469B2 (en) | 2012-09-11 | 2017-11-21 | International Business Machines Corporation | Determining alternative visualizations for data based on an initial data visualization |
US9513792B2 (en) * | 2012-10-10 | 2016-12-06 | Sap Se | Input gesture chart scaling |
US20140109012A1 (en) | 2012-10-16 | 2014-04-17 | Microsoft Corporation | Thumbnail and document map based navigation in a document |
US20140113268A1 (en) * | 2012-10-22 | 2014-04-24 | Microsoft Corporation | Interactive visual assessment after a rehearsal of a presentation |
US9690449B2 (en) | 2012-11-02 | 2017-06-27 | Microsoft Technology Licensing, Llc | Touch based selection of graphical elements |
US9922018B2 (en) * | 2012-11-12 | 2018-03-20 | Microsoft Technology Licensing, Llc | Scrollbar for zooming on rows and columns of a spreadsheet and interpreting cells |
US9755995B2 (en) | 2012-11-20 | 2017-09-05 | Dropbox, Inc. | System and method for applying gesture input to digital content |
US9158766B2 (en) * | 2012-11-29 | 2015-10-13 | Oracle International Corporation | Multi-touch interface for visual analytics |
US9213478B2 (en) * | 2012-12-21 | 2015-12-15 | Business Objects Software | Visualization interaction design for cross-platform utilization |
US11086508B2 (en) * | 2013-01-31 | 2021-08-10 | Hewlett-Packard Development Company, L.P. | Electronic device with touch gesture adjustment of a graphical representation |
GB201301888D0 (en) | 2013-02-03 | 2013-03-20 | Neutrino Concepts Ltd | User interface |
US9495777B2 (en) | 2013-02-07 | 2016-11-15 | Oracle International Corporation | Visual data analysis for large data sets |
US9070227B2 (en) | 2013-03-04 | 2015-06-30 | Microsoft Technology Licensing, Llc | Particle based visualizations of abstract information |
US9443336B2 (en) * | 2013-03-12 | 2016-09-13 | Sas Institute Inc. | Proportional highlighting of data |
US10140269B2 (en) | 2013-03-12 | 2018-11-27 | Microsoft Technology Licensing, Llc | Viewing effects of proposed change in document before committing change |
US10372292B2 (en) * | 2013-03-13 | 2019-08-06 | Microsoft Technology Licensing, Llc | Semantic zoom-based navigation of displayed content |
US10359919B2 (en) * | 2013-03-14 | 2019-07-23 | Microsoft Technology Licensing, Llc | Staged animation of charts for data updates |
US9760262B2 (en) | 2013-03-15 | 2017-09-12 | Microsoft Technology Licensing, Llc | Gestures involving direct interaction with a data visualization |
US20140320539A1 (en) * | 2013-04-30 | 2014-10-30 | Hewlett-Packard Development Company, L.P. | Semantic zoom-in or drill-down in a visualization having cells with scale enlargement and cell position adjustment |
US9709978B2 (en) * | 2013-05-09 | 2017-07-18 | Rockwell Automation Technologies, Inc. | Using cloud-based data for virtualization of an industrial automation environment with information overlays |
US9201589B2 (en) * | 2013-05-21 | 2015-12-01 | Georges Antoine NASRAOUI | Selection and display of map data and location attribute data by touch input |
US10360297B2 (en) * | 2013-06-14 | 2019-07-23 | Microsoft Technology Licensing, Llc | Simplified data input in electronic documents |
US9733785B2 (en) * | 2013-06-24 | 2017-08-15 | Oracle International Corporation | Facilitating touch screen users to select elements identified in a two dimensional space |
US20140380178A1 (en) * | 2013-06-24 | 2014-12-25 | Oracle International Corporation | Displaying interactive charts on devices with limited resources |
US10775971B2 (en) | 2013-06-28 | 2020-09-15 | Successfactors, Inc. | Pinch gestures in a tile-based user interface |
US9665259B2 (en) * | 2013-07-12 | 2017-05-30 | Microsoft Technology Licensing, Llc | Interactive digital displays |
CN105308480B (en) * | 2013-07-22 | 2019-05-21 | 企业服务发展公司有限责任合伙企业 | Data are presented in scalable format |
US9449408B2 (en) * | 2013-07-25 | 2016-09-20 | Sas Institute Inc. | Visualizing high-cardinality data |
US20150058801A1 (en) * | 2013-08-23 | 2015-02-26 | General Electric Company | Multi-touch inspection tool |
WO2015035571A1 (en) * | 2013-09-11 | 2015-03-19 | Nokia Corporation | Apparatus for enabling displaced effective input and associated methods |
US9389777B2 (en) * | 2013-11-08 | 2016-07-12 | Business Objects Software Ltd. | Gestures for manipulating tables, charts, and graphs |
US20150169531A1 (en) | 2013-12-17 | 2015-06-18 | Microsoft Corporation | Touch/Gesture-Enabled Interaction with Electronic Spreadsheets |
US10416871B2 (en) * | 2014-03-07 | 2019-09-17 | Microsoft Technology Licensing, Llc | Direct manipulation interface for data analysis |
US20150278315A1 (en) | 2014-04-01 | 2015-10-01 | Microsoft Corporation | Data fitting selected visualization type |
US10095389B2 (en) | 2014-08-22 | 2018-10-09 | Business Objects Software Ltd. | Gesture-based on-chart data filtering |
US10049141B2 (en) | 2014-10-10 | 2018-08-14 | salesforce.com,inc. | Declarative specification of visualization queries, display formats and bindings |
US20160104307A1 (en) | 2014-10-14 | 2016-04-14 | Microsoft Technology Licensing, Llc. | Data visualization extensibility architecture |
US9904456B2 (en) | 2014-12-02 | 2018-02-27 | Business Objects Software Ltd. | Gesture-based visualization of data grid on mobile device |
US9576383B2 (en) | 2014-12-30 | 2017-02-21 | Sap Se | Interactive chart authoring through chart merging |
US9892531B2 (en) | 2015-07-01 | 2018-02-13 | Oracle International Corporation | Chart data-binding design time user experience with dynamic sample generation |
US10049475B2 (en) * | 2015-12-14 | 2018-08-14 | Microsoft Technology Licensing, Llc | Utilizing selective triggering events for optimizing chart visualization |
Also Published As
Publication number | Publication date |
---|---|
US20180121068A1 (en) | 2018-05-03 |
US10706597B2 (en) | 2020-07-07 |
US20170010776A1 (en) | 2017-01-12 |
US11017569B2 (en) | 2021-05-25 |
US9857952B2 (en) | 2018-01-02 |
US20170010785A1 (en) | 2017-01-12 |
US10521092B2 (en) | 2019-12-31 |
US20170010792A1 (en) | 2017-01-12 |
US20170010786A1 (en) | 2017-01-12 |
US10657685B2 (en) | 2020-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200279419A1 (en) | Methods and devices for manipulating graphical views of data | |
US11340757B2 (en) | Clock faces for an electronic device | |
JP7397881B2 (en) | Systems, methods, and user interfaces for interacting with multiple application windows | |
US11720230B2 (en) | Interactive data visualization user interface with hierarchical filtering based on gesture location on a chart | |
US10347027B2 (en) | Animated transition between data visualization versions at different levels of detail | |
AU2018203847B2 (en) | Systems and methods for multitasking on an electronic device with a touch-sensitive display | |
US20170069118A1 (en) | Interactive Data Visualization User Interface with Multiple Interaction Profiles | |
US10347018B2 (en) | Interactive data visualization user interface with hierarchical filtering based on gesture location on a chart | |
US11704330B2 (en) | User interface for generating data visualizations that use table calculations | |
AU2021204190B2 (en) | Interactive data visualization user interface with gesture-based data field selection | |
JP2013011981A (en) | Display control method, program, and display unit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED |
| STCV | Information on status: appeal procedure | Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: TC RETURN OF APPEAL |
| STCV | Information on status: appeal procedure | Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCV | Information on status: appeal procedure | Free format text: NOTICE OF APPEAL FILED |
| STCV | Information on status: appeal procedure | Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER |
| STCV | Information on status: appeal procedure | Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED |
| STCV | Information on status: appeal procedure | Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS |