US20160055232A1 - Gesture-based on-chart data filtering - Google Patents
- Publication number: US20160055232A1
- Application number: US14/466,095
- Authority
- US
- United States
- Prior art keywords
- data
- filtering
- gesture
- category
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G06F17/30601—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/242—Query formulation
- G06F16/2428—Query predicate definition using graphical user interfaces, including menus and forms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/206—Drawing of charts or graphs
Definitions
- This document generally relates to methods and systems for data filtering and display. More particularly, various embodiments described herein relate to gesture-based filtering of data displayed on a mobile device.
- FIG. 1 illustrates a method for two-step data-point filtering.
- FIGS. 2A and 2B illustrate the filtering of an individual data point according to some embodiments.
- FIGS. 3A and 3B illustrate category-based filtering in aggregation mode according to some embodiments.
- FIGS. 4A-4D illustrate value-range-based filtering according to some embodiments.
- FIGS. 5A-5F illustrate value-range-based filtering on a two-value-axes chart according to some embodiments.
- FIGS. 6A and 6B illustrate the retention of filtered-out data in, and recovery thereof from, a trash bin according to some embodiments.
- FIG. 7 illustrates an example system providing charting and filtering functionality in accordance with various embodiments.
- FIGS. 8A and 8B illustrate methods for data charting and filtering in accordance with various embodiments.
- FIGS. 9A and 9B illustrate the format in which data may be provided in a generic manner and for an example data set, respectively, in accordance with some embodiments.
- FIGS. 9C and 9D illustrate the data of FIG. 9B following pre-processing in accordance with some embodiments.
- FIG. 9E illustrates the binding of contexts as defined in FIGS. 9C and 9D to the screen objects of a displayed data chart in accordance with some embodiments.
- FIG. 10 illustrates rendering functions associated with various screen objects of a data chart in accordance with some embodiments.
- FIG. 11 illustrates the data set of FIG. 9C following filtering of an individual data point in accordance with some embodiments.
- FIGS. 12A and 12B illustrate the data set of FIG. 9B and the associated category axis, respectively, following category-based filtering in accordance with some embodiments.
- FIG. 13A illustrates the definition of a value range in the context of value-range-based filtering in accordance with some embodiments.
- FIG. 13B illustrates the data set of FIG. 9B following value-range-based filtering as depicted in FIG. 13A in accordance with some embodiments.
- FIGS. 14A-14D illustrate an exemplary data structure for storing and retrieving data sets in accordance with some embodiments.
- Gestures include, and in many embodiments mean, touch-based gestures as are nowadays commonly used in operating mobile devices with touch screens, such as smart phones or tablets (e.g., Apple's iPhone™ or iPad™, Samsung's Galaxy™, or Google's Nexus™).
- “hold-and-swipe” gestures provide a user-friendly, intuitive means for selecting and eliminating data with a smooth continuous gesture.
- the term "hold" generally refers to the continued touching of an on-screen object for a certain period of time (e.g., one second or more), and the term "swipe" refers to a motion of the user's finger (or a stylus or other touching object) across the screen in generally one direction (e.g., to the right or upwards).
- a "hold-and-swipe" gesture is a composite of a "hold" followed immediately (i.e., without disengagement of the finger or other touching object from the screen) by a "swipe," which is typically perceived by the user as a single gesture (as distinguished from a two-part gesture involving two discrete touches).
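As a rough illustration, such a composite gesture might be recognized from a stream of primitive touch events. The event encoding, the one-second hold threshold, and the 50-pixel swipe distance below are illustrative assumptions, not values taken from this disclosure:

```python
# Hypothetical sketch: detecting a hold-and-swipe-right gesture from primitive
# touch events, each encoded as a (type, x, y, time_ms) tuple.
HOLD_MS = 1000      # assumed minimum hold duration before the swipe
MIN_SWIPE_PX = 50   # assumed minimum horizontal travel to count as a swipe

def is_hold_and_swipe_right(events):
    """True if the stream is a hold followed, without lifting, by a rightward swipe."""
    if not events or events[0][0] != "touchstart" or events[-1][0] != "touchend":
        return False
    _, x0, y0, t0 = events[0]
    # The hold lasts until the finger first leaves its starting position.
    first_move_t = events[-1][3]
    for typ, x, y, t in events[1:-1]:
        if typ == "touchmove" and (x, y) != (x0, y0):
            first_move_t = t
            break
    if first_move_t - t0 < HOLD_MS:
        return False  # finger moved too early: no hold, hence no hold-and-swipe
    _, x_end, _, _ = events[-1]
    return x_end - x0 >= MIN_SWIPE_PX  # swipe to the right
```

A mirrored gesture (e.g., a hold-and-swipe to the left) would simply test the opposite sign of the horizontal displacement.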
- other touch-based gestures may also be used to filter data in accordance herewith, as described below.
- although the solutions provided herein are primarily intended for use with mobile devices and touch-based gestures, various embodiments may also be applicable to desktop computers and other stationary systems, as well as to user-input modalities that do not involve touching a screen.
- a hold-and-swipe gesture may be accomplished with a traditional computer mouse by clicking on a screen object and, while holding the left mouse button down, moving the mouse, thereby, e.g., “swiping” a cursor symbol across the screen.
- data filtering is performed on a 2D-axes chart that includes a value axis and a category axis.
- A simple example is shown in FIG. 1 .
- the profit made by a company in four different countries is depicted in the form of a bar diagram.
- the four countries, which correspond to four qualitative categories, are distributed along the horizontal axis (x-axis), and the profit values, in thousands of dollars, are specified along the vertical axis (y-axis).
- the data for each country is depicted as a bar whose length (or upper end) indicates the profit attributable to that country.
- the same data may, alternatively, be displayed with a horizontal value axis and a vertical category axis. To avoid confusion, however, the examples used throughout this disclosure consistently display categories along the horizontal axis.
- data may be categorized in more than one dimension.
- sales and profit data may be provided by country (as in FIG. 1 ), year, and type of product; annual financial-performance data for a company may be provided by quarter and division; weather statistics may be provided by geographic region, time of day, and time of year; and energy-usage data may be provided by country and source of energy, to name just a few.
- the different category dimensions may be reflected by multiple rows of category labels along the category axis. Each position along the category axis then corresponds to one combination of categories, e.g., the profits from the sale of trousers in China in 2013.
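With multiple category dimensions, the set of axis positions can be sketched as the Cartesian product of the dimensions' category values; the dimension ordering below is an illustrative assumption:

```python
from itertools import product

def axis_positions(category_values):
    """Enumerate all category combinations; each combination corresponds to
    one position along the category axis (one entry per label row)."""
    return list(product(*category_values.values()))

positions = axis_positions({
    "country": ["China", "Japan"],
    "year":    [2012, 2013],
    "product": ["trousers", "shirts"],
})
# 2 x 2 x 2 = 8 positions, e.g. ('China', 2013, 'trousers')
```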
- data filtering in accordance with embodiments can be performed on each of the category dimensions, or on combinations thereof. For example, if data for only one of two years is of interest, the data for the other year may be deleted from the chart. Alternatively, data may be aggregated across multiple categories.
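The aggregation across a category dimension might be sketched as follows; the table layout and key names are illustrative, not taken from this disclosure:

```python
from collections import defaultdict

def aggregate_across(data, drop_dim, category_dims, value_key="value"):
    """Remove drop_dim from the categorization by summing values over all
    combinations of the remaining category dimensions (cf. FIGS. 3A-3B)."""
    keep = [d for d in category_dims if d != drop_dim]
    totals = defaultdict(float)
    for point in data:
        totals[tuple(point[d] for d in keep)] += point[value_key]
    return dict(totals)

data = [
    {"country": "Japan", "year": 2012, "product": "trousers", "value": 10},
    {"country": "Japan", "year": 2013, "product": "trousers", "value": 15},
    {"country": "Japan", "year": 2012, "product": "shirts",   "value": 7},
]
# Aggregate across the "year" dimension:
totals = aggregate_across(data, "year", ["country", "year", "product"])
```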
- data may also be filtered based on value ranges.
- a user reviewing sales data may be interested in only the lowest and the highest performers, and may therefore want to hide a large middle range of values from view. Conversely, a user may not be interested in extreme outliers, and delete them from the graph while retaining data falling between specified minimum and maximum values.
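A minimal sketch of such value-range filtering, partitioning the data into an active remainder and a filtered-out portion (names and data are illustrative):

```python
def split_by_value_range(data, lo, hi, value_key="value"):
    """Partition the data set: points whose value falls within [lo, hi] are
    filtered out; the remainder stays in the active data set."""
    kept    = [p for p in data if not (lo <= p[value_key] <= hi)]
    removed = [p for p in data if lo <= p[value_key] <= hi]
    return kept, removed

points = [{"id": i, "value": v} for i, v in enumerate([120, 45, 300, 80, 15])]
# Hide a large middle range to keep only the lowest and highest performers:
kept, removed = split_by_value_range(points, 40, 100)
```

Retaining data between specified minimum and maximum values (deleting outliers) is the same operation with the two result lists swapped.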
- value-based filtering is also applicable to charts with two value axes. Two-value-axes charts may be used, for example, to depict one quantitative parameter as a function of another quantitative parameter (as opposed to a qualitative category); an example is the average weight of children as a function of their age.
- two-value-axes charts may be used to visualize three-dimensional (3D) (or higher-dimensional) data sets, using the two value axes for two of the dimensions and a suitable symbol for the third dimension.
- data may be depicted with circles whose location along the horizontal and vertical value axes specifies two values and whose diameter reflects the third value.
- the color of the circle may be used to indicate a certain category.
- a bubble chart may depict the gross domestic product (GDP) and life expectancy for several countries, with bubble diameters proportional to the countries' respective population sizes.
- data that has been filtered out of a chart is tracked in a manner that allows the user to easily retrieve the data.
- the deleted data sets may be retained in a “trash bin,” and may be restored to the chart with a special “recovery” gesture, such as a hold-and-swipe in the opposite direction as was used to delete the data.
- FIGS. 1 through 5F illustrate various methods of data filtering, showing both the contents displayed on the mobile device screen (or a relevant portion thereof) and, overlaid thereon, a schematic representation of the user's hand and its motions.
- In FIG. 1 , a conventional method for filtering a data point with a two-part gesture is depicted.
- the user has selected profit data for China for elimination from the chart, e.g., by briefly touching (or “tapping”) the respective data column on the touch display of her mobile device.
- a pop-up "tool-tip," i.e., a small window identifying the category and value of the selected data point and providing "keep" and "exclude" options, appears.
- upon tapping the "exclude" symbol, the data point is removed from the displayed chart.
- FIGS. 2A and 2B illustrate data-point filtering in accordance with various embodiments.
- profit data is shown for three category dimensions: country, year, and type of product.
- the user has selected the data for trousers sold in Japan in 2013 for deletion by touching the corresponding data bar; the data bar may be highlighted to visualize the selection.
- by swiping (e.g., to the right), the user may signal that the selected bar is to be deleted.
- the screen is refreshed with an updated chart, shown in FIG. 2B .
- the deleted data point is missing from the updated chart, whereas all other data points are unaffected.
- the profit data for Japan in 2012 includes data for trousers, as does the data for both 2012 and 2013 for countries other than Japan.
- FIGS. 3A and 3B illustrate an example of such category-based filtering.
- the user has selected, among three category dimensions, the “year” dimension to be filtered out.
- categorization of the data by year can be removed.
- FIG. 3B is a chart in which data is aggregated across the years (i.e., in the specific depicted example, the profit values for 2012 and 2013 are added for each combination of country and product type).
- Another usage scenario is data filtering based on values or value ranges; an example is illustrated in FIGS. 4A-4D .
- by touching the value axis, an editing mode may be triggered.
- the user may define a data range to be deleted by moving two fingers that touch the value axis at different points relative to each other along the axis to adjust the range; for example, as schematically depicted in FIG. 4B , the user may use his thumb and index finger to select the lower and upper boundaries, respectively, of the range.
- the selected range may be visualized on-screen, e.g., by depicting the boundary lines of the range and/or a semi-transparent band thereacross.
- the boundary values of the selected range may also be textually displayed. If the user does not adjust the range manually (e.g., with a two-finger gesture as described), a default value range, e.g., having a width of one unit, may be selected based on the location where the axis was touched. Once the value range has been selected, the user may swipe one or both fingers to the right, as shown in FIG. 4C , to complete the filtering. The result is an updated chart, illustrated in FIG. 4D , from which any data points whose values fall in the selected range are missing.
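A sketch of the default-range selection on a vertical value axis follows. That the one-unit default is centered on the touched location is an assumption here (the text above only says the default is based on the touch location); the linear pixel-to-value mapping and all names are likewise illustrative:

```python
def pixel_to_value(y_px, axis_top_px, axis_bottom_px, v_min, v_max):
    """Map a touch position on a vertical value axis to a data value, assuming
    a linear axis with v_min at the bottom and v_max at the top."""
    frac = (axis_bottom_px - y_px) / (axis_bottom_px - axis_top_px)
    return v_min + frac * (v_max - v_min)

def default_range(touch_y_px, axis_top_px, axis_bottom_px, v_min, v_max, width=1.0):
    """Default value range of one unit, used when the user does not adjust the
    boundaries with a two-finger gesture; assumed centered on the touch."""
    center = pixel_to_value(touch_y_px, axis_top_px, axis_bottom_px, v_min, v_max)
    return (center - width / 2, center + width / 2)
```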
- FIGS. 5A-5F illustrate value filtering on the two axes of a bubble chart.
- a value range on the vertical axis may be selected and deleted in the same manner as described above with respect to FIGS. 4A-4C ; this is shown in FIGS. 5A and 5B , and the result of the filtering is shown in FIG. 5C .
- a value range on the horizontal axis may be filtered out by touching the horizontal axis, adjusting the range with two fingers (or using a default range), and then swiping the finger(s) upward to effect the deletion, as shown in FIGS. 5D and 5E .
- the result of value filtering along the horizontal value axis is provided in FIG. 5F .
- the filtered-out data may be retained in a virtual “trash bin,” from which it may be restored by the user.
- Each data set in the trash bin is associated with a particular filtering action.
- data is retained for only a limited number of filtering actions, e.g., the last three actions, the last nine actions, or the last twenty actions. If the total number of filtering actions performed by the user on a particular data set exceeds the specified maximum number, the oldest action is typically deleted to allow data for the most recent action to be saved without exceeding the maximum.
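The bounded retention policy can be sketched with a fixed-capacity queue; the cap of three actions echoes the first example above, and the class and method names are illustrative:

```python
from collections import deque

class TrashBin:
    """Sketch: retains filtered-out data for only the most recent filtering
    actions; pushing an entry beyond the cap silently discards the oldest."""
    def __init__(self, max_actions=3):
        self._entries = deque(maxlen=max_actions)
    def push(self, action_id, removed_points):
        self._entries.append((action_id, removed_points))
    def pop_most_recent(self):
        # Recovery reverses the most recent filtering action first (LIFO).
        return self._entries.pop()
    def __len__(self):
        return len(self._entries)
    def __contains__(self, action_id):
        return any(a == action_id for a, _ in self._entries)

bin_ = TrashBin()
for action in (1, 2, 3, 4):          # four actions against a cap of three
    bin_.push(action, [f"data removed by action {action}"])
```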
- the trash bin may store data for an unlimited number of filtering actions, subject only to the availability of sufficient memory on the mobile device (or other computing device).
- data can be recovered from the trash bin after the user has activated a recovery mode, e.g., by tapping a suitable symbol on the data chart, as shown in FIG. 6A .
- the user may restore data from the trash bin, e.g., with a gesture mirroring that which was employed to move the data to the trash bin in the first place. For instance, if data was deleted using a hold-and-swipe gesture to the right, this filtering action may be reversed with a hold-and-swipe gesture to the left, as shown in FIG. 6B .
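The mirrored-gesture convention might be encoded as a simple lookup; gesture names and the recovery function are hypothetical:

```python
# Each deletion gesture maps to its mirror image, which serves as the
# corresponding recovery gesture in trash-bin view (cf. FIG. 6B).
MIRROR = {"hold_swipe_right": "hold_swipe_left",
          "hold_swipe_left":  "hold_swipe_right",
          "hold_swipe_up":    "hold_swipe_down",
          "hold_swipe_down":  "hold_swipe_up"}

def try_recover(performed_gesture, deletion_gesture, trash_entry, active_data):
    """Restore trash_entry only if the performed gesture mirrors the one used
    for deletion; otherwise leave the active data unchanged."""
    if performed_gesture == MIRROR[deletion_gesture]:
        return active_data + trash_entry   # add the data back into the chart
    return active_data
```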
- filtering actions can be reversed only in an order reverse to the order in which they were performed on the original set, such that, e.g., the most recently deleted data needs to be restored before the data in the second-to-last filtering action can be recovered.
- any filtered-out data can be restored at any time (as long as it is still in the trash bin), and an updated active data set is reconstructed from the present active data set and the recovered data; in other words, the recovered data is added back into the chart.
- each of the different inactive data sets captures the complete state of the chart prior to a particular filtering action and can be restored at any time, which reverses all filtering actions from the most recent one up to and including that particular filtering action.
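This snapshot variant might be sketched as follows (class and attribute names are illustrative):

```python
class SnapshotHistory:
    """Sketch of the snapshot variant: each inactive data set stores the
    complete active set as it existed before a filtering action, so restoring
    snapshot k reverses that action and every later one at once."""
    def __init__(self, initial_data):
        self.active = list(initial_data)
        self.snapshots = []                       # oldest first
    def apply_filter(self, predicate):
        self.snapshots.append(list(self.active))  # capture full pre-filter state
        self.active = [p for p in self.active if not predicate(p)]
    def restore(self, k):
        # Reinstate snapshot k; it and all newer snapshots leave the trash bin.
        self.active = self.snapshots[k]
        del self.snapshots[k:]

h = SnapshotHistory([1, 2, 3, 4, 5])
h.apply_filter(lambda p: p % 2 == 0)   # first filtering action: drop evens
h.apply_filter(lambda p: p > 3)        # second action: drop values above 3
h.restore(0)                           # reverses both actions in one step
```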
- the data-filtering and -recovery methods described above can be modified and extended in various ways, as a person of skill in the art will readily appreciate. For example, different types of gestures may be used.
- selection of data for deletion may be based on touching the relevant screen portion (e.g., a data point, category label, or value range) simultaneously with two fingers and then swiping both fingers.
- a two-finger touch-and-swipe may be used in place of a hold-and-swipe gesture.
- individual data points may be filtered with a double-tap on the data point, whereas data collections including, e.g., all data points within a certain category or value range may be filtered with a hold-and-swipe gesture.
- the gesture may be composed of multiple sequentially performed gestures or gesture parts; thus, a first gesture part (e.g., a tap on the value axis) may be used to initially activate value-range filtering mode, a second gesture part (e.g., a two-finger gesture) may be used to adjust the value range, and, finally, a third gesture part (e.g., a hold-and-swipe gesture on the selected value range) may cause the filtering.
- a user may have the option to select gestures from a gesture library, or even define his own gesture(s); the selected gesture(s) may then be bound to certain filtering or recovery functionalities. For example, to accommodate left-handed users, a hold-and-swipe to the left may be selected for data deletion, and a hold-and-swipe to the right in trash-bin view may serve to recover filtered data.
- users may have the ability to define certain gesture parameters. For instance, with a hold-and-swipe gesture, the user may be allowed to set the requisite hold time to his comfort level.
- Gesture-based filtering may be enhanced by various visual (or audio) cues that provide feedback to the user and/or guide her gestures. For example and without limitation, a user's selection of a data point, category, or range may be visualized by highlighting the selected data, emphasizing it with a bright border, or displaying a cursor symbol (e.g., a symbolic hand) overlaid onto the selected object. Further, to indicate that a swipe action in a certain direction will cause deletion of the data, an arrow pointing in that direction may be displayed. Alternatively, the user may be informed of possible actions and the corresponding gestures (such as “swipe to the right to remove the selected data”) with a text message popping up on the screen or an audio output.
- FIG. 7 conceptually illustrates, in simplified block-diagram form, an example smartphone architecture.
- the smartphone 700 includes a touch screen 702 (e.g., a capacitive touch screen), a microphone 704 and speaker 706 , one or more network interfaces 708 and associated antennas 710 for establishing wireless connections (e.g., Bluetooth, WiFi, or GPS), one or more processors 712 , 714 a , 714 b, 714 c and memory components 716 integrated in a system-on-chip (SoC) 718 and communicating with each other via a bus 720 (e.g., an AMBA AXI Interconnect bus), and a battery 722 .
- a low-power general-purpose processor 712 (e.g., an ARM processor) is typically used, often in conjunction with dedicated special-purpose processors 714 a , 714 b , 714 c that are optimized for their respective functionalities; such processors may include dedicated controllers for various hardware components (e.g., an audio controller 714 a , a touch-screen controller 714 b , and a memory controller 714 c ) that implement their functionality directly in hardware (rather than executing software implementing such functionality) as well as video and 3D accelerators (not shown) that perform specialized image-processing functions.
- the memory components 716 may include volatile memory (e.g., SRAM or DRAM) as well as non-volatile memory (e.g., flash memory, ROM, EPROM, etc.).
- the memory typically stores an operating system 730 (e.g., Google's Android™ or Apple's iOS™) and one or more higher-level software applications (conceptually illustrated as various modules), as well as data associated therewith.
- the various software applications may include the following modules: a data manager 732 that controls the retrieving, storing, and optionally pre-processing of data in preparation for display and/or filtering; a charting module 734 that creates screen objects (e.g., axes, labels, and data objects (such as points, bars, bubbles, etc.)) from the data and causes their display; a filtering module 736 that implements data filtering and data recovery procedures based on user input; a gesture module 738 in communication with the touch-screen controller 714 b that detects user input via touch-based gestures; and an event manager 740 that triggers various functions (e.g., the filtering and recovery procedures) upon detection of the corresponding gesture events.
- the charting module 734 may have read access, and the data manager 732 and/or filtering module 736 may have read and write access to the data, which may be stored in one or more data sets 742 , 744 .
- an active data set 742 reflects updates based on any filtering actions that have taken place, and a trash bin 744 stores one or more previous (and now inactive) data sets corresponding to the data sets prior to filtering or, alternatively, the portions thereof that have been filtered out.
- the gesture-detection functionality of module 738 is provided natively, i.e., as part of the software originally installed on the smartphone by the smartphone vendor.
- Web browsers running on touch-enabled Android or iOS phones typically have built-in functionality to recognize certain touch events, such as “touchstart” (which fires when the user touches the screen), “touchend” (which fires when the finger is removed from the screen), and “touchmove” (which fires when a finger already placed on the screen is moved thereacross).
- Additional applications may be installed at a later time by the user, e.g., by downloading them from an application server on the Internet (using one of the wireless connections enabled on the smartphone).
- the user may, for instance, download a charting application that integrates the functionality of the data manager 732 , event manager 740 , and charting module 734 , allowing the user to download, view, and chart data, and optionally to interact with and navigate the data chart via gestures, e.g., to zoom in and out of portions or scroll through the chart.
- the charting application may utilize the native gesture-detection functionality via a gesture API 746 .
- the filtering module 736 may be provided in the form of a separately downloadable plug-in to the charting application, and may include procedures 750 a , 750 b , 750 c for the various filtering or recovery actions (e.g., data-point filtering, category-based filtering, and value-range-based filtering, as well as data recovery from the trash bin).
- the filtering module may define different gesture-based filtering events for the various filtering actions, and bind corresponding event listeners 752 a , 752 b , 752 c to the applicable respective screen objects managed by the charting module 734 (e.g., bind a category-filtering event listener to a category axis).
- the respective event listener 752 a , 752 b , 752 c dispatches a filtering event to the appropriate filtering procedure 750 a , 750 b , 750 c .
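The listener/handler pairing described above might be sketched as follows; all names are illustrative, not the actual API of the modules described:

```python
# One filtering procedure (event handler) per event type, cf. 750a-750c.
def filter_data_point(context):
    return f"remove data point {context}"

def filter_category(context):
    return f"aggregate across {context}"

def filter_range(context):
    return f"delete values in range {context}"

HANDLERS = {
    "datapoint-filter": filter_data_point,
    "category-filter":  filter_category,
    "range-filter":     filter_range,
}

class ScreenObject:
    """A chart element (data bar, axis, label row) with bound event listeners."""
    def __init__(self, context):
        self.context = context        # e.g., a data-point or category context
        self._listeners = {}
    def bind(self, event_type):
        # cf. binding a category-filtering event listener to a category axis
        self._listeners[event_type] = HANDLERS[event_type]
    def on_gesture(self, event_type):
        # The listener dispatches the event to the appropriate procedure.
        handler = self._listeners.get(event_type)
        return handler(self.context) if handler else None

year_axis = ScreenObject("year")
year_axis.bind("category-filter")
result = year_axis.on_gesture("category-filter")
```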
- the charting and filtering functionalities are integrated into a single application.
- the functionality described herein can be grouped and organized in many different ways, and need not be implemented with the specific modules depicted in FIG. 7 .
- the filtering application may define filtering gestures based on native gestures (e.g., as a composite gesture including a sequence of native primitive gestures within certain parameter ranges) or independently therefrom.
- the filtering application includes a customization module that allows the user to specify gesture parameters (e.g., the hold period used to select an object) and/or define her own gestures.
- a computing device in accordance with an embodiment hereof includes one or more processors, memory, and, in communication therewith, a screen and one or more input devices; the type of components used may vary depending on the device.
- a PC may rely more heavily on a general-purpose processor than on special-purpose controllers, and may utilize an x86 processor rather than an ARM processor.
- in touch-screen devices, the screen may double as the input device (or one of the input devices).
- filtering gestures may alternatively be performed with traditional input devices such as a mouse.
- the software implementing the functionality described herein may be stored on any computer-readable (or "machine-readable") medium or multiple media, whether volatile, non-volatile, removable, or non-removable.
- Example computer-readable media include, but are not limited to, solid-state memories, optical media, and magnetic media.
- the machine-executable instructions stored on such media may generally be implemented in any suitable programming language (or multiple languages), for example and without limitation, in Objective-C, C, C++, Java, Python, PHP, Perl, Ruby, and many others known to those of skill in the art.
- FIGS. 8A and 8B provide an overview of methods for data charting and filtering in accordance with various embodiments; actions executed by the user, which provide input to the system, are illustrated with oblique parallelograms.
- the process generally begins when the user requests the data subsequently to be charted, e.g., by downloading it from a web site, accessing it from a file, importing it from another application running on the user's mobile (or other) device, or in any other manner ( 800 ).
- in response, the charting application (e.g., its data manager 732 ) retrieves and stores the data ( 802 ).
- the data is pre-processed to prepare it for charting and/or filtering ( 804 ), as explained in more detail for one example implementation below.
- an “axis element” is generally any portion of a chart axis and its associated labels that can be selected by a user for filtering actions in accordance herewith.
- Axis elements include, e.g., a value axis or a value range (corresponding to a portion of the value axis), an individual category label, or a row of labels of one category type. For example, FIG. 3 illustrates three category-label rows corresponding to three types of categories: product type, year, and country; each of these rows may constitute an axis element. Further, within the year-label row, there are two category labels (although each displayed multiple times), one for 2012 and one for 2013, and each of these two labels may constitute a separate axis element.
- event listeners may be bound to each of the screen objects (i.e., to each data point and each axis element). Then, when a user performs a defined filtering gesture on one of the screen objects ( 812 ), the event listener associated with that object detects this gesture ( 814 ) and dispatches an event message to the corresponding filtering procedure (which acts as an event handler) ( 816 ). The filtering procedure thereafter accomplishes the desired filtering action on the active data set 742 , specifically on the portion of the active data that is associated with the selected screen object (which portion usually contains, in case of an axis element, multiple data points) ( 818 ).
- If the filtering gesture was performed on an individual data point, that point may be deleted; if the filtering gesture was performed on a category-label row, data is aggregated (i.e., summed or averaged) across categories associated with that label; and if the gesture was performed on a value range, data falling within that range is deleted.
- the displayed data chart is re-rendered based on the updated active data set 742 ( 820 ).
- the filtering procedures may also cause the old data, i.e., the data set as it existed prior to filtering, or the filtered-out portion of the data set (herein referred to as "inactive"), to be stored ( 822 ). Multiple filtering actions may, thus, result in multiple stored inactive sets. Collectively, the inactive data sets are herein called the trash bin. If the user activates a recovery mode ( 824 ), e.g., by tapping a suitable symbol (as depicted, e.g., in FIG. 6A ).
- an event listener associated with the displayed data chart may listen for a recovery gesture (which may be defined, e.g., as a hold-and-swipe gesture to the left).
- the inactive data set may be recovered ( 832 ). For example, if the inactive data set includes the complete data set prior to a particular filtering action, it may simply be used to replace the existing active data set, and deleted from the trash bin. Alternatively, if the inactive data set includes only the filtered-out portion of the data, such data may be recombined with the active data set to reverse the previous filtering action. Following the reversal of a previous filtering action and recovery of the old data, the updated active data set may again be charted ( 834 ).
- Underlying this example is the assumption that the data is initially provided in the form of a table as depicted in FIG. 9A , where each row (except for the title row) corresponds to a different data point and each column corresponds to a type of category or a value axis. For instance, a table with four columns, as shown, may contain data that is categorized in three dimensions, with a fourth column listing the value for the respective combination of categories.
- FIG. 9B further illustrates this data structure for a concrete data set with three category dimensions (corresponding to country, year, and product) and a value dimension specifying the revenue.
- the data is processed ( 804 ) to create a “context” for each data point, value axis, and category-label row to be displayed.
- the processing may result in two tables: a first table storing all individual data points as before and adding a column that specifies the context, i.e., a unique identifier, for each point ( FIG. 9C ), and a second table storing, as respective category contexts, the names of all category types (i.e., one for each label row) as well as the value dimension ( FIG. 9D ) (which is herein also considered a type of category).
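Assuming the data arrives as a title row plus data rows (as in FIGS. 9A and 9B ), the context-creation step might be sketched as follows, with the row index serving as the unique data-point identifier; the function and field names are illustrative.

```python
# Sketch of the pre-processing step: derive a data-point context (a
# unique identifier) for every row and a category context for every
# column, mirroring the two tables of FIGS. 9C and 9D.

def build_contexts(title_row, data_rows):
    # first table: each data point plus its context (here: its row number)
    points = [dict(zip(title_row, row), context={"row": i})
              for i, row in enumerate(data_rows)]
    # second table: one context per category type, including the value
    # dimension (herein also considered a type of category)
    categories = [{"category": name} for name in title_row]
    return points, categories

title = ["Country", "Year", "Product", "Revenue"]
rows = [["USA", "2012", "Shirts", 120],
        ["USA", "2013", "Shirts", 140]]
points, categories = build_contexts(title, rows)
print(points[0]["context"])        # {'row': 0}
print(categories[-1])              # {'category': 'Revenue'}
```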
- These contexts are then bound to the screen objects as shown in FIG. 9E , i.e., data-point contexts are bound to the elements of the graph (e.g., an individual bar) and category contexts are bound to the axes.
- the filtering functions contemplated herein, which are triggered by well-defined gesture-based events, are bound to the screen objects.
- four types of events are defined: data-point filter events, value-axis filter events, category-axis filter events, and data-recovery events.
- Each type of event is handled by a dedicated event handler, e.g., a filtering procedure specific to that event.
- the events and associated event handlers may be bound to the rendering procedures (which may be part of the charting module 734 ) for the respective screen objects, which are illustrated in FIG. 10 .
- These parameters include, for data-point, category, and value-axis filter events, the context bound to the screen object to which the gesture was applied (or a location of such context within the data table).
- For value-axis filter events, two additional parameters are typically provided to signify the minimum and maximum values of the selected value range. In trash-bin view, a data-recovery event may be dispatched without the need to specify any parameters.
- the event listener associated with the rendering procedure for that data point may dispatch a data-point filter event message with the data-context parameter {row:12, column:4}.
- the data-point filtering procedure then filters the data set by this data context, resulting in the updated data set shown in FIG. 11 , from which previous row 12 is missing.
- the old data set may be pushed into the trash bin, and the chart may be updated based on the new data set.
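Taken together, the data-point filtering steps just described might be sketched as follows; the dictionary-based data representation and the function name are illustrative assumptions.

```python
# Sketch of the data-point filtering procedure: given the data context
# dispatched with the event (e.g. {row: 12, column: 4}), snapshot the old
# set into the trash bin, then drop the matching row from the active set.

def filter_data_point(active, trash_bin, data_context):
    trash_bin.append(list(active))        # push the old set into the bin
    row = data_context["row"]
    return [r for r in active if r["context"]["row"] != row]

active = [{"value": 10, "context": {"row": 11}},
          {"value": 7,  "context": {"row": 12}}]
trash = []
active = filter_data_point(active, trash, {"row": 12, "column": 4})
print([r["context"]["row"] for r in active])   # [11] -- row 12 is gone
```

The chart would then be re-rendered from the returned (updated) active set.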
- the event listener associated with the rendering procedure for that category may dispatch a category filter event message with the parameter {category:'year'}.
- the category filtering procedure then aggregates the data by year, resulting in the updated data set shown in FIG. 12A , which now has only three category/value columns, and the updated category axis shown in FIG. 12B , which now includes only two category-label rows.
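Such category-based aggregation might be sketched as follows, assuming rows represented as dictionaries and summation as the aggregation function; column names are taken from the example, everything else is illustrative.

```python
# Sketch of the category filtering procedure: remove the "Year" dimension
# by summing revenue over all rows that agree on the remaining categories
# (cf. FIG. 12A).

from collections import defaultdict

def aggregate_out(rows, drop_key, value_key):
    totals = defaultdict(int)
    for r in rows:
        # group by all categories except the dropped one
        key = tuple(sorted((k, v) for k, v in r.items()
                           if k not in (drop_key, value_key)))
        totals[key] += r[value_key]
    # rebuild one row per remaining category combination
    return [dict(key, **{value_key: total})
            for key, total in totals.items()]

rows = [{"Country": "USA", "Year": "2012", "Revenue": 120},
        {"Country": "USA", "Year": "2013", "Revenue": 140}]
print(aggregate_out(rows, "Year", "Revenue"))
# [{'Country': 'USA', 'Revenue': 260}]
```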
- Data-range filtering may involve multiple sub-events triggered by the different parts of the composite filtering gesture. For example, when the event listener associated with the value axis detects a long-press event on the value axis (e.g., of 500 ms or more), it may dispatch an event message to a value-range procedure that causes display of a value-range editor (e.g., as shown in FIG. 4B ) on the screen.
- the value-range editor itself may have an associated event listener that listens for range-adjustment events.
- the value-range editor may, for example, take the form of a horizontal band overlaid on the chart, which may be shifted up or down, or adjusted in width, with two fingers.
- the selected value range may initially be defined in terms of the pixels on-screen.
- the pixel coordinates, along a vertical value axis, of the axis origin (i.e., the intersection with the category axis), the lower boundary of the selected value range, the upper boundary of the selected value range, and the upper boundary of the value axis may be (0, 0), (0, y0), (0, y1), and (0, ym), respectively.
- the lower and upper values l, u associated with the selected value range can then be calculated from the pixel coordinates by linear interpolation: for a linear axis starting at zero and reaching its maximum value vm at ym, l=(y0/ym)·vm and u=(y1/ym)·vm.
- a value-axis filter event message including the value-axis context {category:'Revenue'} and the computed boundaries of the value range, l and u, as parameters may be dispatched. Based on these parameters, the data-point contexts for data points falling within the selected value range may be determined, and the data set may be updated based thereon. The resulting data set is shown in FIG. 13B .
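Under the stated assumption of a linear value axis starting at zero, the pixel-to-value conversion and the subsequent range filtering might be sketched as follows; v_m denotes the value at the axis's upper boundary, and all names are illustrative.

```python
# Sketch of the pixel-to-value conversion and value-range filtering:
# a linear value axis runs from 0 at the origin to v_m at pixel y_m, so
# the selected pixel band [y0, y1] maps to the value range [l, u] by
# simple linear interpolation.

def pixel_range_to_values(y0, y1, y_m, v_m):
    l = v_m * y0 / y_m
    u = v_m * y1 / y_m
    return l, u

def filter_value_range(rows, value_key, l, u):
    # drop all data points whose value falls within the selected range
    return [r for r in rows if not (l <= r[value_key] <= u)]

l, u = pixel_range_to_values(y0=100, y1=300, y_m=500, v_m=1000)
print(l, u)                        # 200.0 600.0
rows = [{"Revenue": 150}, {"Revenue": 400}, {"Revenue": 700}]
print(filter_value_range(rows, "Revenue", l, u))
# [{'Revenue': 150}, {'Revenue': 700}]
```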
- filtered-out data can be recovered from a trash bin, i.e., a repository of the previous data sets or filtered-out portions thereof.
- Upon detection of a recovery gesture (e.g., a "swipe-back" gesture corresponding to a hold-and-swipe gesture in the direction opposite to that used for filtering), an event message restoring the data set prior to the filtering action may be dispatched.
- FIGS. 14A-14D illustrate the data structure of the trash bin in accordance with one example embodiment.
- the trash bin takes the form of a stack to which a new data set can only be added from the top (corresponding to a “push” action, see FIG. 14A ) and from which only the top data set at any given time can be removed (corresponding to a “pop” action).
- With successive filtering actions, the initially empty data stack is filled one by one, as shown in FIG. 14B .
- With successive recovery actions, data sets are removed one by one, as shown in FIG. 14C , until the stack is empty again.
- the capacity of the stack may be limited, e.g., to a maximum of nine data sets.
- When the stack is full, the first-stored data set (set number 0) may be deleted to make room for the most recent set; this is illustrated in FIG. 14D .
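The bounded-stack behavior of the trash bin (push with eviction of the oldest set when full, pop of the most recent set) might be sketched as follows; the class name and the capacity default of nine are taken from the example, everything else is illustrative.

```python
# Sketch of the trash bin as a bounded stack: pushes beyond capacity
# evict the first-stored set (set number 0), and pops return the most
# recently stored set, per FIGS. 14A-14D.

class TrashBin:
    def __init__(self, capacity=9):
        self.capacity = capacity
        self._stack = []

    def push(self, data_set):
        if len(self._stack) == self.capacity:
            self._stack.pop(0)        # evict the first-stored set
        self._stack.append(data_set)

    def pop(self):
        # remove and return the top set; None when the bin is empty
        return self._stack.pop() if self._stack else None

bin_ = TrashBin(capacity=3)
for i in range(4):                    # the fourth push evicts set 0
    bin_.push(f"set-{i}")
print(bin_.pop())                     # set-3 (most recent first)
```

In Python, `collections.deque(maxlen=...)` would provide the same eviction behavior; the explicit class is used here only to make the push/pop semantics visible.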
- different data structures may be implemented to provide functionality to reverse filtering actions.
Abstract
On-chart data filtering on computing devices such as, e.g., touch-enabled mobile devices can be enabled by methods, systems, and computer programs that facilitate detecting a filtering gesture performed on an axis element (such as a value range, category label, or category-label row) of a data chart displayed on-screen and, in response to the filtering gesture, filtering data associated with the axis element and updating the displayed data chart based on the filtering.
Description
- This document generally relates to methods and systems for data filtering and display. More particularly, various embodiments described herein relate to gesture-based filtering of data displayed on a mobile device.
- In today's data-driven world, business or other data is often accessed on mobile devices. With available mobile-device data-graphing applications, users can conveniently visualize the data, e.g., with two-dimensional (2D) axes charts that present quantitative data in the form of bars for various qualitative categories. Often, a user is interested in a particular data segment and would therefore like to filter out a portion of the data. In existing mobile-device solutions, however, data-filtering functionality is typically limited to individual data points and requires a series of steps. For example, to eliminate a particular data point from view, the user may need to first select the point and then, following the display of a pop-up tool box with “keep” and “exclude” options, choose “exclude” to delete the point. Filtering large amounts of data in this manner is a protracted and cumbersome process. Accordingly, more convenient means for data filtering on mobile devices are needed.
- The present disclosure illustrates embodiments by way of example and not limitation, and with reference to the following drawings:
-
FIG. 1 illustrates a method for two-step data-point filtering. -
FIGS. 2A and 2B illustrate the filtering of an individual data point according to some embodiments. -
FIGS. 3A and 3B illustrate category-based filtering in aggregation mode according to some embodiments. -
FIGS. 4A-4D illustrate value-range-based filtering according to some embodiments. -
FIGS. 5A-5F illustrate value-range-based filtering on a two-value-axes chart according to some embodiments. -
FIGS. 6A and 6B illustrate the retention of filtered-out data in, and recovery thereof from, a trash bin according to some embodiments. -
FIG. 7 illustrates an example system providing charting and filtering functionality in accordance with various embodiments. -
FIGS. 8A and 8B illustrate methods for data charting and filtering in accordance with various embodiments. -
FIGS. 9A and 9B illustrate the format in which data may be provided in a generic manner and for an example data set, respectively, in accordance with some embodiments. -
FIGS. 9C and 9D illustrate the data of FIG. 9B following pre-processing in accordance with some embodiments. -
FIG. 9E illustrates the binding of contexts as defined in FIGS. 9C and 9D to the screen objects of a displayed data chart in accordance with some embodiments. -
FIG. 10 illustrates rendering functions associated with various screen objects of a data chart in accordance with some embodiments. -
FIG. 11 illustrates the data set of FIG. 9C following filtering of an individual data point in accordance with some embodiments. -
FIGS. 12A and 12B illustrate the data set of FIG. 9B and the associated category axis, respectively, following category-based filtering in accordance with some embodiments. -
FIG. 13A illustrates the definition of a value range in the context of value-range-based filtering in accordance with some embodiments. -
FIG. 13B illustrates the data set of FIG. 9B following value-range-based filtering as depicted in FIG. 13A in accordance with some embodiments. -
FIGS. 14A-14D illustrate an exemplary data structure for storing and retrieving data sets in accordance with some embodiments. - Disclosed herein are mobile-device solutions for the gesture-based filtering of data categories and value ranges, in addition to individual data points. Gestures, as referred to herein, include, and in many embodiments mean, touch-based gestures as are nowadays commonly used in operating mobile devices with touch screens, such as smart phones or tablets (e.g., Apple's iPhone™ or iPad™, Samsung's Galaxy™, or Google's Nexus™). In some embodiments, "hold-and-swipe" gestures provide a user-friendly, intuitive means for selecting and eliminating data with a smooth continuous gesture. The term "hold," as used in this context, generally refers to the continued touching of an on-screen object for a certain period of time (e.g., one second or more), and the term "swipe" refers to a motion of the user's finger (or a stylus or other touching object) across the screen generally in one direction (e.g., to the right or upwards). A "hold-and-swipe" gesture is a composite of a "hold" followed immediately (i.e., without disengagement of the finger or other touching object from the screen) by a "swipe," which is typically perceived by the user as a single gesture (as distinguished from a two-part gesture involving two discrete touches). Of course, other touch-based gestures may also be used to filter data in accordance herewith, as described below. Further, although the solutions provided herein are primarily intended for use with mobile devices and touch-based gestures, various embodiments may also be applicable to desktop computers and other stationary systems, as well as to user-input modalities that do not involve touching a screen. 
For example, a hold-and-swipe gesture may be accomplished with a traditional computer mouse by clicking on a screen object and, while holding the left mouse button down, moving the mouse, thereby, e.g., “swiping” a cursor symbol across the screen.
- In various embodiments, data filtering is performed on a 2D-axes chart that includes a value axis and a category axis. A simple example is shown in
FIG. 1 . Herein, the profit made by a company in four different countries is depicted in the form of a bar diagram. The four countries, which correspond to four qualitative categories, are distributed along the horizontal axis (x-axis), and the profit values, in thousands of dollars, are specified along the vertical axis (y-axis). The data for each country is depicted as a bar whose length (or upper end) indicates the profit attributable to that country. As will be readily apparent to one of skill in the art, the same data may, alternatively, be displayed with a horizontal value axis and a vertical category axis. To avoid confusion, however, the examples used throughout this disclosure consistently display categories along the horizontal axis. - As will be apparent from various of the following figures (e.g.,
FIG. 2A ), data may be categorized in more than one dimension. For example, sales and profit data may be provided by country (as in FIG. 1 ), year, and type of product; annual financial-performance data for a company may be provided by quarter and division; weather statistics may be provided by geographic region, time of day, and time of year; and energy-usage data may be provided by country and source of energy, to name just a few. For such multi-dimensionally categorized data, the different category dimensions may be reflected by multiple rows of category labels along the category axis. Each position along the category axis then corresponds to one combination of categories, e.g., the profits from the sale of trousers in China in 2013. As explained in detail below, data filtering in accordance with embodiments can be performed on each of the category dimensions, or on combinations thereof. For example, if data for only one of two years is of interest, the data for the other year may be deleted from the chart. Alternatively, data may be aggregated across multiple categories. - In embodiments, data may also be filtered based on value ranges. A user reviewing sales data, for instance, may be interested in only the lowest and the highest performers, and may therefore want to hide a large middle range of values from view. Conversely, a user may not be interested in extreme outliers, and may delete them from the graph while retaining data falling between specified minimum and maximum values. As will be readily apparent to those of skill in the art, value-based filtering is also applicable to charts with two value axes. Two-value-axes charts may be used, for example, to depict one quantitative parameter as a function of another quantitative parameter (as opposed to a qualitative category); an example is the average weight of children as a function of their age. 
In addition, two-value-axes charts may be used to visualize three-dimensional (3D) (or higher-dimensional) data sets, using the two value axes for two of the dimensions and a suitable symbol for the third dimension. In a bubble chart, for example, data may be depicted with circles whose location along the horizontal and vertical value axes specifies two values and whose diameter reflects the third value. In addition, the color of the circle may be used to indicate a certain category. To provide a concrete example, a bubble chart may depict the gross domestic product (GDP) and life expectancy for several countries, with bubble diameters proportional to the countries' respective population sizes.
- Note that, where the present disclosure speaks of “filtering out” (or “deleting,” “eliminating,” etc.) data points or ranges, this need not necessarily, and typically does not, imply that the underlying data itself is deleted from memory. Rather, the data that is not of interest to the user is simply removed from the displayed chart. In some embodiments, data that has been filtered out of a chart is tracked in a manner that allows the user to easily retrieve the data. For example, the deleted data sets may be retained in a “trash bin,” and may be restored to the chart with a special “recovery” gesture, such as a hold-and-swipe in the opposite direction as was used to delete the data.
-
FIGS. 1 through 5E illustrate various methods of data filtering, showing both the contents displayed on the mobile device screen (or a relevant portion thereof) and, overlaid thereon, a schematic representation of the user's hand and its motions. With renewed reference to FIG. 1 , a conventional method for filtering a data point with a two-part gesture is depicted. In the example shown, the user has selected profit data for China for elimination from the chart, e.g., by briefly touching (or "tapping") the respective data column on the touch display of her mobile device. In response to the touch, a pop-up "tool-tip," i.e., a small window identifying the category and value of the selected data point and providing "keep" and "exclude" options, appears. By tapping the "exclude" symbol, the data point is removed from the displayed chart. - By contrast,
FIGS. 2A and 2B illustrate data-point filtering in accordance with various embodiments. In the depicted example, profit data is shown for three category dimensions: country, year, and type of product. As shown in FIG. 2A , the user has selected the data for trousers sold in Japan in 2013 for deletion by touching the corresponding data bar; the data bar may be highlighted to visualize the selection. By swiping her finger to the right, the user may signal that the selected bar is to be deleted. As a result of this deletion gesture, the screen is refreshed with an updated chart, shown in FIG. 2B . The deleted data point is missing from the updated chart, whereas all other data points are unaffected. Thus, the profit data for Japan in 2012 includes data for trousers, as does the data for both 2012 and 2013 for countries other than Japan. - In many usage scenarios, a user is not interested in filtering out a single point, or a few individual points, but an entire category. When a category is filtered, the data may be averaged or aggregated across the selected category dimension. For example, the user may not care about year-to-year fluctuations, and be interested in average yearly profits, or profits aggregated over multiple years.
FIGS. 3A and 3B illustrate an example of such category-based filtering. Here, the user has selected, among three category dimensions, the "year" dimension to be filtered out. By performing a touch-and-slide gesture on the row of category labels that contains the year labels, categorization of the data by year can be removed. The result, shown in FIG. 3B , is a chart in which data is aggregated across the years (i.e., in the specific depicted example, the profit values for 2012 and 2013 are added for each combination of country and product type). - Another usage scenario is data filtering based on values or value ranges; an example is illustrated in
FIGS. 4A-4D . By touching the value axis, an editing mode may be triggered. In this mode, the user may define a data range to be deleted by moving two fingers that touch the value axis at different points relative to each other along the axis to adjust the range; for example, as schematically depicted in FIG. 4B , the user may use his thumb and index finger to select the lower and upper boundaries, respectively, of the range. The selected range may be visualized on-screen, e.g., by depicting the boundary lines of the range and/or a semi-transparent band thereacross. To provide precise feedback about the selected range to the user, the boundary values of the selected range may also be textually displayed. If the user does not adjust the range manually (e.g., with a two-finger gesture as described), a default value range, e.g., having a width of one unit, may be selected based on the location where the axis was touched. Once the value range has been selected, the user may swipe one or both fingers to the right, as shown in FIG. 4C , to complete the filtering. The result is an updated chart, illustrated in FIG. 4D , from which any data points whose values fall in the selected range are missing. - As will be readily appreciated by one of skill in the art, value-based filtering is also applicable to charts with two value axes.
FIGS. 5A-5F , for example, illustrate value filtering on the two axes of a bubble chart. A value range on the vertical axis may be selected and deleted in the same manner as described above with respect to FIGS. 4A-4C ; this is shown in FIGS. 5A and 5B , and the result of the filtering is shown in FIG. 5C . Similarly, a value range on the horizontal axis may be filtered out by touching the horizontal axis, adjusting the range with two fingers (or using a default range), and then swiping the finger(s) upward to effect the deletion, as shown in FIGS. 5D and 5E . The result of value filtering along the horizontal value axis is provided in FIG. 5F . - Regardless of whether the depicted data is filtered point by point, by category, or by value range, the filtered-out data (or entire data sets prior to the filtering) may be retained in a virtual "trash bin," from which it may be restored by the user. Each data set in the trash bin is associated with a particular filtering action. In some embodiments, to limit memory requirements, data is retained for only a limited number of filtering actions, e.g., the last three actions, the last nine actions, or the last twenty actions. If the total number of filtering actions performed by the user on a particular data set exceeds the specified maximum number, the oldest action is typically deleted to allow data for the most recent action to be saved without exceeding the maximum. In other embodiments, the trash bin may store data for an unlimited number of filtering actions, subject only to the availability of sufficient memory on the mobile device (or other computing device).
- In some embodiments, data can be recovered from the trash bin after the user has activated a recovery mode, e.g., by tapping a suitable symbol on the data chart, as shown in
FIG. 6A . Once the recovery mode has been activated, the user may restore data from the trash bin, e.g., with a gesture mirroring that which was employed to move the data to the trash bin in the first place. For instance, if data was deleted using a hold-and-swipe gesture to the right, this filtering action may be reversed with a hold-and-swipe gesture to the left, as shown in FIG. 6B . In some embodiments, filtering actions can be reversed only in an order reverse to the order in which they were performed on the original set, such that, e.g., the most recently deleted data needs to be restored before the data in the second-to-last filtering action can be recovered. In other embodiments, any filtered-out data can be restored at any time (as long as it is still in the trash bin), and an updated active data set is reconstructed from the present active data set and the recovered data; in other words, the recovered data is added back into the chart. In yet other embodiments, each of the different inactive data sets captures the complete state of the chart prior to a particular filtering action and can be restored at any time, which reverses all filtering actions from the most recent up to and including that particular filtering action. - The data-filtering and -recovery methods described above can be modified and extended in various ways, as a person of skill in the art will readily appreciate. For example, different types of gestures may be used. In some embodiments, instead of relying on a hold-period of a certain length (e.g., half a second, one second, or two seconds) to select an object, as is done with a hold-and-swipe gesture as described above, selection of data for deletion may be based on touching the relevant screen portion (e.g., a data point, category label, or value range) simultaneously with two fingers and then swiping both fingers. Thus, a two-finger touch-and-swipe may be used in place of a hold-and-swipe gesture. 
In other embodiments, individual data points may be filtered with a double-tap on the data point, whereas data collections including, e.g., all data points within a certain category or value range may be filtered with a hold-and-swipe gesture. For more complex actions, such as, e.g., value-based filtering, the gesture may be composed of multiple sequentially performed gestures or gesture parts; thus, a first gesture part (e.g., a tap on the value axis) may be used to initially activate value-range filtering mode, a second gesture part (e.g., a two-finger gesture) may be used to adjust the value range, and, finally, a third gesture part (e.g., a hold-and-swipe gesture on the selected value range) may cause the filtering.
- In some embodiments, a user may have the option to select gestures from a gesture library, or even define his own gesture(s); the selected gesture(s) may then be bound to certain filtering or recovery functionalities. For example, to accommodate left-handed users, a hold-and-swipe to the left may be selected for data deletion, and a hold-and-swipe to the right in trash-bin view may serve to recover filtered data. Alternatively or additionally to allowing users to select their own gestures, users may have the ability to define certain gesture parameters. For instance, with a hold-and-swipe gesture, the user may be allowed to set the requisite hold time to his comfort level.
- Gesture-based filtering may be enhanced by various visual (or audio) cues that provide feedback to the user and/or guide her gestures. For example and without limitation, a user's selection of a data point, category, or range may be visualized by highlighting the selected data, emphasizing it with a bright border, or displaying a cursor symbol (e.g., a symbolic hand) overlaid onto the selected object. Further, to indicate that a swipe action in a certain direction will cause deletion of the data, an arrow pointing in that direction may be displayed. Alternatively, the user may be informed of possible actions and the corresponding gestures (such as “swipe to the right to remove the selected data”) with a text message popping up on the screen or an audio output.
- The above-described functionality can generally be implemented in hardware, firmware, software, or any suitable combination thereof. In some embodiments, data filtering functionality as described herein is provided on a touch-enabled computing device, such as a touch-enabled smartphone. Touch-enabled smartphones are readily commercially available from various vendors, such as Samsung, Apple, Huawei, or Lenovo, among others.
FIG. 7 conceptually illustrates, in simplified block-diagram form, an example smartphone architecture. The smartphone 700 includes a touch screen 702 (e.g., a capacitive touch screen), a microphone 704 and speaker 706 , one or more network interfaces 708 and associated antennas 710 for establishing wireless connections (e.g., Bluetooth, WiFi, or GPS), one or more processors and memory components 716 integrated in a system-on-chip (SoC) 718 and communicating with each other via a bus 720 (e.g., an AMBA AXI Interconnect bus), and a battery 722 . To reduce the processor power requirements and thereby extend battery lifetime (i.e., the time the battery 722 lasts until it needs to be recharged), a low-power general-purpose processor 712 (e.g., an ARM processor) is typically used, often in conjunction with dedicated special-purpose processors (dedicated controllers such as an audio controller 714 a, a touch-screen controller 714 b, and a memory controller 714 c) that implement their functionality directly in hardware (rather than executing software implementing such functionality), as well as video and 3D accelerators (not shown) that perform specialized image-processing functions. The memory components 716 may include volatile memory (e.g., SRAM or DRAM) as well as non-volatile memory (e.g., flash memory, ROM, EPROM, etc.). - The memory typically stores an operating system 730 (e.g., Google's Android™ or Apple's iOS™) and one or more higher-level software applications (conceptually illustrated as various modules), as well as data associated therewith. For instance, in the context of data charting and filtering in accordance herewith, the various software applications may include the following modules: a
data manager 732 that controls the retrieving, storing, and optionally pre-processing of data in preparation for display and/or filtering; a charting module 734 that creates screen objects (e.g., axes, labels, and data objects, such as points, bars, bubbles, etc.) from the data and causes their display; a filtering module 736 that implements data filtering and data recovery procedures based on user input; a gesture module 738 in communication with the touch-screen controller 714 b that detects user input via touch-based gestures; and an event manager 740 that triggers various functions (e.g., of the charting and filtering modules) based on gestures recognized by the gesture module 738 . The charting module 734 may have read access, and the data manager 732 and/or filtering module 736 may have read and write access, to the data, which may be stored in one or more data sets: an active data set 742 reflects updates based on any filtering actions that have taken place, and a trash bin 744 stores one or more previous (and now inactive) data sets corresponding to the data sets prior to filtering or, alternatively, the portions thereof that have been filtered out. - In various embodiments, the gesture-detection functionality of
module 738 is provided natively, i.e., as part of the software originally installed on the smartphone by the smartphone vendor. Web browsers running on touch-enabled Android or iOS phones, for example, typically have built-in functionality to recognize certain touch events, such as "touchstart" (which fires when the user touches the screen), "touchend" (which fires when the finger is removed from the screen), and "touchmove" (which fires when a finger already placed on the screen is moved thereacross). Additional applications may be installed at a later time by the user, e.g., by downloading them from an application server on the Internet (using one of the wireless connections enabled on the smartphone). Thus, the user may, for instance, download a charting application that integrates the functionality of the data manager 732 , event manager 740 , and charting module 734 , allowing the user to download, view, and chart data, and optionally to interact with and navigate the data chart via gestures, e.g., to zoom in and out of portions or scroll through the chart. To facilitate gesture-based user input, the charting application may utilize the native gesture-detection functionality via a gesture API 746 . - The
filtering module 736 may be provided in the form of a separately downloadable plug-in to the charting application, and may include procedures implementing the various filtering and recovery actions as well as corresponding event listeners; upon detection of a filtering or recovery gesture, the respective event listener may call the appropriate filtering procedure. Of course, the charting and filtering functionality need not be organized in the particular modules shown in FIG. 7 , but can be provided by different, fewer, or more modules (if modularized at all), and can utilize native functionality to a greater or lesser extent. For example, the filtering application may define filtering gestures based on native gestures (e.g., as a composite gesture including a sequence of native primitive gestures within certain parameter ranges) or independently therefrom. In some embodiments, the filtering application includes a customization module that allows the user to specify gesture parameters (e.g., the hold period used to select an object) and/or define her own gestures. - Furthermore, although a specific system for data charting and filtering is described above with respect to
FIG. 7 , this is but one possible embodiment, and many variations and modifications thereof, as well as very different system embodiments, are contemplated. For example, although mobile devices and mobile applications constitute an important application scenario, the data filtering functionality described herein may also be implemented on a stationary device, such as a desktop personal computer (PC). In general, a computing device in accordance with an embodiment hereof includes one or more processors, memory, and, in communication therewith, a screen and one or more input devices; the type of components used may vary depending on the device. (For instance, a PC may rely more heavily on a general-purpose processor than on special-purpose controllers, and may utilize an x86 processor rather than an ARM processor.) In touch-screen devices, the screen may double as the input device (or one of the input devices). However, in various embodiments, filtering gestures may alternatively be performed with traditional input devices such as a mouse. - Further, the various software components that provide the charting, filtering, and/or data-recovery functionality described herein can generally be provided on any computer-readable (or “machine-readable”) medium, or multiple media, whether volatile, non-volatile, removable, or non-removable. Example computer-readable media include, but are not limited to, solid-state memories, optical media, and magnetic media. The machine-executable instructions stored on such media may generally be implemented in any suitable programming language (or multiple languages), for example and without limitation, in Objective-C, C, C++, Java, Python, PHP, Perl, Ruby, and many others known to those of skill in the art.
-
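The division of labor between event listeners and filtering procedures described above can be sketched as follows. This is a minimal illustration, not the actual module code; all class and function names are hypothetical.

```python
# Sketch of the listener/handler pattern: an event listener bound to a
# screen object detects a defined filtering gesture and dispatches an
# event message (the object's data context plus any gesture parameters)
# to the appropriate filtering procedure.  Names are illustrative.

class ScreenObject:
    """A chart element (data point or axis element) with bound listeners."""

    def __init__(self, context):
        self.context = context      # e.g. {"row": 12, "column": 4}
        self._handlers = {}         # gesture name -> filtering procedure

    def bind(self, gesture, handler):
        # Bind an event listener for a defined filtering gesture.
        self._handlers[gesture] = handler

    def dispatch(self, gesture, **params):
        # Forward the event message to the filtering procedure, if any.
        handler = self._handlers.get(gesture)
        return handler(self.context, **params) if handler else None


def data_point_filter(context, **params):
    # Stand-in filtering procedure: report which data point to delete.
    return ("delete-point", context)


bar = ScreenObject({"row": 12, "column": 4})
bar.bind("hold-and-swipe", data_point_filter)
event = bar.dispatch("hold-and-swipe")  # -> ("delete-point", {"row": 12, "column": 4})
```

A gesture with no bound handler simply dispatches nothing, which mirrors the behavior of screen regions that have no filtering semantics.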
FIGS. 8A and 8B provide an overview of methods for data charting and filtering in accordance with various embodiments; actions executed by the user, which provide input to the system, are illustrated with oblique parallelograms. The process generally begins when the user requests the data subsequently to be charted, e.g., by downloading it from a web site, accessing it from a file, importing it from another application running on the user's mobile (or other) device, or in any other manner (800). Upon receipt, the charting application (e.g., its data manager 732) stores the data in memory (802). In some embodiments, the data is pre-processed to prepare it for charting and/or filtering (804), as explained in more detail for one example implementation below. Once the user requests the data to be charted (e.g., by clicking on a charting icon) (806), screen objects corresponding to each data point and each axis element are created (808) and displayed on the screen to form the data chart (810). As understood herein, an “axis element” is generally any portion of a chart axis and its associated labels that can be selected by a user for filtering actions in accordance herewith. Axis elements include, e.g., a value axis or a value range (corresponding to a portion of the value axis), an individual category label, or a row of labels of one category type. For example, FIG. 3 illustrates three category-label rows corresponding to three types of categories: product type, year, and country; each of these rows may constitute an axis element. Further, within the year-label row, there are two category labels (although each is displayed multiple times), one for 2012 and one for 2013, and each of these two labels may constitute a separate axis element. - As mentioned above, event listeners may be bound to each of the screen objects (i.e., to each data point and each axis element).
Then, when a user performs a defined filtering gesture on one of the screen objects (812), the event listener associated with that object detects this gesture (814) and dispatches an event message to the corresponding filtering procedure (which acts as an event handler) (816). The filtering procedure thereafter accomplishes the desired filtering action on the
active data set 742, specifically on the portion of the active data that is associated with the selected screen object (which portion usually contains, in the case of an axis element, multiple data points) (818). For example, if the filtering gesture was performed on an individual data point, that point may be deleted; if the filtering gesture was performed on a category-label row, data is aggregated (i.e., summed or averaged) across categories associated with that label; and if the gesture was performed on a value range, data falling within that range is deleted. Following the filtering, the displayed data chart is re-rendered based on the updated active data set 742 (820). - In addition to updating the active data set 742 (i.e., the one being displayed), the filtering procedures may also cause the old data, i.e., the data set as it existed prior to filtering, or the filtered-out portion of the data set (herein referred to as “inactive”), to be stored (822). Multiple filtering actions may, thus, result in multiple stored inactive sets. Collectively, the inactive data sets are herein called the trash bin. If the user activates a recovery mode (824), e.g., by tapping a suitable symbol (as depicted, e.g., in
FIG. 6A ), an event listener associated with the displayed data chart may listen for a recovery gesture (which may be defined, e.g., as a hold-and-swipe gesture to the left). Upon performance of the recovery gesture by the user (828) and detection of the gesture by the event listener (830), the inactive data set may be recovered (832). For example, if the inactive data set includes the complete data set prior to a particular filtering action, it may simply be used to replace the existing active data set, and deleted from the trash bin. Alternatively, if the inactive data set includes only the filtered-out portion of the data, such data may be recombined with the active data set to reverse the previous filtering action. Following the reversal of a previous filtering action and recovery of the old data, the updated active data set may again be charted (834). - A particular implementation example is now described in more detail. Underlying this example is the assumption that the data is initially provided in the form of a table as depicted in
FIG. 9A , where each row (except for the title row) corresponds to a different data point and each column corresponds to a type of category or a value axis. For instance, a table with four columns, as shown, may contain data that is categorized in three dimensions, with a fourth column listing the value for the respective combination of categories. FIG. 9B further illustrates this data structure for a concrete data set with three category dimensions (corresponding to country, year, and product) and a value dimension specifying the revenue. - In some embodiments, the data is processed (804) to create a “context” for each data point, value axis, and category-label row to be displayed. As shown in
FIGS. 9C and 9D , the processing may result in two tables: a first table storing all individual data points as before and adding a column that specifies the context, i.e., a unique identifier, for each point (FIG. 9C ), and a second table storing, as respective category contexts, the names of all category types (i.e., one for each label row) as well as the value dimension (FIG. 9D ) (which is herein also considered a type of category). These contexts are then bound to the screen objects as shown in FIG. 9E , i.e., data-point contexts are bound to the elements of the graph (e.g., an individual bar) and category contexts are bound to the axes. - Furthermore, the filtering functions contemplated herein, which are triggered by well-defined gesture-based events, are bound to the screen objects. In some embodiments, four types of events are defined: data-point filter events, value-axis filter events, category-axis filter events, and data-recovery events. Each type of event is handled by a dedicated event handler, e.g., a filtering procedure specific to that event. The events and associated event handlers may be bound to the rendering procedures (which may be part of the charting module 734) for the respective screen objects, which are illustrated in
FIG. 10 . Then, whenever the user executes a defined filtering gesture on an appropriate screen object, the corresponding filter event is dispatched to the event handler, along with suitable parameters. These parameters include, for data-point, category, and value-axis filter events, the context bound to the screen object to which the gesture was applied (or a location of such context within the data table). For value-axis filter events, two additional parameters are typically provided to signify the minimum and maximum values of the selected value range. In trash-bin view, a data-recovery event may be dispatched without the need for specifying any parameters. - With reference to
FIGS. 9C and 11 , when the user “swipes out” a data point (e.g., by performing a hold-and-swipe gesture as defined above), such as, for instance, the data point with context 11 (see FIG. 9C ), the event listener associated with the rendering procedure for that data point may dispatch a data-point filter event message with the data-context parameter {row:12,column:4}. The data-point filtering procedure then filters the data set by this data context, resulting in the updated data set shown in FIG. 11 , from which previous row 12 is missing. The old data set may be pushed into the trash bin, and the chart may be updated based on the new data set. - With reference to
FIGS. 9C , 12A and 12B, when the user swipes out a category, such as, for instance, the year category (see FIG. 9C ), the event listener associated with the rendering procedure for that category may dispatch a category filter event message with the parameter {category:‘year’}. The category filtering procedure then aggregates the data across years, resulting in the updated data set shown in FIG. 12A , which now has only three category/value columns, and the updated category axis shown in FIG. 12B , which now includes only two category-label rows. - Data-range filtering may involve multiple sub-events triggered by the different parts of the composite filtering gesture. For example, when the event listener associated with the value axis detects a long-press event on the value axis (e.g., of 500 ms or more), it may dispatch an event message to a value-range procedure that causes display of a value-range editor (e.g., as shown in
FIG. 4B ) on the screen. The value-range editor itself may have an associated event listener that listens for range-adjustment events. The value-range editor may, for example, take the form of a horizontal band overlaid on the chart, which may be shifted up or down, or adjusted in width, with two fingers. The selected value range may initially be defined in terms of the pixels on-screen. For example, as illustrated in FIG. 13A , the pixel coordinates, along a vertical value axis, of the axis origin (i.e., intersection with the category axis), lower boundary of the selected value range, upper boundary of the selected value range, and upper boundary of the value axis, respectively, may be (0,0), (0,y0), (0,y1), and (0,ym). Denoting the values associated with the origin and upper axis boundaries with L and U, respectively, the lower and upper values l, u associated with the selected value range can be calculated from the pixel coordinates as follows: -
l=L+(U−L)·y0/ym -
u=L+(U−L)·y1/ym - Upon detection of a swipe-out gesture on the selected value range, a value-axis filter event message including the value-axis context {category:‘Revenue’} and the computed boundaries of the value range, l and u, as parameters may be dispatched. Based on these parameters, the data-point contexts for data points falling within the selected value range may be determined, and the data set may be updated based thereon. The resulting data set is shown in
FIG. 13B . - In some embodiments, as mentioned above, filtered-out data can be recovered from a trash bin, i.e., a repository of the previous data sets or filtered-out portions thereof. For this purpose, an event listener that listens for a recovery gesture (e.g., a “swipe-back” gesture corresponding to a hold-and-swipe gesture in the direction opposite that used for filtering) may be used. When such a recovery gesture is performed, e.g., anywhere on the charting canvas, an event message restoring the data set prior to the filtering action may be dispatched.
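The trash-bin repository backing this recovery step can be sketched as a bounded stack; the class name below is an illustrative assumption, and the capacity of nine sets follows the example embodiment.

```python
from collections import deque

# Sketch of a stack-based trash bin: each filtering action pushes the
# pre-filter data set, and a recovery gesture pops the most recent one.
# When the capacity (nine sets here) is exceeded, the oldest set is
# discarded.  Names are illustrative.
class TrashBin:
    def __init__(self, capacity=9):
        # A deque with maxlen silently evicts the oldest item on overflow.
        self._stack = deque(maxlen=capacity)

    def push(self, data_set):
        # Record an inactive data set after a filtering action.
        self._stack.append(data_set)

    def pop(self):
        # Recover the most recently stored set, or None if the bin is empty.
        return self._stack.pop() if self._stack else None
```

Using `deque(maxlen=…)` means that appending a tenth set drops the first-stored one automatically, matching the overflow behavior described for the example stack.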
FIGS. 14A-14D illustrate the data structure of the trash bin in accordance with one example embodiment. Herein, the trash bin takes the form of a stack to which a new data set can only be added from the top (corresponding to a “push” action, seeFIG. 14A ) and from which only the top data set at any given time can be removed (corresponding to a “pop” action). Thus, as successive filtering actions are performed, the initially empty data stack is filled one by one, as shown inFIG. 14B . Conversely, as successive recovery actions are performed, data sets are removed one by one, as shown inFIG. 14C , until the stack is empty again. The capacity of the stack may be limited, e.g., to a maximum of nine data sets. If, in this case, a tenth data set (set number 9) is pushed into the trash bin, the first-stored data set (set number 0) may be deleted to make room for the recording of the most recent set; this is illustrated inFIG. 14D . Of course, different data structures may be implemented to provide functionality to reverse filtering actions. - While various specific embodiments are described herein, these embodiments are intended to be illustrative only, rather than limiting. For example, different types of gestures than described herein may be employed for on-chart filtering in accordance herewith, and implementation details of systems providing the filtering functionality described herein may vary. It will be appreciated that many variations, modifications, and additions are possible without departing from the scope of embodiments of the present disclosure.
Claims (20)
1. A method comprising, using a computer:
detecting a filtering gesture performed on an axis element of a data chart displayed on a screen;
in response to the filtering gesture, filtering data associated with the axis element; and
updating the displayed data chart based on the filtering.
2. The method of claim 1 , wherein the axis element comprises a category-label row.
3. The method of claim 2 , wherein the filtering comprises aggregating the data across categories associated with the category-label row.
4. The method of claim 1 , wherein the axis element comprises a value range.
5. The method of claim 4 , wherein the filtering comprises deleting data associated with the value range.
6. The method of claim 4 , wherein the filtering gesture is a composite gesture comprising a first gesture part selecting a value axis, a second gesture part defining the value range, and a third gesture part performed on the defined value range.
7. The method of claim 1 , wherein the filtering gesture comprises a touch-based gesture.
8. The method of claim 1 , wherein the filtering gesture comprises a hold-and-swipe gesture.
9. The method of claim 1 , further comprising storing the filtered-out data.
10. The method of claim 9 , further comprising, upon detection of a recovery gesture performed on the chart, reversing the filtering by restoring the filtered-out data.
11. The method of claim 1 , wherein the data associated with the axis element comprises a plurality of data points.
12. A system comprising:
a screen displaying a data chart;
a processor; and
memory storing (i) a data set corresponding to the data chart displayed on the screen and (ii) application logic comprising processor-executable instructions which, when executed by the processor, cause the processor to
detect a filtering gesture performed on an axis element of the data chart displayed on the screen; and
in response to the filtering gesture, filter the data set associated with the axis element and update the displayed data chart based on the filtering.
13. The system of claim 12 , wherein the screen is a touch screen and the filtering gesture comprises a touch-based gesture.
14. The system of claim 12 , wherein the filtering gesture comprises a hold-and-swipe gesture.
15. The system of claim 12 , wherein the instructions comprise an event listener bound to the axis element.
16. The system of claim 12 , wherein the memory further stores at least one inactive data set comprising filtered-out data.
17. The system of claim 16 , wherein the instructions further comprise instructions which, when executed by the processor, cause the processor to restore one of the at least one inactive data sets in response to a recovery gesture performed on the data chart, thereby reversing the filtering.
18. The system of claim 12 , wherein the axis element comprises a category-label row, and the application logic causes the processor to filter the data set by aggregating the data across categories associated with the category-label row.
19. The system of claim 12 , wherein the axis element comprises a value range, and the application logic causes the processor to filter the data set by deleting data associated with the value range.
20. A non-transitory computer-readable medium storing processor-executable instructions which, when executed by a processor, cause the processor to:
detect a filtering gesture performed on an axis element of a data chart displayed on a screen; and
in response to the filtering gesture, filter a data set associated with the axis element and update the displayed data chart based on the filtering.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/466,095 US10095389B2 (en) | 2014-08-22 | 2014-08-22 | Gesture-based on-chart data filtering |
EP14004374.6A EP2990924B1 (en) | 2014-08-22 | 2014-12-22 | Gesture-based on-chart data filtering |
CN201410850025.0A CN105373522B (en) | 2014-08-22 | 2014-12-31 | Gesture-based on-chart data screening |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/466,095 US10095389B2 (en) | 2014-08-22 | 2014-08-22 | Gesture-based on-chart data filtering |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160055232A1 true US20160055232A1 (en) | 2016-02-25 |
US10095389B2 US10095389B2 (en) | 2018-10-09 |
Family
ID=52272808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/466,095 Active 2035-10-21 US10095389B2 (en) | 2014-08-22 | 2014-08-22 | Gesture-based on-chart data filtering |
Country Status (3)
Country | Link |
---|---|
US (1) | US10095389B2 (en) |
EP (1) | EP2990924B1 (en) |
CN (1) | CN105373522B (en) |
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160103581A1 (en) * | 2014-10-10 | 2016-04-14 | Salesforce.Com, Inc. | Chart selection tooltip |
US20160154575A1 (en) * | 2014-12-02 | 2016-06-02 | Yingyu Xie | Gesture-Based Visualization of Data Grid on Mobile Device |
US20160274750A1 (en) * | 2014-09-08 | 2016-09-22 | Tableau Software, Inc. | Animated Transition between Data Visualization Versions at Different Levels of Detail |
US20160343154A1 (en) * | 2014-09-08 | 2016-11-24 | Tableau Software, Inc. | Interactive Data Visualization User Interface with Hierarchical Filtering Based on Gesture Location |
US20170010792A1 (en) * | 2014-09-08 | 2017-01-12 | Tableau Software Inc. | Methods and Devices for Adjusting Chart Magnification Asymmetrically |
USD777749S1 (en) * | 2015-04-06 | 2017-01-31 | Domo, Inc. | Display screen or portion thereof with a graphical user interface for analytics |
US20170236314A1 (en) * | 2016-02-12 | 2017-08-17 | Microsoft Technology Licensing, Llc | Tagging utilizations for selectively preserving chart elements during visualization optimizations |
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US10001898B1 (en) | 2011-07-12 | 2018-06-19 | Domo, Inc. | Automated provisioning of relational information for a summary data visualization |
CN108268201A (en) * | 2018-01-23 | 2018-07-10 | 国网甘肃省电力公司 | A kind of electric power data chart processing method based on dragging re-computation |
US20190050116A1 (en) * | 2017-08-11 | 2019-02-14 | Salesforce.Com, Inc. | Multi-selection on a chart |
US10311608B2 (en) | 2016-12-08 | 2019-06-04 | Microsoft Technology Licensing, Llc | Custom multi axis chart visualization |
US10347017B2 (en) * | 2016-02-12 | 2019-07-09 | Microsoft Technology Licensing, Llc | Interactive controls that are collapsible and expandable and sequences for chart visualization optimizations |
US10380771B2 (en) * | 2017-05-16 | 2019-08-13 | Sap Se | Data insights for visualizations based on dimensions |
US10380770B2 (en) * | 2014-09-08 | 2019-08-13 | Tableau Software, Inc. | Interactive data visualization user interface with multiple interaction profiles |
US10474352B1 (en) | 2011-07-12 | 2019-11-12 | Domo, Inc. | Dynamic expansion of data visualizations |
US10635262B2 (en) | 2014-09-08 | 2020-04-28 | Tableau Software, Inc. | Interactive data visualization user interface with gesture-based data field selection |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US10685462B2 (en) * | 2017-01-13 | 2020-06-16 | International Business Machines Corporation | Automatic data extraction from a digital image |
US10896532B2 (en) | 2015-09-08 | 2021-01-19 | Tableau Software, Inc. | Interactive data visualization user interface with multiple interaction profiles |
US10936186B2 (en) * | 2018-06-28 | 2021-03-02 | Sap Se | Gestures used in a user interface for navigating analytic data |
US20210150375A1 (en) * | 2019-11-20 | 2021-05-20 | Business Objects Software Ltd. | Analytic Insights For Hierarchies |
US11069101B2 (en) * | 2017-06-23 | 2021-07-20 | Casio Computer Co., Ltd. | Data processing method and data processing device |
US20210349594A1 (en) * | 2019-01-24 | 2021-11-11 | Vivo Mobile Communication Co., Ltd. | Content deleting method, terminal, and computer readable storage medium |
US11237635B2 (en) | 2017-04-26 | 2022-02-01 | Cognixion | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
US11275792B2 (en) | 2019-11-01 | 2022-03-15 | Business Objects Software Ltd | Traversing hierarchical dimensions for dimension-based visual elements |
US11402909B2 (en) | 2017-04-26 | 2022-08-02 | Cognixion | Brain computer interface for augmented reality |
US11449211B2 (en) * | 2017-09-21 | 2022-09-20 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for data loading |
US11526526B2 (en) * | 2019-11-01 | 2022-12-13 | Sap Se | Generating dimension-based visual elements |
US20230177751A1 (en) * | 2021-12-03 | 2023-06-08 | Adaptam Inc. | Method and system for improved visualization of charts in spreadsheets |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107168961B (en) * | 2016-03-07 | 2020-06-26 | 阿里巴巴集团控股有限公司 | Data display method and device for chart |
GB2556068A (en) * | 2016-11-16 | 2018-05-23 | Chartify It Ltd | Data interation device |
CN109614172B (en) * | 2017-09-30 | 2021-11-30 | 北京国双科技有限公司 | Data screening method and related device |
CN109308204B (en) * | 2018-08-02 | 2021-09-10 | 温军华 | Chart generation method and device for responding to window clicking |
CN109697625A (en) * | 2018-08-02 | 2019-04-30 | 吴波 | A kind of data processing method and device |
CN111259637A (en) * | 2020-01-13 | 2020-06-09 | 北京字节跳动网络技术有限公司 | Data processing method, data processing device, computer equipment and storage medium |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040002983A1 (en) * | 2002-06-26 | 2004-01-01 | Hitoshi Ashida | Method and system for detecting tables to be modified |
US20060031187A1 (en) * | 2004-08-04 | 2006-02-09 | Advizor Solutions, Inc. | Systems and methods for enterprise-wide visualization of multi-dimensional data |
US7139766B2 (en) * | 2001-12-17 | 2006-11-21 | Business Objects, S.A. | Universal drill-down system for coordinated presentation of items in different databases |
US20070136406A1 (en) * | 2005-12-12 | 2007-06-14 | Softky William R | Method and system for numerical computation visualization |
US20080071748A1 (en) * | 2006-09-18 | 2008-03-20 | Infobright Inc. | Method and system for storing, organizing and processing data in a relational database |
US7369127B1 (en) * | 2004-05-06 | 2008-05-06 | The Mathworks, Inc. | Dynamic control of graphic representations of data |
US7922098B1 (en) * | 2005-03-09 | 2011-04-12 | Diebold, Incorporated | Banking system controlled responsive to data bearing records |
US20110115814A1 (en) * | 2009-11-16 | 2011-05-19 | Microsoft Corporation | Gesture-controlled data visualization |
US20120210243A1 (en) * | 2011-02-11 | 2012-08-16 | Gavin Andrew Ross Uhma | Web co-navigation |
US8793701B2 (en) * | 2009-05-26 | 2014-07-29 | Business Objects Software Limited | Method and system for data reporting and analysis |
US8957915B1 (en) * | 2012-06-14 | 2015-02-17 | Cinemagram Inc. | Method, apparatus and system for dynamic images |
US9513792B2 (en) * | 2012-10-10 | 2016-12-06 | Sap Se | Input gesture chart scaling |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7788606B2 (en) | 2004-06-14 | 2010-08-31 | Sas Institute Inc. | Computer-implemented system and method for defining graphics primitives |
US8368699B2 (en) * | 2009-02-25 | 2013-02-05 | Mellmo Inc. | Displaying bar charts with a fish-eye distortion effect |
US20110283242A1 (en) | 2010-05-14 | 2011-11-17 | Sap Ag | Report or application screen searching |
US8996978B2 (en) | 2010-05-14 | 2015-03-31 | Sap Se | Methods and systems for performing analytical procedures by interactions with visual representations of datasets |
US8990732B2 (en) | 2010-05-14 | 2015-03-24 | Sap Se | Value interval selection on multi-touch devices |
US8457353B2 (en) | 2010-05-18 | 2013-06-04 | Microsoft Corporation | Gestures and gesture modifiers for manipulating a user-interface |
US8583664B2 (en) | 2010-05-26 | 2013-11-12 | Microsoft Corporation | Exposing metadata relationships through filter interplay |
US8683332B2 (en) | 2010-09-20 | 2014-03-25 | Business Objects Software Limited | Distance filtering gesture touchscreen |
US9747270B2 (en) | 2011-01-07 | 2017-08-29 | Microsoft Technology Licensing, Llc | Natural input for spreadsheet actions |
US9003298B2 (en) | 2010-09-21 | 2015-04-07 | Microsoft Corporation | Web page application controls |
CN102568403B (en) * | 2010-12-24 | 2014-06-04 | 联想(北京)有限公司 | Electronic instrument and object deletion method thereof |
US8994732B2 (en) * | 2011-03-07 | 2015-03-31 | Microsoft Corporation | Integration of sketch-based interaction and computer data analysis |
US20130027401A1 (en) * | 2011-07-27 | 2013-01-31 | Godfrey Hobbs | Augmented report viewing |
US10198485B2 (en) | 2011-10-13 | 2019-02-05 | Microsoft Technology Licensing, Llc | Authoring of data visualizations and maps |
US20130239012A1 (en) | 2012-03-12 | 2013-09-12 | Sap Portals Israel Ltd | Common denominator filter for enterprise portal pages |
US8527909B1 (en) | 2012-05-29 | 2013-09-03 | Sencha, Inc. | Manipulating data visualizations on a touch screen |
US9164972B2 (en) | 2012-06-07 | 2015-10-20 | Microsoft Technology Licensing, Llc | Managing objects in panorama display to navigate spreadsheet |
US20140078134A1 (en) | 2012-09-18 | 2014-03-20 | Ixonos Oyj | Method for determining three-dimensional visual effect on information element using apparatus with touch sensitive display |
US20140089860A1 (en) | 2012-09-24 | 2014-03-27 | Sap Ag | Direct manipulation of data displayed on screen |
CN103345359B (en) * | 2013-06-03 | 2016-05-11 | 珠海金山办公软件有限公司 | A kind of gesture is switched the mthods, systems and devices of chart ranks |
CN103744576B (en) * | 2013-11-08 | 2016-12-07 | 维沃移动通信有限公司 | A kind of method and system at the operation interface for realizing mobile terminal |
-
2014
- 2014-08-22 US US14/466,095 patent/US10095389B2/en active Active
- 2014-12-22 EP EP14004374.6A patent/EP2990924B1/en active Active
- 2014-12-31 CN CN201410850025.0A patent/CN105373522B/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7139766B2 (en) * | 2001-12-17 | 2006-11-21 | Business Objects, S.A. | Universal drill-down system for coordinated presentation of items in different databases |
US20040002983A1 (en) * | 2002-06-26 | 2004-01-01 | Hitoshi Ashida | Method and system for detecting tables to be modified |
US7369127B1 (en) * | 2004-05-06 | 2008-05-06 | The Mathworks, Inc. | Dynamic control of graphic representations of data |
US20060031187A1 (en) * | 2004-08-04 | 2006-02-09 | Advizor Solutions, Inc. | Systems and methods for enterprise-wide visualization of multi-dimensional data |
US7922098B1 (en) * | 2005-03-09 | 2011-04-12 | Diebold, Incorporated | Banking system controlled responsive to data bearing records |
US20070136406A1 (en) * | 2005-12-12 | 2007-06-14 | Softky William R | Method and system for numerical computation visualization |
US20080071748A1 (en) * | 2006-09-18 | 2008-03-20 | Infobright Inc. | Method and system for storing, organizing and processing data in a relational database |
US8793701B2 (en) * | 2009-05-26 | 2014-07-29 | Business Objects Software Limited | Method and system for data reporting and analysis |
US20110115814A1 (en) * | 2009-11-16 | 2011-05-19 | Microsoft Corporation | Gesture-controlled data visualization |
US20120210243A1 (en) * | 2011-02-11 | 2012-08-16 | Gavin Andrew Ross Uhma | Web co-navigation |
US8957915B1 (en) * | 2012-06-14 | 2015-02-17 | Cinemagram Inc. | Method, apparatus and system for dynamic images |
US9513792B2 (en) * | 2012-10-10 | 2016-12-06 | Sap Se | Input gesture chart scaling |
Cited By (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9792017B1 (en) | 2011-07-12 | 2017-10-17 | Domo, Inc. | Automatic creation of drill paths |
US10474352B1 (en) | 2011-07-12 | 2019-11-12 | Domo, Inc. | Dynamic expansion of data visualizations |
US10726624B2 (en) | 2011-07-12 | 2020-07-28 | Domo, Inc. | Automatic creation of drill paths |
US10001898B1 (en) | 2011-07-12 | 2018-06-19 | Domo, Inc. | Automated provisioning of relational information for a summary data visualization |
US9857952B2 (en) | 2014-09-08 | 2018-01-02 | Tableau Software, Inc. | Methods and devices for adjusting chart magnification |
US20160274750A1 (en) * | 2014-09-08 | 2016-09-22 | Tableau Software, Inc. | Animated Transition between Data Visualization Versions at Different Levels of Detail |
US20170010776A1 (en) * | 2014-09-08 | 2017-01-12 | Tableau Software Inc. | Methods and Devices for Adjusting Chart Filters |
US11126327B2 (en) | 2014-09-08 | 2021-09-21 | Tableau Software, Inc. | Interactive data visualization user interface with gesture-based data field selection |
US11017569B2 (en) * | 2014-09-08 | 2021-05-25 | Tableau Software, Inc. | Methods and devices for displaying data mark information |
US20170010792A1 (en) * | 2014-09-08 | 2017-01-12 | Tableau Software Inc. | Methods and Devices for Adjusting Chart Magnification Asymmetrically |
US10380770B2 (en) * | 2014-09-08 | 2019-08-13 | Tableau Software, Inc. | Interactive data visualization user interface with multiple interaction profiles |
US11720230B2 (en) | 2014-09-08 | 2023-08-08 | Tableau Software, Inc. | Interactive data visualization user interface with hierarchical filtering based on gesture location on a chart |
US20160343154A1 (en) * | 2014-09-08 | 2016-11-24 | Tableau Software, Inc. | Interactive Data Visualization User Interface with Hierarchical Filtering Based on Gesture Location |
US20170010785A1 (en) * | 2014-09-08 | 2017-01-12 | Tableau Software Inc. | Methods and devices for displaying data mark information |
US10521092B2 (en) * | 2014-09-08 | 2019-12-31 | Tableau Software, Inc. | Methods and devices for adjusting chart magnification asymmetrically |
US10706597B2 (en) * | 2014-09-08 | 2020-07-07 | Tableau Software, Inc. | Methods and devices for adjusting chart filters |
US10635262B2 (en) | 2014-09-08 | 2020-04-28 | Tableau Software, Inc. | Interactive data visualization user interface with gesture-based data field selection |
US10347027B2 (en) * | 2014-09-08 | 2019-07-09 | Tableau Software, Inc. | Animated transition between data visualization versions at different levels of detail |
US10347018B2 (en) * | 2014-09-08 | 2019-07-09 | Tableau Software, Inc. | Interactive data visualization user interface with hierarchical filtering based on gesture location on a chart |
US10120544B2 (en) * | 2014-10-10 | 2018-11-06 | Salesforce.Com, Inc. | Chart selection tooltip |
US20160103581A1 (en) * | 2014-10-10 | 2016-04-14 | Salesforce.Com, Inc. | Chart selection tooltip |
US9904456B2 (en) * | 2014-12-02 | 2018-02-27 | Business Objects Software Ltd. | Gesture-based visualization of data grid on mobile device |
US20160154575A1 (en) * | 2014-12-02 | 2016-06-02 | Yingyu Xie | Gesture-Based Visualization of Data Grid on Mobile Device |
USD777749S1 (en) * | 2015-04-06 | 2017-01-31 | Domo, Inc. | Display screen or portion thereof with a graphical user interface for analytics |
US10896532B2 (en) | 2015-09-08 | 2021-01-19 | Tableau Software, Inc. | Interactive data visualization user interface with multiple interaction profiles |
US10748312B2 (en) * | 2016-02-12 | 2020-08-18 | Microsoft Technology Licensing, Llc | Tagging utilizations for selectively preserving chart elements during visualization optimizations |
US20170236314A1 (en) * | 2016-02-12 | 2017-08-17 | Microsoft Technology Licensing, Llc | Tagging utilizations for selectively preserving chart elements during visualization optimizations |
US10347017B2 (en) * | 2016-02-12 | 2019-07-09 | Microsoft Technology Licensing, Llc | Interactive controls that are collapsible and expandable and sequences for chart visualization optimizations |
US11232655B2 (en) | 2016-09-13 | 2022-01-25 | Iocurrents, Inc. | System and method for interfacing with a vehicular controller area network |
US10650621B1 (en) | 2016-09-13 | 2020-05-12 | Iocurrents, Inc. | Interfacing with a vehicular controller area network |
US10311608B2 (en) | 2016-12-08 | 2019-06-04 | Microsoft Technology Licensing, Llc | Custom multi axis chart visualization |
US10685462B2 (en) * | 2017-01-13 | 2020-06-16 | International Business Machines Corporation | Automatic data extraction from a digital image |
US11561616B2 (en) | 2017-04-26 | 2023-01-24 | Cognixion Corporation | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
US11762467B2 (en) | 2017-04-26 | 2023-09-19 | Cognixion Corporation | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
US11237635B2 (en) | 2017-04-26 | 2022-02-01 | Cognixion | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
US11402909B2 (en) | 2017-04-26 | 2022-08-02 | Cognixion | Brain computer interface for augmented reality |
US10380771B2 (en) * | 2017-05-16 | 2019-08-13 | Sap Se | Data insights for visualizations based on dimensions |
US11069101B2 (en) * | 2017-06-23 | 2021-07-20 | Casio Computer Co., Ltd. | Data processing method and data processing device |
US10802673B2 (en) * | 2017-08-11 | 2020-10-13 | Salesforce.Com, Inc. | Multi-selection on a chart |
US20190050116A1 (en) * | 2017-08-11 | 2019-02-14 | Salesforce.Com, Inc. | Multi-selection on a chart |
US11449211B2 (en) * | 2017-09-21 | 2022-09-20 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for data loading |
CN108268201A (en) * | 2018-01-23 | 2018-07-10 | 国网甘肃省电力公司 | A kind of electric power data chart processing method based on dragging re-computation |
US10936186B2 (en) * | 2018-06-28 | 2021-03-02 | Sap Se | Gestures used in a user interface for navigating analytic data |
US20210349594A1 (en) * | 2019-01-24 | 2021-11-11 | Vivo Mobile Communication Co., Ltd. | Content deleting method, terminal, and computer readable storage medium |
US11579767B2 (en) * | 2019-01-24 | 2023-02-14 | Vivo Mobile Communication Co., Ltd. | Content deleting method, terminal, and computer readable storage medium |
US11526526B2 (en) * | 2019-11-01 | 2022-12-13 | Sap Se | Generating dimension-based visual elements |
US11275792B2 (en) | 2019-11-01 | 2022-03-15 | Business Objects Software Ltd | Traversing hierarchical dimensions for dimension-based visual elements |
US20210150375A1 (en) * | 2019-11-20 | 2021-05-20 | Business Objects Software Ltd. | Analytic Insights For Hierarchies |
US11803761B2 (en) * | 2019-11-20 | 2023-10-31 | Business Objects Software Ltd | Analytic insights for hierarchies |
US20230177751A1 (en) * | 2021-12-03 | 2023-06-08 | Adaptam Inc. | Method and system for improved visualization of charts in spreadsheets |
Also Published As
Publication number | Publication date |
---|---|
CN105373522A (en) | 2016-03-02 |
EP2990924A1 (en) | 2016-03-02 |
EP2990924B1 (en) | 2020-11-11 |
US10095389B2 (en) | 2018-10-09 |
CN105373522B (en) | 2022-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10095389B2 (en) | | Gesture-based on-chart data filtering |
US10175854B2 (en) | | Interaction in chain visualization |
US10261660B2 (en) | | Orbit visualization animation |
US10775971B2 (en) | | Pinch gestures in a tile-based user interface |
US11010032B2 (en) | | Navigating a hierarchical data set |
US9665263B2 (en) | | Snap navigation of a scrollable list |
US20150370462A1 (en) | | Creating calendar event from timeline |
US20130111380A1 (en) | | Digital whiteboard implementation |
US10838607B2 (en) | | Managing objects in panorama display to navigate spreadsheet |
KR20130141378A (en) | | Organizing graphical representations on computing devices |
US10564836B2 (en) | | Dynamic moveable interface elements on a touch screen device |
US20170212660A1 (en) | | Consolidated orthogonal guide creation |
US20140040797A1 (en) | | Widget processing method and apparatus, and mobile terminal |
US10936186B2 (en) | | Gestures used in a user interface for navigating analytic data |
US20160231876A1 (en) | | Graphical interaction in a touch screen user interface |
WO2016045500A1 (en) | | Method, apparatus and system for selecting target object in target library |
US20140351745A1 (en) | | Content navigation having a selection function and visual indicator thereof |
CN103699381A (en) | | Method and system for setting Widget based on Firefox OS (Operating System) platform |
US10656788B1 (en) | | Dynamic document updating application interface and corresponding control functions |
US20140123066A1 (en) | | User interface controls for specifying interval data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: BUSINESS OBJECTS SOFTWARE LTD., IRELAND; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YANG, RUI;XIE, YINGYU;ZHANG, ZIMO;AND OTHERS;REEL/FRAME:033590/0510; Effective date: 20140821 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY; Year of fee payment: 4 |