WO2017054004A1 - Systems and methods for data visualization using three-dimensional displays - Google Patents


Info

Publication number
WO2017054004A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
visualization
objects
attributes
virtual space
Prior art date
Application number
PCT/US2016/053842
Other languages
English (en)
French (fr)
Other versions
WO2017054004A8 (en)
Inventor
Stanislav G. DJORGOVSKI
Ciro DONALEK
Scott DAVIDOFF
Original Assignee
California Institute Of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by California Institute Of Technology
Priority to JP2018502734A (published as JP2018533099A)
Priority to EP16849900.2A (published as EP3353751A4)
Publication of WO2017054004A1
Publication of WO2017054004A8

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F18/00 Pattern recognition
                    • G06F18/20 Analysing
                        • G06F18/24 Classification techniques
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T7/00 Image analysis
                    • G06T7/50 Depth or shape recovery
                        • G06T7/55 Depth or shape recovery from multiple images
                • G06T11/00 2D [Two Dimensional] image generation
                    • G06T11/20 Drawing from basic elements, e.g. lines or circles
                        • G06T11/206 Drawing of charts or graphs
                • G06T13/00 Animation
                    • G06T13/20 3D [Three Dimensional] animation
                • G06T15/00 3D [Three Dimensional] image rendering
                    • G06T15/10 Geometric effects
                        • G06T15/20 Perspective computation
                            • G06T15/205 Image-based rendering
                        • G06T15/40 Hidden part removal
                    • G06T15/50 Lighting effects
                        • G06T15/506 Illumination models
                • G06T19/00 Manipulating 3D models or images for computer graphics
                    • G06T19/003 Navigation within 3D models or images
                    • G06T19/006 Mixed reality
                • G06T2200/00 Indexing scheme for image data processing or generation, in general
                    • G06T2200/04 involving 3D image data
                    • G06T2200/24 involving graphical user interfaces [GUIs]
                • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
                    • G06T2219/20 Indexing scheme for editing of 3D models
                        • G06T2219/2004 Aligning objects, relative positioning of parts
                        • G06T2219/2016 Rotation, translation, scaling
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N23/56 provided with illuminating means

Definitions

  • the present invention relates generally to data visualization and more specifically to the visualization of complex, multidimensional data using three- dimensional display technologies including (but not limited to) virtual reality (VR), mixed reality (MR), and augmented reality (AR) displays.
  • Data visualization commonly refers to techniques utilized to communicate data or information by encoding it as visual objects that can be displayed via a computer. Visualization is an essential component of any data analysis and/or data mining process. In many instances, a graphical representation of the geometry and topology of a data distribution can enable selection of appropriate analysis tools revealing further insights, and the interpretation of the results. In the era of "big data", the key bottleneck in the extraction of actionable knowledge from high dimensionality data sets is often a user's (in)ability to visualize patterns in more than 3 dimensions.
  • Computer displays typically display information in two dimensions (2D).
  • Stereoscopic 3D display technologies present images rendered from different viewpoints separately to the left and right eye; the two images are then combined in the brain to give the perception of 3D depth.
  • A number of head-mounted 3D display technologies are currently available. Paul Milgram and Fumio Kishino, in a paper entitled "A Taxonomy of Mixed Reality Visual Displays" published in IEICE Transactions on Information Systems, Vol. E77-D, proposed a taxonomy of such displays.
  • VR Virtual Reality
  • MR Mixed Reality
  • AR Augmented Reality
  • MR Mixed Reality
  • AR and MR displays can be implemented using transparent display technology and/or by capturing images of a scene and using the captured images to render displays combining the real world scene and the virtual objects.
  • AR is typically used to describe 3D display technologies that display virtual objects that provide contextually relevant information to a real world scene.
  • AR is often used to refer to an experience in which real world objects are augmented or supplemented by computer-generated sensory input.
  • MR, sometimes referred to as hybrid reality, typically involves the merging of real and virtual worlds to produce new environments and visualizations in which real and virtual objects co-exist and interact in real time.
  • AR, MR, and VR displays can all have a similar goal of immersing a user in an environment that is either partially or entirely virtual.
  • AR and MR users continue to be in touch with the real world while interacting with virtual objects around them.
  • In VR, the user is isolated from the real world while immersed in a world that is completely synthesized (although it may include virtual analogues to real world objects).
  • Presentation of data via a 3D display as a multidimensional (i.e., with the number of displayed data dimensions being 3 or greater) data visualization enables identification of meaningful structures in the data (e.g., clusters, correlations, outliers) that may contain actionable knowledge, often reside in higher dimensional spaces, and are not readily observable through visualization of the data via conventional 2D display technologies.
  • Immersive AR, MR, and VR environments naturally support collaborative data visualization and exploration, and are conducive to scientists interacting with their data alongside their colleagues in shared virtual spaces.
  • In terms of intrinsic dimensionality, a data set with 3 columns would be 3-dimensional, while a data set with 20 columns would be 20-dimensional. Either data set can be represented on a 3D display device.
  • An additional distinction is the dimensionality of the data space within which the data are being rendered or visualized.
  • Up to 3 dimensions (axes) of such a data visualization space can be spatial; additional dimensions may be encoded through the colors, transparencies, shapes and sizes of the data points.
  • More than three data dimensions can be visualized in a multidimensional data space via a 3D display device. If a data set has N dimensions, a subset of k of them may be visualized at any given time, with k ≤ N. If k > 3, up to 3 dimensions can be encoded as spatial positions (XYZ) in a data visualization space, with the remainder represented through characteristics of the data points such as colors, sizes, and shapes.
  • Each data item (data point) is represented as an individual geometrical object, e.g., a dot, a square, a sphere, etc., with spatial coordinates (XYZ) and other visible properties (e.g., colors, sizes, etc.) encoding the additional data dimensions.
  • The challenge is in maximizing the number of simultaneously visualized data dimensions k that can be readily understood by a human.
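The dimension-to-attribute mapping described above can be sketched in a few lines. This is a purely illustrative example (the dimension names, attribute names, and function are hypothetical, not an API disclosed by the patent) of assigning k = 6 of 8 data dimensions to three spatial and three non-spatial visualization attributes:

```python
# Sketch of mapping k of N data dimensions to visualization attributes.
# All names here are illustrative assumptions, not the patent's API.

def map_dimensions(record, mapping):
    """Build a dict of visualization attributes for one data point.

    `record` maps dimension names to values; `mapping` maps attribute
    names ('x', 'y', 'z', 'color', 'size', 'shape', ...) to dimensions.
    """
    return {attr: record[dim] for attr, dim in mapping.items()}

# Example: an 8-dimensional data point, of which k = 6 are visualized.
point = {"age": 42, "income": 71000, "tenure": 9.5, "score": 0.83,
         "dept": "sales", "risk": 2, "id": 1001, "notes": "n/a"}
mapping = {"x": "age", "y": "income", "z": "tenure",
           "size": "score", "shape": "dept", "color": "risk"}

attrs = map_dimensions(point, mapping)
# attrs["x"] == 42, attrs["shape"] == "sales", and so on; the two
# unmapped dimensions ("id", "notes") are simply not visualized.
```

Swapping which dimensions occupy the spatial axes is then just a change to `mapping`, which is what makes interactive remapping cheap.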
  • three-dimensional data visualization systems can provide data visualizations in a variety of display contexts.
  • In some display contexts, multidimensional data is rendered in a 3D data visualization space that can be viewed, navigated, and manipulated using a traditional 2D display device (e.g., a flat screen).
  • an optimized rendering of up to 10 or more data dimensions is used to generate a 3D data visualization of the multidimensional data space.
  • the three-dimensional data visualization systems can provide an enhanced intuitive comprehension of the multidimensional data space, when displayed using a 3D display device (e.g., a VR/AR headset).
  • Immersion in the multidimensional data space using an immersive 3D display can enhance the human ability to understand the geometry and the relationships (clusters, correlations, outliers, anomalies, gaps, etc.) that may be present in the data as compared to a traditional data visualization methodology involving the use of a 2D display.
  • One embodiment of the invention includes: a display device; and a computing system, including memory containing a 3D data visualization application and a processing system.
  • the 3D data visualization application directs the processing system to: load a set of data points into a visualization table in the memory, where each data point includes values in multiple data dimensions and an additional visibility value is assigned to each data point in a visibility dimension within the visualization table; create representations of a set of 3D objects corresponding to the set of data points, where each 3D object has a set of visualization attributes that determine the manner in which the 3D object is rendered and the visualization attributes include a location of the 3D object within a virtual space having three spatial dimensions; receive mappings of data dimensions to visualization attributes; determine the visualization attributes of the set of 3D objects based upon the selected mappings of data dimensions to 3D object attributes, where the selected mappings of data dimensions to visualization attributes determine a location for each visible 3D object within the virtual space; update the visibility dimension in the visualization table for each of the plurality of 3D objects to reflect the visibility of each 3D object; and interactively render 3D data visualizations of the visible 3D objects within the virtual space from viewpoints determined based upon received user input.
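A minimal sketch of the visualization-table idea in this embodiment — each loaded data point carries an extra visibility value that a filtering step updates — might look as follows (the table layout, the `_visible` field name, and both functions are assumptions for illustration):

```python
# Sketch of a visualization table carrying a per-point visibility
# dimension, updated from a filtering predicate. Names are illustrative.

def load_visualization_table(points):
    """Copy each data point into the table with an added visibility
    value, initially True (every point starts visible)."""
    return [dict(p, _visible=True) for p in points]

def update_visibility(table, predicate):
    """Set the visibility dimension of each row from a filter."""
    for row in table:
        row["_visible"] = predicate(row)
    return table

table = load_visualization_table([{"x": 1}, {"x": 5}, {"x": 9}])
update_visibility(table, lambda row: row["x"] > 3)
visible = [row for row in table if row["_visible"]]  # two rows remain
```

Keeping visibility as a column of the table (rather than deleting filtered rows) means a later filter change only flips flags; no data is reloaded.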
  • the display device is a 3D display device, and interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input comprises rendering stereo images displayed via the 3D display device.
  • the 3D data visualization application is implemented using a 3D rendering engine.
  • the implementation of the 3D data visualization application further relies upon scripts that execute via the 3D rendering engine.
  • the visualization attributes include at least one attribute selected from the group of: X Coordinate, Y Coordinate, Z Coordinate, Shape, Size, Color Palette, Color Map, Color Scale, Transparency, ID, URL, Mask, Show By, Motion of the 3D Object, Sonification, Haptic Feedback, and Vibrotactile Feedback.
  • receiving mappings of data dimensions to visualization attributes further includes receiving user selections of mappings of data dimensions to visualization attributes.
  • receiving mappings of data dimensions to visualization attributes further includes retrieving a stored set of mappings of data dimensions to visualization attributes.
  • interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input further includes: generating at least one group 3D object based upon the visualization attributes of a plurality of visible 3D objects; and interactively rendering 3D data visualizations of the at least one group 3D object within the virtual space from viewpoints determined based upon received user input.
  • interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input further includes: modifying 3D objects forming part of a virtual environment within the virtual space in response to the user input so that the 3D objects corresponding to the set of data points remain stationary within the virtual space and appear to change relative to the virtual environment in the 3D data visualization due to the modification to the 3D objects forming part of the virtual environment; and rendering the visible 3D objects corresponding to the set of data points and the 3D objects forming part of the virtual environment.
  • modifying 3D objects forming part of a virtual environment within the virtual space in response to the user input comprises at least one modification selected from the group including: modifying the size of the 3D objects forming part of the virtual environment in response to a user instruction to resize the 3D objects corresponding to the set of data points to create the impression that the 3D objects corresponding to the set of data points are changing in size relative to the virtual environment; moving the positions of the 3D objects forming part of the virtual environment in response to a user instruction to move the 3D objects corresponding to the set of data points to create the impression that the 3D objects corresponding to the set of data points are moving relative to the virtual environment; and moving the positions of the 3D objects forming part of the virtual environment in response to a user instruction to rotate the 3D objects corresponding to the set of data points to create the impression that the 3D objects corresponding to the set of data points are rotating relative to the virtual environment.
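The inverse-transformation trick in this embodiment — keeping the data points stationary and moving the virtual environment instead — can be sketched as follows (the function and data layout are illustrative assumptions, shown for translation only):

```python
# Sketch of "move the environment, not the data": in response to a user
# instruction to translate the data by `delta`, every environment object
# is translated by -delta, so the data appears to move while actually
# remaining stationary in the virtual space. Names are illustrative.

def apply_user_translation(environment_positions, delta):
    """Translate each environment object by the inverse of the requested
    data translation; the data points themselves are left untouched."""
    dx, dy, dz = delta
    return [(x - dx, y - dy, z - dz) for (x, y, z) in environment_positions]

env = [(0.0, 0.0, 0.0), (10.0, 0.0, 5.0)]
moved = apply_user_translation(env, (1.0, 2.0, 3.0))
# moved == [(-1.0, -2.0, -3.0), (9.0, -2.0, 2.0)]
```

The same inversion applies to rotation and scaling; leaving the data objects' coordinates untouched avoids re-deriving positions from the visualization table on every interaction.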
  • interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input includes: illuminating at least some of the 3D objects, where each illuminated 3D object is illuminated using a directional illumination source originating at a user viewpoint; and rendering at least the illuminated 3D objects based upon the user viewpoint.
  • illuminating at least some of the 3D objects further includes: determining a field of view; illuminating 3D objects within the field of view of the user using a directional illumination source originating at the user viewpoint; and rendering the illuminated 3D objects within the field of view of the user.
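One hedged sketch of such viewpoint-anchored ("headlight") illumination with field-of-view culling, using a simple Lambert-style falloff (the function, parameters, and lighting model are illustrative assumptions, not the patent's implementation):

```python
# Sketch of headlight illumination: each 3D object inside the field of
# view is lit by a directional source originating at the user viewpoint,
# so its brightness does not depend on a fixed world light.
import math

def headlight_intensity(viewpoint, view_dir, obj_pos, fov_deg=90.0):
    """Return a 0..1 intensity for an object lit from the viewpoint
    (view_dir must be a unit vector), or 0.0 if outside the FOV."""
    v = [p - q for p, q in zip(obj_pos, viewpoint)]
    norm = math.sqrt(sum(c * c for c in v))
    if norm == 0.0:
        return 1.0
    cos_angle = sum(a * b for a, b in zip(view_dir, v)) / norm
    if cos_angle < math.cos(math.radians(fov_deg / 2)):
        return 0.0  # outside the field of view: not illuminated
    return max(0.0, cos_angle)  # objects straight ahead are brightest

ahead = headlight_intensity((0, 0, 0), (0, 0, 1), (0, 0, 5))    # lit
behind = headlight_intensity((0, 0, 0), (0, 0, 1), (0, 0, -5))  # culled
```

Because the light source travels with the user, two identical objects at different positions receive the same illumination whenever both face the viewer, which is the similarity-preserving property the surrounding text describes.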
  • interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input includes: rotating at least some of the 3D objects based upon a user viewpoint, so that the appearance of the rotated 3D objects is invariant with user viewpoint; and rendering the rotated 3D objects based upon the user viewpoint.
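The viewpoint-invariant rotation described here is commonly implemented as billboarding; a minimal yaw-only sketch (illustrative; a full implementation would build a complete rotation toward the camera) is:

```python
# Sketch of billboard rotation: each object is turned to face the user
# so its on-screen appearance does not change as the user moves around
# it. Yaw-only (rotation about the vertical axis) for brevity.
import math

def billboard_yaw(obj_pos, viewpoint):
    """Yaw angle (radians) turning an object's front toward the user."""
    dx = viewpoint[0] - obj_pos[0]
    dz = viewpoint[2] - obj_pos[2]
    return math.atan2(dx, dz)

# A user directly in front of the object (+z) needs no rotation;
# a user directly to its east (+x) needs a quarter turn.
yaw_front = billboard_yaw((0, 0, 0), (0, 0, 5))  # 0.0
yaw_east = billboard_yaw((0, 0, 0), (5, 0, 0))   # pi / 2
```

Applying this per frame before rendering keeps a shape-encoded attribute (e.g., a glyph silhouette) legible from every viewpoint.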
  • interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input includes: determining a location within the virtual space of at least one interaction primitive based upon a user viewpoint; and rendering the at least one interaction primitive based upon the user viewpoint.
  • a still further additional embodiment also includes determining a transparency of at least one interaction primitive based upon the user viewpoint.
  • the 3D objects include 3D objects having depth perception preserving shapes.
  • the depth perception preserving shapes are characterized by a first dimension that is invariant and second dimension that is a visualization attribute that varies based upon a mapped data dimension.
  • at least one of the depth perception preserving shapes is pill shaped.
  • receiving mappings of data dimensions to visualization attributes includes: receiving a selection of a target feature; determining the importance of at least a subset of multiple data dimensions to the target feature; and generating mappings of data dimensions having high importance to specific visualization attributes.
  • determining the importance of at least a subset of multiple data dimensions to the target feature further includes: identifying data dimensions that are numerical and data dimensions that are categorical; generating mappings of numerical data dimensions having high importance to a first set of visualization attributes; and generating mappings of categorical data dimensions having high importance to a second set of visualization attributes.
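The importance-driven mapping described above might be sketched as follows. The importance scores are taken as given here (a real system might derive them from, e.g., a trained model's feature importances); all attribute and dimension names are illustrative assumptions:

```python
# Sketch of importance-driven mapping: high-importance numerical
# dimensions go to spatial position and size; high-importance
# categorical dimensions go to shape. Names are illustrative.

NUMERIC_ATTRS = ["x", "y", "z", "size"]
CATEGORICAL_ATTRS = ["shape"]

def auto_map(importances, kinds):
    """importances: dim -> score; kinds: dim -> 'numerical'|'categorical'."""
    ranked = sorted(importances, key=importances.get, reverse=True)
    numeric = [d for d in ranked if kinds[d] == "numerical"]
    categorical = [d for d in ranked if kinds[d] == "categorical"]
    mapping = dict(zip(NUMERIC_ATTRS, numeric))
    mapping.update(zip(CATEGORICAL_ATTRS, categorical))
    return mapping

mapping = auto_map(
    {"age": 0.9, "income": 0.7, "tenure": 0.4, "score": 0.2, "dept": 0.8},
    {"age": "numerical", "income": "numerical", "tenure": "numerical",
     "score": "numerical", "dept": "categorical"})
# mapping == {"x": "age", "y": "income", "z": "tenure",
#             "size": "score", "shape": "dept"}
```

Routing categorical dimensions to shape rather than position reflects the claim's split: shape distinguishes discrete classes well, while spatial axes and size are better suited to continuous quantities.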
  • the first set of visualization attributes comprises X, Y, Z position, and size.
  • the second set of visualization attributes comprises shape.
  • the 3D data visualization application further directs the processing system to: receive at least one updated mapping of a data dimension to a visualization attribute; determine updated visualization attributes for the set of 3D objects based upon the selected mappings of data dimensions to 3D object attributes, where the updated mappings of data dimensions to visualization attributes determine a location for each visible 3D object within an updated virtual space; generate trajectories for the set of visible 3D objects from their locations in the virtual space to their updated locations in the virtual space; and interactively render animations of the movements of 3D objects along their generated trajectories from their locations in the virtual space to their locations in the updated virtual space from viewpoints determined based upon received user input.
  • the 3D data visualization application further directs the processing system to determine updated visibility values for each of the plurality of 3D objects to reflect the visibility of each 3D object based upon the updated mapping.
  • interactively rendering animations of the movements of 3D objects along their generated trajectories further comprises varying the time at which different sets of 3D objects commence moving along their trajectories during a rendered animation.
  • the time at which different sets of 3D objects commence moving along their trajectories during a rendered animation is determined based upon user input.
  • interactively rendering animations of the movements of 3D objects along their generated trajectories further includes varying the speed with which different sets of 3D objects move along their trajectories during a rendered animation.
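The trajectory animation with staggered start times described in the bullets above can be sketched with simple linear interpolation (illustrative only; the patent does not prescribe an interpolation scheme):

```python
# Sketch of animating 3D objects from old to new positions along linear
# trajectories, with different sets of objects starting at different
# times ("staggered" starts). Names and scheme are illustrative.

def position_at(t, start, end, t_start, duration):
    """Position along a linear trajectory that begins at t_start and
    takes `duration` seconds; clamped to the endpoints outside that."""
    if t <= t_start:
        return start
    if t >= t_start + duration:
        return end
    f = (t - t_start) / duration
    return tuple(a + f * (b - a) for a, b in zip(start, end))

# Two objects share a 2-second trajectory; the second starts 1 s later,
# so at t = 1.0 the first is halfway while the second has not moved.
p0 = position_at(1.0, (0, 0, 0), (4, 0, 0), t_start=0.0, duration=2.0)
p1 = position_at(1.0, (0, 0, 0), (4, 0, 0), t_start=1.0, duration=2.0)
```

Varying `t_start` per set of objects produces the staggered departures described above; varying `duration` per set produces the differing speeds.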
  • interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input includes: determining a location within the virtual space of at least one affordance, where user input directing movement of a 3D data visualization onto one of the at least one affordances initiates modification of the 3D data visualization; detecting movement of a 3D data visualization onto one of the at least one affordances; modifying the 3D data visualization based upon the one of the at least one affordances; and rendering the modified 3D data visualization based upon the user viewpoint.
  • modifying the 3D data visualization based upon the one of the at least one affordances comprises resizing the 3D data visualization.
  • modifying the 3D data visualization based upon the one of the at least one affordances includes: applying a data analysis process to the set of data points in the visualization table corresponding to the 3D objects visualized within the 3D visualization; modifying visualization attributes of the 3D objects visualized within the 3D visualization based upon at least one result of the data analysis process; and rendering a modified 3D data visualization including the modified visual attributes of the 3D objects based upon the user viewpoint.
  • the data analysis process is a clustering process.
  • modifying the 3D data visualization based upon the one of the at least one affordances includes rendering a new 3D data visualization of a set of 3D objects represented by at least one selected 3D object in the 3D data visualization moved onto one of the at least one affordances.
  • Still yet another further additional embodiment further includes: an input device (a wand) having an elongated handle and an input button.
  • the 3D data visualization application further directs the processing system to: obtain a pose input and a button state input from the wand; modify the 3D data visualization based upon the pose input and the button state input in a manner determined based upon a user interface context; and render the modified 3D data visualization based upon the user viewpoint.
  • modifying the 3D data visualization based upon the pose input and the button state input in a manner determined based upon a user interface context includes: determining a location for the 3D data visualization within the virtual world based upon the pose input and the button status input indicating that the button is not being pressed; and rotating the 3D data visualization within the virtual world based upon the pose input and the button status input indicating that the button is being pressed.
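This button-state-dependent interpretation of wand pose input can be sketched as a small dispatcher (the state layout and function name are illustrative assumptions; real pose input would be a 6-DOF transform rather than a bare tuple):

```python
# Sketch of the wand interaction above: with the button up, the pose
# input translates the visualization; with the button down, the same
# pose input rotates it. Field names are illustrative.

def apply_wand_input(vis, pose_delta, button_pressed):
    """Route a pose change to translation or rotation by button state."""
    if button_pressed:
        vis["rotation"] = tuple(r + d for r, d in
                                zip(vis["rotation"], pose_delta))
    else:
        vis["position"] = tuple(p + d for p, d in
                                zip(vis["position"], pose_delta))
    return vis

vis = {"position": (0.0, 0.0, 0.0), "rotation": (0.0, 0.0, 0.0)}
apply_wand_input(vis, (1.0, 0.0, 0.0), button_pressed=False)  # moves
apply_wand_input(vis, (0.0, 0.5, 0.0), button_pressed=True)   # rotates
```

Reusing one pose stream for two operations keeps the wand to a single button, with the user interface context deciding how the motion is interpreted.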
  • the memory further comprises avatar metadata including a set of visualization attributes that determine the manner in which an avatar is rendered and the visualization attributes include a location of the avatar within the virtual space, and interactively rendering 3D data visualizations of the 3D objects within the virtual space from viewpoints determined based upon received user input comprises rendering avatars within the 3D data visualization based upon the viewpoints and the avatar metadata.
  • the avatar metadata further comprises pose information, and rendering avatars within the 3D data visualization based upon the avatar metadata comprises rendering the poses of the avatars based upon the pose information within the avatar metadata.
  • FIGS. 1A-1E illustrate a set of eight clusters laid out with their data centers at the corners of a virtual cube.
  • FIGS. 2A-2D illustrate the different perspectives a user can obtain by moving within a 3D visualization of a multidimensional data space, and the use of additional visualization attributes to visualize additional data dimensions in accordance with various embodiments of the invention.
  • FIGS. 3A-3E illustrate visualization of a 3D graph from multiple viewpoints in which data is visualized as 3D line plots in accordance with an embodiment of the invention.
  • FIGS. 4A-4E illustrate visualization of a 3D graph from multiple viewpoints in which data is visualized as a 3D surface in accordance with an embodiment of the invention.
  • FIGS. 5A-5D conceptually illustrate rendering of a 3D graph from the different viewpoints of different users in accordance with an embodiment of the invention.
  • FIG. 5E illustrates a 3D data visualization showing avatars of multiple users within a virtual space in accordance with an embodiment of the invention.
  • FIGS. 6A and 6B conceptually illustrate systems for generating 3D visualizations of a multidimensional data space in accordance with various embodiments of the invention.
  • FIG. 7 conceptually illustrates a multidimensional data visualization computing system implemented on a single computing device in accordance with an embodiment of the invention.
  • FIG. 8A is a flow chart illustrating a process for generating a multidimensional data visualization in accordance with an embodiment of the invention.
  • FIG. 8B is a flow chart illustrating a process for rendering a 3D data visualization using group 3D objects in accordance with an embodiment of the invention.
  • FIGS. 9A-9C illustrate a 3D visualization of a multidimensional data space in which data dimensions are mapped to shape and size attributes of 3D objects in accordance with various embodiments of the invention.
  • FIG. 10 shows a small set of 3D object shapes that are designed to be recognizable even in highly dense plots.
  • FIG. 11A shows the variation in appearance of a 3D object having a spherical shape under constant illumination by three static point sources when viewed from different directions.
  • FIG. 11B shows the same 3D data object from the same viewpoints shown in FIG. 11A with the 3D object illuminated using a directional illumination source originating at (or near) the viewpoint of the user.
  • FIG. 12A - 12C illustrate the similarity in appearance of similar 3D objects as a user moves through a virtual space, because the illumination of the objects changes with the pose of the user, in accordance with various embodiments of the invention.
  • FIG. 13 is a flow chart showing a process for updating the illumination of 3D objects (or individual vertices or surfaces of 3D objects) as the field of view of a user within a virtual space changes in accordance with an embodiment of the invention.
  • FIG. 14 conceptually illustrates directional illumination of multiple 3D objects and/or vertices or surfaces of group 3D objects within a virtual space.
  • FIG. 15A illustrates a 3D graph including interaction primitives in the form of grids, axes, and axis labels generated by a 3D data visualization system in accordance with an embodiment of the invention.
  • FIG. 15B illustrates a user interface showing recommendations concerning mappings of specific data dimensions to particular attributes of 3D objects visible in a 3D data visualization in accordance with an embodiment of the invention.
  • FIGS. 16A-16D show a sequence of 3D data visualizations in which the X attribute of the 3D data objects is modified from a first data dimension (i.e., "Age") to a second data dimension (i.e., "YearsOnJob") in accordance with an embodiment of the invention.
  • FIGS. 17A-17F illustrate affordances within a VR user interface that enable a user to control the size of a 3D data visualization within the virtual world generated by a 3D data visualization system in accordance with various embodiments of the invention.
  • methods of generating visualizations of multidimensional data spaces in accordance with a number of embodiments of the invention can utilize 3D display technologies to address many of the challenges of effective interactive visualization of high-dimensional data.
  • The term "3D graph" is used in a general sense to reference any 3D object or group of 3D objects that collectively describe a set of data.
  • a distinction can be drawn between the 3D object or objects that make up a 3D graph and other 3D objects, which may be utilized within a 3D visualization of multidimensional data to represent a virtual environment in which the 3D graph is contained.
  • Systems and methods in accordance with several embodiments of the invention enable the visualization of more complex data spaces and can extend the human ability to interpret additional dimensions by utilizing 3D display technologies to place a user inside the visualization, and making the act of data visualization a first person experience.
  • This approach can activate the human senses of proprioception (how people sense the relative position of their body parts) and kinesthesia (how people sense the extent of their own body in motion), which describe and explain the human body's experience within an external environment.
  • 3D data visualization systems in accordance with many embodiments of the invention utilize techniques including (but not limited to) shape selection and illumination models that preserve similarity between similar 3D objects within the field of view of the user and enhance a user's ability to differentiate between variations in size due to variations in the size attribute of a 3D object and variations in size due to differences in distances to 3D objects.
  • a user's ability to perceive structure within data is further enhanced by utilizing animation to enable a user to observe modifications to the attributes of 3D objects corresponding to specific 3D data points as the 3D objects migrate from one 3D visualization of a multidimensional data space to a 3D visualization of a different multidimensional data space.
  • 3D data visualization systems in accordance with a number of embodiments of the invention are enhanced by providing affordances within a 3D user interface that a user can use to automatically modify the rendering of high dimensionality data in a 3D data visualization.
  • the user can simply drag the 3D visualization of a multidimensional data space over the affordance and a particular action is performed (e.g. resizing of the 3D data visualization, or k-means clustering of the data points).
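An affordance that triggers an analysis action when a visualization is dragged onto it can be sketched as a dispatch table; the deliberately tiny 1D k-means below stands in for a real clustering routine (all names and the toy algorithm are illustrative assumptions):

```python
# Sketch of affordance dispatch: dropping a visualization onto an
# affordance runs its bound action, e.g. a k-means pass whose labels
# could then recolor the points. Toy 1D k-means for illustration only.

def kmeans_1d(values, k, iters=10):
    """Tiny 1D k-means; returns a cluster label per value."""
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(len(centers)),
                      key=lambda c: abs(values[i] - centers[c]))
                  for i in range(len(values))]
        for c in range(len(centers)):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels

AFFORDANCES = {"cluster": lambda vals: kmeans_1d(vals, k=2)}

def drop_on_affordance(name, values):
    """Run the action bound to the affordance the visualization
    was dragged onto."""
    return AFFORDANCES[name](values)

labels = drop_on_affordance("cluster", [1.0, 1.1, 0.9, 10.0, 10.2])
# The three small values share one label; the two large ones the other.
```

The dispatch-table design means new affordances (resize, re-cluster, drill down) are added by registering a new entry, without touching the interaction code that detects the drop.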
  • 3D data visualization systems in accordance with many embodiments of the invention enable data exploration to be performed in a collaborative manner.
  • multiple users who may or may not be in the same physical location can independently explore the same shared, virtual, multidimensional data space.
  • a single user can lead a "broadcast" interactive session in which all users view the 3D data visualization space from the same viewpoint controlled by a lead user.
  • FIGS. 1A-1E illustrate a set of eight clusters laid out with their data centers at the corners of a virtual cube.
  • the simple 2D projections shown in FIGS. 1A-1C do not readily reveal all structures within the data.
  • When the same data are viewed in three dimensions, cluster patterns are more readily discerned.
  • 3D data visualization systems in accordance with many embodiments of the invention provide the user with an additional capability to directly interact with 3D cues from motion and parallax that enable a user to more clearly discern structures that may not be readily apparent depending upon the viewpoint from which a particular 3D visualization is rendered.
  • the ability for the user to readily shift the viewpoint of a 3D data visualization in real time can reveal visual cues that cause the user to explore the data space from a different viewpoint that yields additional insights into the data.
  • the ability to visually observe structure can be particularly useful in circumstances where machine learning algorithms trained to identify structures within data (e.g. k-means clustering) fail due to the presence of outliers that can be readily identified by a human user through visual inspection from one or more viewpoints.
  • FIGS. 2A and 2B illustrate the different perspective a user can obtain by moving within a 3D visualization of a multidimensional data space (as opposed to being constrained to look at data visualized in three spatial dimensions from outside the data space).
  • as the user moves from the viewpoint shown in FIG. 2A toward data of interest to the viewpoint shown in FIG. 2B, structure within a particular subset of the data becomes visible in greater detail.
  • 3D data visualization systems in accordance with various embodiments of the invention can support any of a number of different input modalities via which a user can provide instructions controlling the zoom, relative position and/or orientation of the 3D data visualization.
  • the 3D data visualizations shown in FIGS. 2A and 2B are rendered by mapping data dimensions to characteristics of 3D objects that include the visibility of the 3D object (some data points may not be shown based upon filtering criteria), the location of the 3D object within a 3D data visualization, the size of the rendered 3D object, and/or the color of the 3D object.
  • higher dimensionality visualizations can again be generated by using data dimension mappings to determine additional characteristics of the 3D objects including (but not limited to) the shape used to render the 3D object, the texture of the 3D object, and/or the transparency of the 3D object.
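One way to picture such data-dimension mappings is as a table of (column, transform) pairs applied to each record to produce the attributes of a 3D object. The attribute names, column names, and normalizations below are illustrative assumptions, not mappings defined by any embodiment:

```python
def apply_mappings(rows, mappings):
    """Turn data records into 3D-object attribute dicts via
    data-dimension mappings of the form attribute -> (column, transform)."""
    objects = []
    for row in rows:
        attrs = {}
        for attribute, (column, transform) in mappings.items():
            attrs[attribute] = transform(row[column])
        objects.append(attrs)
    return objects

# hypothetical records and mappings for illustration only
rows = [{"mass": 2.0, "temp": 300, "type": "a"},
        {"mass": 8.0, "temp": 900, "type": "b"}]
mappings = {
    "x":     ("mass", lambda v: v / 10.0),           # position from mass
    "size":  ("temp", lambda v: 0.01 + v / 1000.0),  # radius from temperature
    "shape": ("type", lambda v: {"a": "sphere", "b": "cube"}[v]),
}
objs = apply_mappings(rows, mappings)
```

Additional attributes (color, texture, transparency, and so on) would be further entries in the same mapping table, which is what allows more than three data dimensions to be visualized at once.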
  • FIG. 2C illustrates a 3D visualization of the data set shown in FIG. 2B using transparency to represent an additional data dimension.
  • FIG. 2D illustrates a 3D visualization of the data set shown in FIG.
  • data dimensions can be mapped to non-visual aspects of an immersive experience including (but not limited to) motion, sonification, haptic feedback, and/or vibrotactile feedback.
  • FIGS. 2A - 2D are two-dimensional projections of the underlying 3D data visualization.
  • 3D data visualization systems in accordance with many embodiments of the invention provide interactive 3D visualizations that enable interaction and motion parallax, which are lost when 3D data is projected in the manner utilized to generate FIGS. 2A - 2D.
  • a video sequence illustrating interactive data exploration of a 3D data visualization generated by a 3D data visualization system in accordance with an embodiment of the invention is available at http://www.virtualitics.com/patent/virtualitics1.mp4 and http://www.virtualitics.com/patent/virtualitics2.mp4, and a 3D video sequence of the same interactive session is available at http://www.virtualitics.com/patent/virtualitics3.mp4.
  • a comparison of the 2D and 3D video sequences provides a sense of the benefits of motion parallax in interpreting the structure of the data used to generate the 3D data visualization by the 3D data visualization system.
  • 3D data visualizations can include 3D line plots (see, for example, FIGS. 3A and 3B) and/or 3D surfaces (see, for example, FIGS. 4A and 4B).
  • FIGS. 3A and 3B illustrate visualization of a 3D graph from multiple viewpoints in which data is visualized as a series of 3D line plots.
  • two dimensional projections of the 3D line plots are shown in FIGS. 3C - 3E.
  • FIGS. 4A and 4B illustrate visualization of a 3D graph from multiple viewpoints in which data is visualized as a 3D surface.
  • two dimensional projections of the 3D surfaces are shown in FIGS. 4C - 4E.
  • systems and methods in accordance with different embodiments of the invention are not limited to specific types of 3D data visualizations and can be utilized to generate any of a variety of 3D data visualizations.
  • Systems and methods for performing 3D data visualization that enable a user's cognition system to interpret highly dimensional data and interact with high dimensional data in accordance with various embodiments of the invention are discussed further below.
  • 3D data visualization systems in accordance with certain embodiments of the invention can be configured for exploration of a 3D graph by a single user or by multiple users.
  • the 3D data visualization system includes a 3D rendering engine that maps data dimensions to characteristics of 3D virtual objects that are then rendered for visualization within a virtual space by the 3D rendering engine.
  • a machine vision system and/or sensor system can be utilized to track the pose of one or more users and more specifically to track the head position of the user(s). Head positions can be utilized to determine a viewpoint from which to render a 3D display of the virtual space.
  • head positions and/or poses of the users can be utilized to render the 3D displays presented to each user and to render the avatars of individual users within the data space.
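A tracked head position and gaze direction can be turned into a rendering viewpoint with a standard look-at construction. The sketch below (pure Python, right-handed convention, row-major 4x4 matrix) is one conventional formulation, offered for illustration rather than as anything mandated by the systems described here:

```python
import math

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a right-handed, row-major 4x4 view matrix from a tracked head
    position (`eye`) and a gaze target point."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def norm(v):
        n = math.sqrt(sum(x * x for x in v))
        return tuple(x / n for x in v)
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    f = norm(sub(target, eye))  # forward
    s = norm(cross(f, up))      # right
    u = cross(s, f)             # corrected up
    return [
        [ s[0],  s[1],  s[2], -dot(s, eye)],
        [ u[0],  u[1],  u[2], -dot(u, eye)],
        [-f[0], -f[1], -f[2],  dot(f, eye)],
        [ 0.0,   0.0,   0.0,   1.0],
    ]
```

As the machine vision system reports new head positions, the view matrix is rebuilt each frame so the rendered viewpoint follows the user through the virtual space.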
  • Rendering of a 3D graph from different viewpoints of different users in accordance with an embodiment of the invention is conceptually illustrated in FIGS. 5A - 5D.
  • a 3D data visualization showing avatars of multiple users within a virtual space in accordance with an embodiment of the invention is illustrated in FIG. 5E.
  • a 3D graph 500 is shown in which data points are visualized as 3D objects 502 and the viewpoints from which other users are exploring the virtual space are indicated by avatars 504, 506.
  • a user's ability to orient themselves within a virtual space can be enhanced by providing intuitive interaction primitives such as grid lines 508 and axes labels 510.
  • collaborating users can independently move through the virtual space or a set of users can experience the same visualization of the virtual space controlled by a single user's interactions with the virtual space.
  • the specific collaborative exploration modes supported by a 3D data visualization system are largely dependent upon the requirements of a given application.
  • A multidimensional data visualization system that can be utilized to generate a visualization of multidimensional data within three spatial dimensions for a user and/or to facilitate collaborative multidimensional data exploration in such a 3D space by multiple users in accordance with an embodiment of the invention is illustrated in FIG. 6A.
  • the 3D data visualization system 600 includes a 3D data visualization computing system 602 that is configured to communicate with a 3D display 604, which in the illustrated embodiment is a head mounted display.
  • the 3D data visualization computing system 602 can also be connected to a camera system 606 that is utilized to capture image data of the user from which the pose and/or head position of the user can be determined.
  • the camera system can also be used as an input modality to detect gesture based inputs. Additional and/or alternative input modalities can be provided including (but not limited to) user input devices, and microphones to detect speech inputs.
  • the camera system can incorporate
  • the term pose can be utilized to describe any representation of a user's position and orientation in three dimensional space.
  • a simple representation of pose is a head location and viewing direction.
  • More complex pose representations can describe a user's body position using joint locations of an articulated skeleton.
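The two pose representations mentioned above, a head location with a viewing direction and an articulated skeleton of joint locations, might be modeled as follows. These types and field names are illustrative only:

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HeadPose:
    """Simplest pose representation: head location plus viewing direction."""
    location: Vec3
    view_direction: Vec3

@dataclass
class SkeletonPose:
    """Richer pose representation: named joint locations of an
    articulated skeleton."""
    joints: Dict[str, Vec3] = field(default_factory=dict)
```

A rendering pipeline would consume a `HeadPose` to place the camera, while a `SkeletonPose` could additionally drive a full-body avatar within the shared virtual space.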
  • the 3D data visualization computing system 602, the 3D display 604, and the camera system 606 are an integral unit.
  • the 3D data visualization computing system 602, 3D display 604, and camera system 606 can be implemented in a head mounted display such as (but not limited to) the HoloLens distributed by Microsoft Corporation of Redmond, Washington.
  • the 3D data visualization computing system 602 and the 3D display 604 communicate via a wireless data connection in a manner similar to that utilized by the Oculus Rift 3D display distributed by Oculus VR, LLC of Menlo Park, California.
  • 3D data visualizations can be constructed as a set of virtual objects displayed within a mixed reality context using a MR headset (e.g. HoloLens) and/or displayed in a completely immersive environment using a VR 3D display (e.g. Oculus).
  • MR headset e.g. HoloLens
  • VR 3D display e.g. Oculus
  • the 3D data visualization computing system can leverage distributed processing.
  • at least some processing associated with rendering a 3D data visualization is performed by a processor within a head mounted display.
  • additional processing is performed by a local computer system with which the head mounted display communicates.
  • processing is performed by a remote computer system (e.g. computing resources within a cloud computing cluster) with which the head mounted display communicates via the Internet (potentially via a local computer system).
  • 3D data visualization computing systems in accordance with various embodiments of the invention are not limited to a single computing device and can encompass a single computing device, and/or a combination of a computing system within a head mounted display, a local computing system, and/or a remote computing system.
  • the specific implementation of a 3D data visualization computing system used within a given 3D data visualization system is largely dependent upon the requirements of a specific application.
  • A multidimensional data visualization system in which multiple users are able to simultaneously explore a 3D visualization of a multidimensional data space in accordance with an embodiment of the invention is illustrated in FIG. 6B.
  • the 3D data visualization system 650 includes two local computer systems 652 that communicate via a server computing system 654 across a network 656.
  • Each of the local computer systems 652 is connected to a 3D display 658 and a camera system 660 in a manner similar to that described above with reference to FIG. 6A.
  • the local computer systems 652 each build a 3D model of the multidimensional data space and render video sequences (which may be 2D or 3D) responsive to changes in the pose of the users.
  • the local computer systems 652 are configured to enable independent data exploration by the users and pose information can be shared between the local computer systems 652 via the server computing system 654. The pose information can then be utilized to render an avatar(s) within the virtual space that indicates the location from which a specific user is viewing the virtual space.
  • the local computer systems 652 support a broadcast mode in which one user navigates through the virtual space and the pose of the navigating user is broadcast via the server computing system 654 to the local computer systems 652 of the other users within the virtual space.
  • the local computer systems 652 that receive pose information from the navigating user can use the pose information to render a multidimensional data visualization from the viewpoint of the navigating user for display via another user's 3D display.
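The pose information exchanged in broadcast mode might be serialized as a simple message. The JSON encoding and field names below are illustrative assumptions for this sketch, not a wire format defined by the system:

```python
import json

def encode_pose(user_id, position, orientation):
    """Serialize a navigating user's pose for broadcast to other users'
    local computer systems (hypothetical message format)."""
    return json.dumps({
        "user": user_id,
        "position": list(position),
        "orientation": list(orientation),
    })

def decode_pose(message):
    """Recover (user_id, position, orientation) from a broadcast message."""
    data = json.loads(message)
    return data["user"], tuple(data["position"]), tuple(data["orientation"])
```

A receiving local computer system would decode each message and use the position and orientation to set the viewpoint from which its own copy of the 3D data visualization is rendered.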
  • a broadcast mode is supported by rendering a 3D video sequence and streaming the 3D video sequence to the local computer systems 652 of other users.
  • the server computing system 654 includes sufficient computing capacity (e.g. graphics processing units) to generate 3D data visualizations for each of the users and to stream 3D video sequences over the network 656 to the local computers for display via the 3D displays based upon pose information received from a local computer system 652.
  • Computer systems that can generate 3D visualizations of multidimensional data can take a variety of forms, ranging from implementations in which all of the computing is performed by a single computing device to complex systems in which processing is distributed across head mounted displays, local computer systems, and/or cloud based server systems. The specific distribution of different processes is largely dependent upon the number of users and the requirements of a given application.
  • A multidimensional data visualization computing system implemented on a single computing device in accordance with an embodiment of the invention is illustrated in FIG. 7.
  • the multidimensional data visualization computing system 700 may be a personal computer, a laptop computer, a head mounted display device and/or any other computing device with sufficient processing power to render 3D displays at a sufficient frame rate to satisfy the interactive 3D data visualization requirements of a specific application.
  • the 3D data visualization computing system 700 includes a processor 702.
  • the term processor 702 is used to refer to one or more devices within the computing device that can be configured to perform computations via machine readable instructions stored within the memory 704 of the 3D data visualization computing system.
  • the processor 702 can include one or more microprocessors (CPUs), one or more graphics processing units (GPUs), and one or more digital signal processors (DSPs).
  • the processor 702 can include any of a variety of application specific circuitry developed to accelerate the 3D data visualization computing system.
  • the 3D data visualization computing system 700 includes a network interface 706 to communicate with remote computing systems (e.g. the computing systems of other users and/or a remote server computing system) and an input/output (I/O) interface 708 that can be utilized to communicate with a variety of devices including (but not limited to) a 3D display, and/or a camera system.
  • the specific communication and I/O capabilities required of a computing system used to generate 3D visualizations of multidimensional data are typically determined based upon the demands of a given application.
  • 3D data visualizations are generated by a 3D data visualization application 710 that executes within a computing environment created by an operating system 712.
  • the 3D data visualization application 710 leverages a 3D rendering engine 714 to generate 3D data visualizations that can be displayed via a 3D display.
  • the 3D data visualization application 710 loads a multi-dimensional data set 716 into in-memory data structures 718 that are stored within low-latency memory of the 3D data visualization computing system.
  • the multi-dimensional data set 716 may be locally stored in a file and/or database.
  • the multidimensional data is stored remotely (e.g. in a distributed database) and some or all of the multi-dimensional data is loaded into the in-memory data structures 718 maintained by the 3D data visualization application 710.
  • the multidimensional data is loaded into at least one visualization table.
  • additional data dimensions can be added to the multidimensional data as it is loaded into the at least one visualization table by the 3D data visualization application 710.
  • a visualization table includes a visibility dimension and the 3D data visualization application continuously modifies the visibility value of individual items within the multi-dimensional data set contained within the visualization table to reflect whether a 3D object corresponding to the item is visible within a current 3D visualization of the multi-dimensional data contained within the visualization table.
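The continuously maintained visibility dimension can be sketched as a flag that is updated in place from a filter predicate, so excluded items remain in the visualization table but are skipped during rendering. The field names and filter below are hypothetical:

```python
def update_visibility(table, predicate):
    """Update the visibility dimension of every row in a visualization
    table from a filter predicate, rather than deleting rows."""
    for row in table:
        row["visible"] = bool(predicate(row))
    return table

# hypothetical visualization table rows
table = [{"country": "US", "value": 1.0},
         {"country": "FR", "value": None},
         {"country": "US", "value": 3.0}]

# e.g. show only data points with a valid value and country equal to "US"
update_visibility(table, lambda r: r["country"] == "US" and r["value"] is not None)
shown = [r for r in table if r["visible"]]
```

Because the flag is recomputed rather than rows being removed, changing or clearing the filter immediately restores the corresponding 3D objects to the visualization.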
  • any of a variety of additional dimensions can be added to the multidimensional data by the 3D data visualization application as appropriate to the requirements of a given application.
  • a user can specify mappings of data dimensions to attributes of 3D objects within a 3D data visualization, thus effectively creating a multidimensional data visualization.
  • the mappings are stored as data dimension mappings 720.
  • the 3D data visualization application 710 can use the data dimension mappings 720 to provide attributes of 3D objects to the 3D rendering engine 714.
  • the 3D rendering engine 714 instantiates 3D objects within a 3D model 722 stored in memory and can update the attributes of the 3D objects.
  • 3D objects can be instantiated within the 3D model 722 by the 3D rendering engine 714 based upon the number of data points loaded into the in-memory data structures 718.
  • the 3D rendering engine 714 can generate 3D data visualizations rapidly in response to selection of data dimensions for visualization by the user, because the 3D objects are instantiated and the 3D rendering engine 714 simply needs to modify the attributes of the 3D objects within the 3D model 722 to generate the visualization.
  • the 3D objects are instantiated in response to definition of the attributes of the 3D objects by the user.
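The latency benefit of instantiating 3D objects up front can be pictured as an object pool whose entries are merely mutated when the user selects new data dimension mappings. This is a hypothetical structure for illustration, not the 3D rendering engine's actual API:

```python
class ObjectPool:
    """Pre-instantiate one placeholder 3D object per data point so that a
    new mapping only mutates attributes instead of re-creating objects."""

    def __init__(self, n_points):
        # objects start hidden until a mapping assigns them attributes
        self.objects = [{"visible": False} for _ in range(n_points)]

    def remap(self, attribute, values):
        """Apply one attribute's values across the pool (e.g. after the
        user maps a data dimension to that attribute)."""
        for obj, value in zip(self.objects, values):
            obj[attribute] = value
            obj["visible"] = True

pool = ObjectPool(3)
pool.remap("x", [0.1, 0.2, 0.3])
```

Because the objects already exist, switching which data dimensions are visualized is an attribute update rather than a scene rebuild, which is what allows new visualizations to be rendered rapidly.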
  • the 3D rendering engine 714 can utilize the 3D model 722 to render stereo images that can be presented via a 3D display.
  • the 3D data visualization application uses a display driver 724 to display the rendered viewpoints 726 via a 3D display.
  • the specific rendered viewpoints can be determined by pose data 728 received from a remote computing system (e.g. in broadcast mode) or based upon pose data 728 determined by the 3D data visualization computing system from image and/or other sensor data.
  • the 3D visualization application 710 receives pose data 728 from a machine vision application 730 that obtains image data from a camera system using one or more camera drivers 732.
  • the machine vision application 730 configures the processor 702 to extract a user's pose including (but not limited to) the location and orientation of the user's head from the captured image data.
  • the user's pose can be utilized to determine viewpoints 726 from which to render images from the 3D model 722.
  • user pose is also utilized to control elements of the 3D model including (but not limited to) illumination of 3D objects, speed of movement through the virtual space, and/or visibility of interaction primitives.
  • the specific ways in which user pose can be utilized to modify the rendering of 3D visualizations of multi-dimensional data in accordance with various embodiments of the invention are discussed further below.
  • the 3D rendering engine 714 can also use avatar metadata 734 that includes pose information for each avatar and (optionally) identifying information for the avatar to incorporate avatars within the 3D model 722 in such a way that avatars located within the field of view of a user are visible within viewpoints 726 rendered by the 3D rendering engine.
  • the 3D rendering engine 714 forms part of a 3D graphics engine or 3D game engine that enables implementation of the 3D data visualization application 710 within the 3D graphics engine using a mechanism such as (but not limited to) a scripting language.
  • the 3D rendering engine forms part of the 3D data visualization application.
  • the 3D data visualization application, 3D rendering engine, and/or machine vision application can be implemented independently, as a single application, or within or as a plugin for another application such as (but not limited to) a web browser application.
  • the specific manner in which the 3D data visualization application is implemented is largely dependent upon the requirements of a given computing system(s) and/or use case.
  • Processes for generating 3D data visualizations in accordance with many embodiments of the invention involve loading data into in-memory data structures and then mapping data dimensions to attributes of 3D objects to enable rendering of 3D data visualizations via 3D displays.
  • a process for generating a multidimensional data visualization in accordance with an embodiment of the invention is illustrated in FIG. 8A.
  • the process 800 includes loading (802) data points into in-memory data structures such as (but not limited to) a visualization table.
  • a 3D object is instantiated (804) with respect to each of the data points.
  • instantiating 3D objects prior to receiving mappings of data dimensions to attributes of the 3D objects can decrease latency with which a 3D data visualization can be rendered.
  • the 3D objects are not instantiated until data mappings are defined that determine the attributes of the 3D objects.
  • the timing of the instantiation of 3D objects relative to the rendering of a 3D display is largely dependent upon the requirements of a given application.
  • the process of loading the data points into in-memory data structures involves creating an additional data dimension that describes the visibility of a specific data point within a 3D data visualization.
  • the visibility data dimension for individual data points can be updated by the process 800 to indicate that a given data point should not be part of a 3D data visualization.
  • visibility is a distinct concept from being within the field of view of the user and instead refers to a decision made by the process 800 not to render the data point within the 3D graph.
  • Reasons for excluding a data point can include (but are not limited to) the data point possessing no value or an invalid value with respect to one of the data dimensions mapped to an attribute of a 3D object.
  • any of a variety of reasons can be utilized to determine that specific data points should not be included within a 3D visualization as appropriate to the requirements of a given application.
  • a visibility data dimension added during data ingest provides a mechanism for reflecting the decision not to visualize an individual data point.
  • the process 800 includes determining (806) attributes of 3D objects using data dimension mappings.
  • a user interface can present information concerning the data dimensions that describe the data points and enable a user to select the specific data dimensions to map to attributes of 3D objects.
  • data dimension mappings determine characteristics of 3D objects including (but not limited to) the visibility of the 3D object, the location of the 3D object within a virtual space, the shape used to render the 3D object, the size of the rendered 3D object within the virtual space, and/or the color of the 3D object.
  • visualizations of more than three data dimensions can be generated by using data dimension mappings to determine additional characteristics of the 3D objects including (but not limited to) the texture of the 3D object, and/or the transparency of the 3D object.
  • the list of attributes that can be defined includes (but are not limited to): X (floating point value), Y (floating point value), Z (floating point value), Shape (floating point value, string), Size (floating point value), Color Palette (floating point value, string), Color Map (floating point value, string), Color Scale (floating point value, string), and Transparency (floating point value).
  • data dimension mappings can also be defined with respect to metadata describing the data points represented by the 3D objects including (but not limited to): ID (string), URL (string), Mask (floating point value used to indicate whether a data point is selected), Show By (Float, String, used to indicate whether data point is to be displayed based upon filters, e.g. show only data points with country value equal to "US").
  • Additional attributes associated with 3D objects can include (but are not limited to) subtle motions of the 3D object (e.g. binning data into different rates of jitter or twist), sonification, haptic feedback, and/or vibrotactile feedback.
  • any subset and/or combination of some or all of the above attributes can be combined with additional attributes in the visualization of data points.
  • the specific attributes utilized to visualize a data point within a 3D graph are largely dependent upon the requirements of a given 3D data visualization system.
  • mapping of data dimensions to attributes (806) is often performed by a user
  • a user and/or a 3D data visualization system may also use a previously stored 3D data visualization to define mappings of data dimensions to attributes.
  • a user can load a new data set or an updated data set (e.g. a data set to which new data points have been added) and utilize a previously selected set of mappings to visualize the data.
  • the attributes of 3D objects can be automatically determined 806 based upon mappings contained within a previously generated 3D data visualization.
  • users can share 3D data visualizations and the mappings within the shared 3D data visualizations are utilized to determine 806 the attributes of 3D objects. Users can share 3D data visualizations for independent use by others and/or as part of a broadcast 3D data visualization.
  • the process 800 renders (814) a 3D display based upon a viewpoint of a user. Accordingly, the user's pose is determined (808) and can be used to render a 3D display based upon the position of the user within the virtual space and the 3D objects within the field of view of the user. As is discussed further below with reference to FIG. 8B, additional computational efficiencies can be obtained during rendering by creating one or more meshes for 3D group objects based upon the meshes of a large number (or all) of the 3D objects. In this way, processes including (but not limited to) physics processes such as collision processing can be performed with respect to a much smaller number of 3D group objects. As can readily be appreciated, the extent to which 3D objects are aggregated into 3D group objects for the purposes of reducing the computation required to render a 3D data visualization is largely dependent upon the requirements of a given application.
  • Effective visualizations of 3D data enhance a user's ability to perceive structure within the data and avoid introducing variation in the appearance of 3D objects within the 3D graph that are unrelated to the characteristics of the data being visualized.
  • a variety of aspects of the 3D data visualization are modified based upon the pose of the user in order to enhance the ability of the user to perceive the structure of the data.
  • the process of rendering a 3D display includes illuminating (810) 3D objects based upon the user pose.
  • illuminating each 3D object within the field of view of the user using a directional illumination source originating at the user's viewpoint or slightly offset from the user's viewpoint can preserve the similarity in appearance of similar 3D objects across the field of view of the user.
  • Processes for illuminating 3D objects based upon the pose of the user in accordance with various embodiments of the invention are discussed below. Where 3D objects have different appearances based upon viewing direction, the orientations of the 3D objects within the field of view of the user can be reoriented to "face" the user (although as discussed below facing the user may actually involve orienting the 3D object at a consistent angle to better accentuate the characteristics of the 3D shape).
  • orientation may be fixed and/or may be used to visualize an additional data dimension (e.g. a data dimension is mapped to orientation relative to the user viewpoint, or to a motion such as but not limited to a rate of rotation).
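Illuminating each 3D object with a directional source at (or near) the user's viewpoint can be sketched as simple Lambertian "headlight" shading. This is an illustrative sketch of the idea, not the patent's lighting model:

```python
def headlight_shade(base_color, surface_normal, view_dir):
    """Lambertian shading with a directional light co-located with the
    user's viewpoint (a 'headlight'), so similar 3D objects keep a
    similar appearance anywhere in the field of view. All direction
    vectors are assumed to be unit length."""
    # the light direction points from the surface back toward the viewer,
    # i.e. the reverse of the viewing direction
    light = tuple(-c for c in view_dir)
    lambert = max(0.0, sum(n * l for n, l in zip(surface_normal, light)))
    return tuple(channel * lambert for channel in base_color)
```

Because the light moves with the viewpoint, two identical 3D objects at different positions receive the same shading when presenting the same face to the user, which preserves the visual similarity of similar data points.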
  • user pose is used to modify a number of other aspects of the 3D data visualization including (but not limited) the transparency and/or location of interaction primitives.
  • interaction primitives such as grid lines and/or navigation affordances can be included within a virtual space to assist with orientation and navigation.
  • user pose determines the extent to which any interaction primitives are occluding 3D objects that represent data points. A variety of criteria can be utilized to determine whether to increase the transparency and/or to modify the visibility of the interaction primitives as appropriate to the specific user experience that a 3D data visualization system is aiming to achieve.
  • the 3D graph is contained within a virtual space that includes a virtual environment (e.g. a virtual office cube or virtual office room). As the user manipulates the 3D graph within the virtual environment (e.g. rotates the 3D graph or increases the size of the 3D graph), computational efficiencies can be achieved by maintaining the 3D graph as a stationary object(s) and modifying the meshes associated with the virtual environment (e.g. meshes depicting tables, chairs, desks, walls, etc.) relative to the 3D graph based upon the viewpoint of the user (e.g. resizing the virtual environment or rotating the virtual environment and user viewpoint relative to the 3D graph). Meshes associated with the virtual environment are typically simpler than the mesh of the 3D object(s) that make up the 3D graph.
  • shifting the viewpoint of the user and the 3D objects associated with the virtual environment relative to a stationary 3D graph can provide significant computational advantages while maintaining the ability of the user to perform manipulations with respect to the 3D graph including (but not limited to) rotating, moving, and/or resizing the 3D graph within the virtual environment.
  • the elements of a virtual space that can be modified in response to the pose of the user are not limited to illumination and the visibility of interaction primitives, but can include any of a variety of aspects of the 3D data visualization appropriate to the requirements of a given application including (but not limited to) modifying the rate at which the user moves through the virtual space and/or can interact with 3D objects within the virtual space based upon the position and/or pose of the user.
  • 3D data visualization systems in accordance with a number of embodiments of the invention can switch between different visualization modes based upon user pose and/or context.
  • the specific manner in which the 3D display is rendered (814) based upon the pose of the user is largely dependent upon the specific 3D display technology being utilized.
  • when the 3D display is a stereo 3D display, such as those utilized in many head mounted AR, MR, and VR headsets, two frames are rendered from different viewpoints that can be presented by each of the stereo displays to provide the user with simulated depth perception.
  • the process 800 continues to update the rendered 3D display based upon changes (808) in the user position and/or changes (818) in the mappings of data dimensions to attributes.
  • the interactivity of a 3D data visualization depends upon the rate at which updates to the visualization can be rendered.
  • the 3D data visualization system targets a frame rate of at least 30 frames per second.
  • target frame rates of at least 60 frames per second and/or at least 120 frames per second are supported. Updating 3D data visualizations at high frame rates involves significant computation. In many instances, the computation required to maintain the target frame rate is too great and the 3D data visualization system is unable to render one or more frames in time for display, resulting in what is commonly referred to as a frame drop.
  • a graceful decay is supported in which portions of a frame within the center of a user's field of view are rendered and portions of the frame in the peripheral vision of the user are not updated.
  • the specific manner in which a given 3D data visualization system manages an inability to render all frames required at a target frame rate is dependent upon the requirements of a given application.
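The frame-budget bookkeeping described above can be sketched as follows. This is a minimal illustration, not part of the patent disclosure; the target rate and function name are assumptions.

```python
# Sketch: detecting frames that exceeded the render budget at a target rate.
TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds available to render one frame

def dropped_frames(frame_times):
    """Return indices of frames whose render time exceeded the budget."""
    return [i for i, t in enumerate(frame_times) if t > FRAME_BUDGET]
```

A system that tracks dropped frames in this way could then trigger a graceful decay, such as skipping updates to peripheral regions of the next frame.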
  • the likelihood that a target frame rate can be achieved can be increased by reducing the complexity of rendering a 3D data visualization.
  • computational efficiencies are achieved by creating group 3D objects that are essentially the aggregation of a number of visible 3D objects. Reducing the number of objects can decrease the computation associated with aspects of the rendering pipeline including the processing performed by the physics engine to detect collisions between 3D objects and the drawing process itself.
  • a single group 3D object is created using all of the 3D objects corresponding to visible data points within a 3D graph.
  • a number of group 3D objects that is smaller than the total number of visible 3D objects is created.
  • the group 3D objects are simply meshes having the shape of a set of 3D objects.
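The mesh aggregation behind a group 3D object can be sketched as below; the representation of a mesh as a vertex list plus index-tuple faces is an illustrative assumption, not the patent's data structure.

```python
def merge_meshes(meshes):
    """Merge a list of (vertices, faces) meshes into one group mesh.

    Face indices of each constituent mesh are offset by the number of
    vertices already accumulated, so each face keeps referencing the
    vertices of the mesh it came from.
    """
    group_vertices, group_faces = [], []
    for vertices, faces in meshes:
        offset = len(group_vertices)
        group_vertices.extend(vertices)
        group_faces.extend(tuple(i + offset for i in face) for face in faces)
    return group_vertices, group_faces
```

Rendering one merged mesh instead of thousands of individual objects is what reduces per-object overhead in the physics and drawing stages.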
  • FIG. 8B A process for rendering a 3D data visualization using group 3D objects in accordance with an embodiment of the invention is illustrated in FIG. 8B.
  • the process 850 commences with the instantiation (852) of a set of 3D data objects that include multiple visualization attributes.
  • the visualization attributes of the 3D data objects can be determined (854) in a manner similar to that described above using a set of data dimension mappings.
  • the data dimension mappings define data dimension values in a visualization table that are processed to determine a specific visualization attribute of a 3D object.
  • One or more group 3D objects are created (856) by generating the mesh and texture of each group 3D object using the meshes and textures of multiple visible 3D objects. In several embodiments, as many as 100,000 3D objects are utilized to create (856) a group 3D object.
  • the specific number of 3D objects utilized to create a group 3D object typically depends upon the requirements of a given application.
  • User pose is determined (858), and the group 3D objects are illuminated (860) based upon the user pose.
  • the group 3D objects are drawn on a per vertex basis and each vertex is illuminated using a directional light source with a direction determined based upon the line of sight from the user viewpoint (or a point close to the user viewpoint) to the vertex. Collision processing can then be performed.
  • the virtual space can then be rendered (864) from the viewpoint of the user.
  • a 3D display is rendered.
  • the rendered display drives a 2D display device.
  • Utilizing group objects can significantly reduce processing associated with interactively rendering a 3D data visualization at high frame rates.
  • grouping objects means that modifying a single 3D object incurs the same processing overhead as modifying all 3D objects in the group. Accordingly, consideration of the number of 3D objects corresponding to data points combined into a single 3D group object can achieve a balance between reduced computational overhead when interacting with a 3D graph and preserved interactivity when the 3D graph changes due to updates in data mappings. Animation of movement from one 3D graph to another as mappings of data dimensions to attribute values change is discussed below.
  • Group 3D objects can be utilized to animate groups of data points in batches and achieve high frame rates during animations.
  • 3D data visualization systems in accordance with various embodiments of the invention have the capacity to generate visualizations of eight or more dimensions of data.
  • a challenge with representing highly dimensional data in 3D is that the 3D data visualization inherently introduces variation in the appearance of 3D objects that is unrelated to the underlying attributes of the 3D objects. For example, a user may have difficulty perceiving the relative size of 3D objects having different shapes that are located at different distances from the user.
  • 3D data visualizations can be enhanced by utilizing shapes that preserve the ability of a user to differentiate variations in size due to depth and variations in size as an attribute of the data, and/or illumination models that illuminate 3D objects in the same way across the field of view of the user and do not involve casting of shadows on other 3D objects.
  • Use of depth perception preserving shapes and illumination models in the 3D visualization of data to increase the ability of users to perceive structure within data in accordance with various embodiments of the invention are discussed further below.
  • Preserving depth perception can be important to a user's ability to comprehend the dimensions of the data being visualized. For example, when size is utilized to visualize a data dimension, the size of a specific 3D object rendered within the field of view of a user will depend both upon the size attribute of the 3D object and the distance of the 3D object from the user within the virtual space. Where a data dimension is also mapped to a shape attribute of the 3D objects, then the shape of the object can further confuse size comparisons (in a manner that is compounded by differences in depth). Experiments have shown that relative size perception for different shapes, such as cubes, spheres, or cylinders, is affected by various factors that include distance, alignment of the objects, color and illumination.
  • polyhedrons with many faces, such as (but not limited to) icosahedrons that have sphere-like appearances, are utilized as the shapes of 3D objects within 3D data visualizations.
  • spheres are complex shapes to render in 3D and are typically rendered as polyhedra with hundreds of faces. Accordingly, utilization of polyhedra with smaller numbers of faces in the range of tens of faces as opposed to hundreds of faces can significantly reduce the computation associated with rendering a 3D graph.
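The face-count saving noted above can be made concrete. A subdivided icosahedron ("icosphere") starts at 20 triangular faces and quadruples its face count with each subdivision pass; the sketch below simply tabulates that growth, as an illustration rather than anything from the specification.

```python
def icosphere_face_count(subdivisions):
    """An icosahedron has 20 triangular faces; each subdivision pass
    splits every face into four, so face count grows geometrically."""
    return 20 * 4 ** subdivisions
```

A raw icosahedron (20 faces) or one subdivision (80 faces) stays in the "tens of faces" regime, while two or more subdivisions already approach the hundreds of faces of a typical rendered sphere.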
  • a 3D visualization of a multidimensional data space in which data dimensions are mapped to shape and size attributes of 3D objects is shown from several viewpoints in FIGS. 9A - 9C.
  • data points are visualized using depth perception preserving 3D shapes that assist a user in determining the relative size of a 3D object given the distance of the 3D object from the user within the virtual space.
  • FIG. 10 shows a small set of 3D object shapes that are designed to be recognizable even in highly dense plots. The construction of these 3D object shapes was done with several criteria in mind: front profile, top profile, protrusions, and corners versus curved regions.
  • the initial templates were basic shapes common to 2D plots. These include circles, triangles, stars, boxes, pluses and exes. 3D shapes were derived with these listed front profiles. Circles translate to spheres 1000 and tori 1008. Triangles translate to pyramids 1006 (or tetrahedra) and cones 1004. Boxes translate to cubes 1002 and cylinders 1010. While exes and pluses could be made into 3D objects in a similar fashion, each branch of the 3D shapes has a potential to extrude beyond an occluding object and be confused for a different type of 3D shape or a feature in a simpler 3D shape. The same difficulty can be encountered with star-plot based 3D shapes.
  • the shapes demonstrate a variety of top and bottom profiles: circle, square, point, triangle, and ellipse. So, while some glyphs do share the same front profile, particularly the cylinder and cube, the number of protruding corners (or lack thereof) and top profiles allow them to remain recognizable even in densely populated plot regions. Furthermore, lighting can accentuate the visual distinction between shapes, aiding in the ability to differentiate between them.
  • the cone 1004 and pyramid 1006, depending on orientation, can exhibit the same front profile, i.e. a triangle. Thus the cone 1004 is chosen to point upward, while the pyramid 1006 is chosen to have one horizontal edge 1012 at the top, pointing outward.
  • the extra features need not be included in 3D shapes utilized in accordance with various embodiments of the invention.
  • depth perception is preserved by allowing only one dimension of a 3D shape to vary as a visualization attribute.
  • the visualized 3D shapes have heights that are invariant, but widths that vary based upon the value of a mapped data dimension.
  • 3D shapes that are pill shaped, i.e. cylindrical with rounded or hemispherical ends, can be utilized. The widths of the pill shaped 3D shapes, i.e. the diameters of their cylindrical portions, are varied based upon the value of a mapped data dimension, while the heights of the pill shaped 3D shapes are invariant with data value. In this way, the width conveys information and the height provides a depth cue.
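A width-varies, height-fixed mapping of this kind can be sketched as follows; the value ranges and function name are illustrative assumptions, not values from the patent.

```python
def pill_dimensions(value, vmin, vmax, width_range=(0.1, 0.5), height=1.0):
    """Map a data value to the pill's width while keeping its height
    invariant, so width conveys data and height remains a depth cue."""
    t = (value - vmin) / (vmax - vmin)  # normalized position in the data range
    width = width_range[0] + t * (width_range[1] - width_range[0])
    return width, height
```

A value in the middle of its range yields a mid-range width, while every pill keeps the same height for size-by-distance comparison.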
  • the specific shapes that are utilized will largely depend upon the requirements of a given application. The manner in which illumination of 3D shapes can be utilized to enhance 3D data visualization in accordance with various embodiments of the invention is discussed below.
  • Illumination models used within a 3D data visualization can significantly impact the ease with which a user can interpret visualized data. As noted above, the effectiveness of a 3D data visualization can be decreased where the visualization introduces variation between the visual appearance of 3D objects that is unrelated to the data dimensions that are being visualized.
  • FIG. 11A shows the variation in appearance of a 3D object having a spherical shape under constant illumination by three static point sources when viewed from different directions. As can readily be appreciated, these variations in appearance are unrelated to the data being visualized.
  • 3D data visualization systems in accordance with a number of embodiments of the invention utilize an illumination model in which a separate directional light source originating at (or adjacent) the viewpoint of the user is used to illuminate each 3D data object within the field of view of the user when rendering the 3D data visualization.
  • The same 3D data object from the same viewpoints shown in FIG. 11A is illustrated in FIG. 11B under this illumination model.
  • a process for updating the illumination of 3D objects as the field of view of a user within a virtual space changes in accordance with an embodiment of the invention is illustrated in FIG. 13.
  • the process 1300 includes obtaining (1302) pose information and then using the pose information to determine the position and field of view of the user.
  • 3D objects within the field of view of the user can be identified and selected (1304, 1310).
  • the position of the 3D object relative to the location of the user within the virtual space can be used to determine (1306) a direction of illumination.
  • the direction of illumination is typically selected as the direction from the user location to the 3D object or the direction from a point adjacent the location of the user to the 3D object.
  • the direction of illumination can, however, vary based upon the requirements of a given application.
  • each 3D object within the field of view of the user is illuminated (1308) using a directional illumination source, and the process completes when the illumination of all 3D objects within the user's field of view is updated (1310).
  • a directional light mimics illumination by the sun and involves using an illumination model in which parallel light rays travel in a single direction.
  • Directional illumination of multiple 3D objects within a virtual space based upon pose of a viewer is conceptually illustrated in FIG. 14. While using a separate directional light source to illuminate each 3D object provides significant advantages in providing uniform illumination of 3D objects across the field of view of a user, other illumination models that achieve uniform illumination can also be utilized as appropriate to the requirements of a given application in accordance with various embodiments of the invention.
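The per-object light direction described above amounts to a normalized vector from the user viewpoint to each object; a minimal sketch (illustrative names, not the patent's implementation) is:

```python
import math

def light_direction(user_pos, object_pos):
    """Unit vector from the user viewpoint toward a 3D object, used as the
    direction of that object's dedicated directional light source."""
    d = [o - u for u, o in zip(user_pos, object_pos)]
    norm = math.sqrt(sum(c * c for c in d))
    return tuple(c / norm for c in d)
```

Because each object gets its own direction computed from the same viewpoint, objects across the field of view are lit uniformly rather than varying with their position relative to fixed scene lights.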
  • the 3D data visualization can be enhanced by configuring each of the 3D objects so that they do not cast shadows within the virtual space.
  • 3D data visualization systems in accordance with a number of embodiments of the invention add additional 3D objects within the virtual space in the form of interaction primitives that assist a user in maintaining an awareness of the user's position within the virtual space and orientation relative to the data.
  • User primitives that are utilized within 3D data visualization systems in accordance with various embodiments of the invention are discussed further below.
  • 3D data visualization systems in accordance with several embodiments of the invention utilize interaction primitives to provide visual anchors for users to enable them to maintain a sense of their relative orientation to visualized data.
  • the 3D graph containing 3D objects is bounded by a cube with a grid pattern visible on its interior surfaces.
  • the position of the user can be utilized to make one or more surfaces of the cube completely transparent so that the grid lines do not partially occlude the 3D objects within the 3D graph.
  • labelled axes are continuously shown within the field of view of the user to provide the user with a visual cue concerning the orientation of the data.
  • FIG. 15A A 3D graph including interaction primitives in the form of grids, axes, and axis labels generated by a 3D data visualization system in accordance with an embodiment of the invention is illustrated in FIG. 15A.
  • the 3D objects 1500 are contained within a 3D graph that is bounded by planar grids 1502.
  • Three color coded orthogonal axes 1504 provide visual anchors that are reinforced by directional axis labels 1506.
  • While interaction primitives, and uses of interaction primitives that assist a user in maintaining a sense of orientation relative to a 3D graph during interactive exploration (particularly within the 3D graph), are described above, any of a variety of interaction primitives that provide the user with visual cues regarding orientation can be utilized as appropriate to the requirements of a given application in accordance with many embodiments of the invention.
  • a major problem associated with pattern recognition in high-dimensional data sets is the curse of dimensionality; this can be addressed by selecting only a subset of features that are rich in discriminatory power with respect to the data set being visualized.
  • Feature selection is often preferable to feature transformation (e.g., Principal Component Analysis) when the meaning of the features is important and the goal of the 3D data visualization is to find relationships between the features in order to better understand the data set.
  • the data dimensions can be separated into data dimensions that are numerical (e.g. a volatility measure) or categorical (e.g. Region).
  • 3D data visualization systems in accordance with many embodiments of the invention are configured to automatically detect whether data is numerical or categorical during ingest and enable users to modify data dimension classification where incorrect (e.g. ZIP codes may be identified as numerical, but are actually categorical - "91107" is not greater than "91101").
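A numerical-versus-categorical detection pass of this kind can be sketched with a simple heuristic; the name hints and return labels below are illustrative assumptions, not the patent's classifier.

```python
def infer_dimension_type(name, values, hints=("zip", "code", "id")):
    """Heuristic classifier: numeric-looking columns are 'numerical' unless
    the column name hints at an identifier-like categorical dimension."""
    if any(h in name.lower() for h in hints):
        return "categorical"
    try:
        for v in values:
            float(v)  # every value must parse as a number
        return "numerical"
    except (TypeError, ValueError):
        return "categorical"
```

A real system would pair such a heuristic with the user override described above, since no automatic rule handles every column correctly.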
  • 3D data visualization systems perform feature selection processes to provide recommendations concerning specific mappings of data dimensions to visualization attributes.
  • a user selects a feature of interest.
  • a feature selection process can then be performed with respect to both the numerical and categorical data dimensions to determine the dimensions that are most relevant to the feature of interest.
  • separate feature selection processes are performed with respect to the numerical data dimensions and with respect to the categorical data dimensions.
  • the feature selection process utilizes univariate feature selection.
  • any of a variety of feature selection processes can be utilized as appropriate to the requirements of a given application.
  • the feature selection process yields an ordered list of features.
  • the 3D data visualization system generates separate ordered lists of numerical and categorical data dimensions. In some embodiments, only a subset of data dimensions is considered in forming the ordered lists.
  • the ordering information can be utilized to generate or recommend specific mappings of data dimensions to particular visualization attributes.
  • a data visualization is generated in which the four most important numerical data dimensions are mapped to the X, Y, and Z spatial position coordinates of the 3D objects and the size attributes of the 3D objects.
  • the two most important categorical attributes are mapped to the Show By and shape visualization attributes.
  • a dialogue can be provided that makes recommendations and/or further recommendations concerning data dimensions that can be assigned to additional attributes.
  • the feature of interest that was utilized to generate the importance ordering of the other data dimensions can be mapped to the color visualization attribute.
  • the specific recommended mappings are determined based upon the relative relevance of different numerical and categorical data dimensions.
  • in some instances, categorical data dimensions will be most important from a quantitative perspective and/or from the perspective of the user.
  • mapping categorical variables to the X, Y, and Z spatial position coordinates of the 3D objects can be utilized to generate a 3D swarm plot or other type of categorical scatter plot with non-overlapping points.
  • FIG. 15B A user interface showing recommendations concerning mappings of specific data dimensions to particular attributes of 3D objects visible in a 3D data visualization in accordance with an embodiment of the invention is illustrated in FIG. 15B.
  • any of a variety of techniques for recommending specific data dimensions to map to particular visualization attributes including (but not limited to) techniques that perform importance ordering utilizing the Fast Relief Algorithm, the Fisher Discriminant Ratio, Correlation-based Feature Selection, a Fast Correlation Based Filter, and/or Multi Class Feature Selection can be utilized as appropriate to the requirements of specific applications in accordance with various embodiments of the invention.
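The importance ordering that drives these recommendations can be sketched with a simple univariate criterion. The use of absolute Pearson correlation below is one stand-in for the univariate feature selection mentioned above, not the patent's specific algorithm.

```python
import math

def rank_features(columns, target):
    """Order numerical data dimensions by absolute Pearson correlation
    with a user-selected feature of interest (a simple univariate
    stand-in for the feature selection step)."""
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / math.sqrt(vx * vy) if vx and vy else 0.0
    return sorted(columns, key=lambda name: abs(corr(columns[name], target)),
                  reverse=True)
```

The first entries of the resulting ordered list would then be the candidates for the X, Y, Z, and size attributes, with the remainder offered through a recommendation dialogue.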
  • a specific data dimension can be mapped to a visualization attribute in a variety of ways.
  • numerical data dimensions are mapped to a continuous visualization attribute such as (but not limited to) color in a non-linear manner so that the greatest differences in the colors of the 3D objects occurs with respect to the range of values that conveys the most information regarding the relationship between the data dimensions and other data dimensions or a target feature.
  • when a data dimension is mapped to color, it is mapped to a discrete number of colors to aid visual discrimination of the color attribute.
  • any of a variety of techniques can be utilized to map values of a data dimension to specific colors as appropriate to the requirements of a given application in accordance with various embodiments of the invention.
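One simple non-linear mapping of a numerical dimension onto a discrete set of colors is quantile binning, where each color covers an equal share of the data rather than an equal slice of the value range; the sketch below is illustrative, not the patent's mapping.

```python
def quantile_color_bins(values, n_colors):
    """Assign each value a discrete color index by rank, so each color
    covers an (approximately) equal share of the data regardless of how
    skewed the raw value range is."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for rank, i in enumerate(order):
        bins[i] = rank * n_colors // len(values)
    return bins
```

For heavily skewed data, this concentrates color differences in the densely populated part of the range, which is where they convey the most information.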
  • 3D data visualization systems use animation to illustrate correspondence between specific 3D objects representing discrete data points as data mappings change.
  • the ability to observe a point moving from a location in a first 3D graph to a second 3D graph in which one or more of the attributes of the 3D object corresponding to the data point are modified can enable the user to observe relationships that may exist within the data that can be revealed by mapping alternative combinations of data dimensions to 3D object attributes.
  • a sequence of 3D data visualizations in which the X attribute of the 3D data objects is modified from a first data dimension (i.e. "Age") to a second data dimension (i.e. "YearsOnJob") in accordance with an embodiment of the invention is illustrated in FIGS. 16A - 16D.
  • additional insights into the data can be provided by animating different subsets of the data at different rates.
  • a clustering algorithm can be utilized to analyze the data in a first 3D graph and the animation can involve movement of the 3D data objects in the different clusters at different speeds.
  • the user can control the animation so that sets of 3D objects commence movement upon receipt of user input directing their movement. In this way, a user can isolate a specific set of 3D objects and observe the way in which they map from one 3D graph to another.
  • a user can repeat the animations in a loop and/or reverse the animation to obtain further insights.
  • any of a variety of animation techniques can be utilized to show changes in the attributes of 3D objects (which may include changes in shape, color, size, texture, etc. in addition to changes in position) from one 3D graph to another 3D graph as data dimension mappings are changed using 3D data visualization systems as appropriate to the requirements of specific applications in accordance with various embodiments of the invention.
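The core of such an animation is interpolating each object's attributes between the two graphs; a minimal sketch for position (linear interpolation, illustrative rather than the patent's animation engine) is:

```python
def interpolate_positions(start, end, t):
    """Positions of every 3D object at animation fraction t in [0, 1]
    when moving linearly between two 3D graph layouts."""
    return [tuple(a + (b - a) * t for a, b in zip(p0, p1))
            for p0, p1 in zip(start, end)]
```

Animating different clusters at different speeds, as described above, would amount to advancing t at a different rate for each subset of objects.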
  • a user typically interacts with a 3D graph in different ways depending upon the amount of free space and freedom of movement of the user within the real world environment in which the 3D data visualization is being performed.
  • the user When a user interacts with a 3D graph while seated at a desk, the user typically prefers the 3D graph to be displayed in a compact manner (e.g. a one foot x one foot x one foot cube).
  • the 3D graph is within a virtual environment.
  • the virtual environment and/or a mixed reality environment can contain affordances that enable manipulation of data.
  • the appearance of the 3D graph moving relative to the virtual environment and its affordances can be achieved by moving the 3D objects associated with the virtual environment, together with the user's viewpoint, within the virtual space, without moving the 3D objects that form the 3D graph.
  • the manner in which the user interacts with the 3D graph can change as the relative scale of the 3D graph changes with respect to the virtual environment and with the distance of the user's viewpoint from the 3D graph within the virtual environment.
  • a 3D data visualization system provides affordances within the 3D user interface displayed to the user that control the resizing of a 3D graph.
  • the 3D data visualization system can use information concerning the real world environment in which the 3D data visualization is being performed and can adjust the size of the 3D visualization of the multidimensional data space to fit within the environment.
  • FIG. 17A Affordances within a VR user interface that enable a user to control the size of a 3D data visualization within the virtual world generated by a 3D data visualization system in accordance with an embodiment of the invention is illustrated in FIG. 17A.
  • the virtual environment 1700 includes a virtual office cube with a first affordance 1702 that can resize a 3D graph to a predetermined "desktop" scale.
  • the virtual environment 1700 also includes a second affordance 1704 that can resize a 3D graph to a predetermined "sit down" scale, which is larger than the "desktop" scale, and a third affordance 1706 that can resize a 3D graph to the largest predetermined "standing" scale.
  • the 3D data visualization is resized.
  • the user can further manipulate the 3D data visualization including changing the scaling of the 3D data visualization (e.g. shrinking a 3D graph to "desktop” scale and then expanding the 3D graph to move through the data toward a specific cluster of interest).
  • FIGS. 17B - 17F The manner in which a 3D data visualization can be resized using affordances within a 3D user interface is conceptually illustrated in FIGS. 17B - 17F.
  • a user's pose (1708) and a 3D data visualization (1710) within the virtual environment are illustrated.
  • the affordance (1702) that can be utilized to resize the 3D data visualization to "desktop" scale is visible in FIG. 17C.
  • Movement of the 3D data visualization (1710) over the affordance (1702) and the resulting resized 3D data visualization (1712) are conceptually illustrated in FIGS. 17E and 17F. As noted above, resizing does not fix the size of the 3D graph.
  • the user can continue to modify the size of the 3D data visualization and the manner in which the 3D data visualization system responds to user inputs may change (e.g. the magnitude of a 3D gesture movement can be responded to differently based upon the size of the 3D data visualization).
  • the affordance simply provides a mechanism by which a user can move between different contexts and have the 3D data visualization system automatically modify the 3D data visualization in ways that can include (but are not limited to) reflecting differences in the real world space and/or real world freedom of movement available for the user to explore the data.
  • the increased scale with which a user can visualize the 3D data visualization when the user's context shifts to a standup mode in which the user has freedom of movement is conceptually illustrated in FIG. 17F.
  • 3D data visualization systems can support any number of different modes and/or contexts as appropriate to the requirements of a given application in accordance with various embodiments of the invention.
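The preset-scale behavior of the affordances can be sketched as a lookup of target sizes per context. The numeric edge lengths below are illustrative assumptions; the patent does not specify values.

```python
# Illustrative preset edge lengths (in metres) for the three contexts.
SCALE_PRESETS = {"desktop": 0.3, "sit_down": 1.0, "standing": 2.5}

def rescale_factor(current_edge, preset):
    """Uniform scale factor that resizes a 3D graph with the given edge
    length to the preset context scale."""
    return SCALE_PRESETS[preset] / current_edge
```

Dropping a graph on an affordance would then apply this factor uniformly, after which the user remains free to rescale manually as described above.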
  • the affordances within the 3D user interface that automatically modify the 3D data visualization are not limited to affordances that simply resize the data.
  • a rich array of affordances are offered including (but not limited to) affordances that respond to movement of a 3D graph over the affordance by: applying a machine learning algorithm to the data (e.g. k-means clustering); and/or generating a new 3D visualization of multidimensional data represented by a single 3D object within an initial 3D visualization.
  • automated actions such as (but not limited to) resizing can be performed in response to predetermined inputs that can include (but are not limited to) gesture inputs, speech inputs, inputs via one or more input devices, and/or any combination or sequence of inputs.
  • 3D data visualization systems are not limited to use of any specific affordances and can utilize any affordance, set of affordances, and/or input modalities to interact with a 3D graph as appropriate to the requirements of a given application.
  • the 3D data visualization system determines real world context and dynamically modifies the rendering of a 3D data visualization so that it is realistically contained within the real world environment. For example, instead of a virtual cube, similar resizing operations can be performed within a real cube and/or office space.
  • a depth sensing camera system is utilized to acquire information concerning the volume of free space surrounding a user. In other embodiments, any of a variety of appropriate machine vision techniques can be utilized.
  • the 3D data visualization system detects changes in the volume of space related to the change in pose and/or viewpoint of the user and can resize a 3D data visualization in a manner that is appropriate to the new volume of free space visible from the user's viewpoint.
  • a 3D data visualization system will typically not place limits on the extent to which a user can resize a 3D data visualization based upon the volume of free space available to contain the visualization. Instead, a user can expand the 3D data visualization in a way that enables interactive exploration of the data.
  • a variety of input modalities are supported by 3D data visualization systems.
  • a user can interact with a 3D data visualization system via a desktop device using a conventional Windows, Icons, Menus, Pointer (WIMP) interface.
  • when a user transitions to interacting with the 3D data visualization system via an immersive 3D display, such as (but not limited to) an AR, MR, or VR headset, user input can be obtained using a variety of additional input modalities.
  • 3D gesture based inputs can be observed using a machine vision system.
  • a user is provided with a wand-like user input device having an elongated handle that communicates wirelessly with the 3D data visualization system.
  • the wand has a single button and communicates via a wireless communication link.
  • the 3D data visualization system can obtain user input by tracking the pose of the wand and the status of the button.
  • the button can be interpreted by the 3D data visualization system as conveying different information.
  • a simple input modality involves allowing the user to move the position of the 3D data visualization relative to the user when the button is not being pressed, and rotate the 3D data visualization when the button is pressed.
  • user gaze direction and/or a remote control that simply includes one or more buttons can be used as user inputs.
  • any of a variety of processing can be initiated based upon a combination of a pose input and a button input as appropriate to a specific user interface context and the requirements of a given application in accordance with various embodiments of the invention.
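The simple wand modality described above, translate when the button is released, rotate when it is pressed, can be sketched as a small dispatch function; the names and tuple representation are illustrative assumptions.

```python
def interpret_wand(pose_delta, button_pressed):
    """Dispatch a wand pose change: translate the 3D data visualization
    when the button is released, rotate it when the button is pressed."""
    return ("rotate", pose_delta) if button_pressed else ("translate", pose_delta)
```

Richer modalities (press duration, double-click, gaze combination) would extend this dispatch with additional states rather than change its structure.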
  • any of a variety of additional input modalities can also be supported as appropriate to the needs of a given 3D data visualization application.

PCT/US2016/053842 2015-09-24 2016-09-26 Systems and methods for data visualization using three-dimensional displays WO2017054004A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018502734A JP2018533099A (ja) 2015-09-24 2016-09-26 三次元ディスプレイを用いたデータ可視化システム及び方法
EP16849900.2A EP3353751A4 (en) 2015-09-24 2016-09-26 SYSTEMS AND METHODS FOR VISUALIZING DATA USING THREE DIMENSIONAL DISPLAY DEVICES

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562232119P 2015-09-24 2015-09-24
US62/232,119 2015-09-24
US201662365837P 2016-07-22 2016-07-22
US62/365,837 2016-07-22

Publications (2)

Publication Number Publication Date
WO2017054004A1 true WO2017054004A1 (en) 2017-03-30
WO2017054004A8 WO2017054004A8 (en) 2018-02-08

Family

ID=58387512

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/053842 WO2017054004A1 (en) 2015-09-24 2016-09-26 Systems and methods for data visualization using tree-dimensional displays

Country Status (4)

Country Link
US (2) US9665988B2 (ja)
EP (1) EP3353751A4 (ja)
JP (1) JP2018533099A (ja)
WO (1) WO2017054004A1 (ja)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2669716C1 (ru) * 2017-05-12 2018-10-15 Limited Liability Company "ВИЗЕКС ИНФО" System and method for processing and analyzing large volumes of data
JP2019046469A (ja) * 2017-08-31 2019-03-22 Fujitsu Limited Imaging of multivariate data
US10417812B2 (en) 2015-09-24 2019-09-17 California Institute Of Technology Systems and methods for data visualization using three-dimensional displays
US10621762B2 (en) 2018-05-14 2020-04-14 Virtualitics, Inc. Systems and methods for high dimensional 3D data visualization
JP7461940B2 (ja) 2018-10-21 2024-04-04 Oracle International Corporation Interactive data explorer and 3D dashboard environment

Families Citing this family (44)

Publication number Priority date Publication date Assignee Title
US10127574B2 (en) * 2011-11-30 2018-11-13 Cynthia Brown Internet marketing analytics system
US11232626B2 (en) * 2011-12-21 2022-01-25 Twentieth Century Fox Film Corporation System, method and apparatus for media pre-visualization
US10453242B2 (en) * 2014-10-29 2019-10-22 Hewlett-Packard Development Company, L.P. Visualization including multidimensional graphlets
US10419770B2 (en) 2015-09-09 2019-09-17 Vantrix Corporation Method and system for panoramic multimedia streaming
US11108670B2 (en) 2015-09-09 2021-08-31 Vantrix Corporation Streaming network adapted to content selection
US10694249B2 (en) * 2015-09-09 2020-06-23 Vantrix Corporation Method and system for selective content processing based on a panoramic camera and a virtual-reality headset
US11287653B2 (en) 2015-09-09 2022-03-29 Vantrix Corporation Method and system for selective content processing based on a panoramic camera and a virtual-reality headset
US10134179B2 (en) 2015-09-30 2018-11-20 Visual Music Systems, Inc. Visual music synthesizer
US20180300326A1 (en) * 2017-04-17 2018-10-18 The Boeing Company Three-Dimensional Massive Model Visualization Database System
JP6972647B2 (ja) * 2017-05-11 2021-11-24 Fujifilm Business Innovation Corp. Three-dimensional shape data editing device and three-dimensional shape data editing program
US11620315B2 (en) * 2017-10-09 2023-04-04 Tableau Software, Inc. Using an object model of heterogeneous data to facilitate building data visualizations
WO2019113299A1 (en) * 2017-12-06 2019-06-13 Reconstructor Holdings Llc Methods and systems for representing relational information in 3d space
US10929476B2 (en) * 2017-12-14 2021-02-23 Palantir Technologies Inc. Systems and methods for visualizing and analyzing multi-dimensional data
US20190188893A1 (en) * 2017-12-18 2019-06-20 Dataview Vr, Llc Simulated reality data representation system and method
US10854003B2 (en) 2018-01-15 2020-12-01 Worcester Polytechnic Institute Visualization of network data as a three-dimensional hierarchical data structure in a mixed reality environment
US10855564B2 (en) 2018-01-15 2020-12-01 Worcester Polytechnic Institute Visualization of network data as a three-dimensional hierarchical data structure in a mixed reality environment
US10438414B2 (en) 2018-01-26 2019-10-08 Microsoft Technology Licensing, Llc Authoring and presenting 3D presentations in augmented reality
CN108346172B (zh) * 2018-02-26 2021-10-08 中译语通科技股份有限公司 Multi-dimensional spatial data lattice VR display method and system
US11461629B1 (en) * 2018-03-05 2022-10-04 Meta Platforms, Inc. Neural network model visualization
US10963273B2 (en) 2018-04-20 2021-03-30 Facebook, Inc. Generating personalized content summaries for users
US11886473B2 (en) 2018-04-20 2024-01-30 Meta Platforms, Inc. Intent identification for agent matching by assistant systems
US10861238B2 (en) 2018-05-14 2020-12-08 Microsoft Technology Licensing, Llc Experiential representation of data in mixed reality
US11562514B2 (en) * 2018-09-07 2023-01-24 Siemens Healthcare Diagnostics Inc. Instrument analyzers, data displays, and display methods
US11107274B2 (en) * 2018-09-17 2021-08-31 Purdue Research Foundation Methods of processing three dimensional models
US11636650B2 (en) * 2018-09-24 2023-04-25 K2M, Inc. System and method for isolating anatomical features in computerized tomography data
US10353073B1 (en) 2019-01-11 2019-07-16 Nurulize, Inc. Point cloud colorization system with real-time 3D visualization
JP7261083B2 (ja) * 2019-05-09 2023-04-19 Hitachi, Ltd. Software analysis support system
US11037126B2 (en) * 2019-07-09 2021-06-15 Visa International Service Association Systems and methods for assessing electronic payment readiness
US11010015B2 (en) 2019-08-07 2021-05-18 Fmr Llc Systems and methods for filtering data in virtual reality environments
CN112489185B (zh) * 2019-08-20 2023-12-12 黎欧思照明(上海)有限公司 An integrated lighting modeling method based on spatial data acquisition
CN110599600B (zh) * 2019-09-16 2023-05-23 海南诺亦腾海洋科技研究院有限公司 Automatic conversion method for multi-dimensional visualization models of data
CN111354085A (zh) * 2020-02-26 2020-06-30 广州奇境科技有限公司 Immersive interactive Box image production method
CN111563357B (zh) * 2020-04-28 2024-03-01 纳威科技有限公司 Three-dimensional visualization display method and system for electronic devices
US11487781B2 (en) 2020-05-08 2022-11-01 International Business Machines Corporation Visualizing sparse multi-dimensional data
US11282267B2 (en) 2020-07-02 2022-03-22 Cognizant Technology Solutions India Pvt. Ltd. System and method for providing automated data visualization and modification
US10949058B1 (en) 2020-09-03 2021-03-16 Fmr Llc Generating and manipulating three-dimensional (3D) objects in a 3D environment of an alternative reality software application
CN112419482B (zh) * 2020-11-23 2023-12-01 Taiyuan University of Technology Three-dimensional reconstruction method for the pose of mine hydraulic support groups using depth point cloud fusion
US20220375156A1 (en) * 2021-05-19 2022-11-24 Red Hat, Inc. Multi-sensory representation of datasets in a virtual reality environment
WO2022251745A1 (en) * 2021-05-28 2022-12-01 Imply Data, Inc. Dynamic query engine for data visualization
KR102319568B1 (ko) * 2021-05-31 2021-11-01 주식회사 트라이폴리곤 3D modeling method and device for performing the method
US11551402B1 (en) * 2021-07-20 2023-01-10 Fmr Llc Systems and methods for data visualization in virtual reality environments
US20230360292A1 (en) * 2022-05-04 2023-11-09 Red Hat, Inc. Visual database system for multidimensional data representation
US12007963B1 (en) 2023-04-25 2024-06-11 Bank Of America Corporation System and method for processing resources and associated metadata in the metaverse using knowledge graphs
CN117853824B (zh) * 2024-03-04 2024-05-07 北京国星创图科技有限公司 A big-data-based 3D sandbox projection analysis method

Citations (5)

Publication number Priority date Publication date Assignee Title
US6057856A (en) * 1996-09-30 2000-05-02 Sony Corporation 3D virtual reality multi-user interaction with superimposed positional information display for each user
US20130097563A1 (en) * 2010-06-24 2013-04-18 Associacao Instituto Nacional De Matematica Pura E Aplicada Multidimensional-data-organization method
WO2014130044A1 (en) * 2013-02-23 2014-08-28 Hewlett-Packard Development Company, L.P. Three dimensional data visualization
WO2014193418A1 (en) * 2013-05-31 2014-12-04 Hewlett-Packard Development Company, L.P. Three dimensional data visualization
US20150205840A1 (en) * 2014-01-17 2015-07-23 Crytek Gmbh Dynamic Data Analytics in Multi-Dimensional Environments

Family Cites Families (25)

Publication number Priority date Publication date Assignee Title
US5461708A (en) 1993-08-06 1995-10-24 Borland International, Inc. Systems and methods for automated graphing of spreadsheet information
US6154723A (en) 1996-12-06 2000-11-28 The Board Of Trustees Of The University Of Illinois Virtual reality 3D interface system for data creation, viewing and editing
US6456285B2 (en) * 1998-05-06 2002-09-24 Microsoft Corporation Occlusion culling for complex transparent scenes in computer generated graphics
US6750864B1 (en) 1999-11-15 2004-06-15 Polyvista, Inc. Programs and methods for the display, analysis and manipulation of multi-dimensional data implemented on a computer
CA2403300A1 (en) * 2002-09-12 2004-03-12 Pranil Ram A method of buying or selling items and a user interface to facilitate the same
AU2003226357A1 (en) * 2002-04-10 2003-10-27 Imagine Xd, Inc. System and method for visualizing data
US8131471B2 (en) 2002-08-08 2012-03-06 Agilent Technologies, Inc. Methods and system for simultaneous visualization and manipulation of multiple data types
US8042056B2 (en) * 2004-03-16 2011-10-18 Leica Geosystems Ag Browsers for large geometric data visualization
US7283654B2 (en) 2004-08-26 2007-10-16 Lumeniq, Inc. Dynamic contrast visualization (DCV)
US20070211056A1 (en) * 2006-03-08 2007-09-13 Sudip Chakraborty Multi-dimensional data visualization
KR101257849B1 (ko) * 2006-09-29 2013-04-30 Samsung Electronics Co., Ltd. Method and apparatus for rendering 3D graphics objects, and method and apparatus for minimizing rendered objects therefor
US9098559B2 (en) * 2007-08-31 2015-08-04 Sap Se Optimized visualization and analysis of tabular and multidimensional data
GB2458927B (en) * 2008-04-02 2012-11-14 Eykona Technologies Ltd 3D Imaging system
KR101590763B1 (ko) * 2009-06-10 2016-02-02 Samsung Electronics Co., Ltd. Apparatus and method for generating 3D images using region extension of depth-map objects
US8890946B2 (en) 2010-03-01 2014-11-18 Eyefluence, Inc. Systems and methods for spatially controlled scene illumination
WO2013016733A1 (en) 2011-07-28 2013-01-31 Schlumberger Canada Limited System and method for performing wellbore fracture operations
US9824469B2 (en) 2012-09-11 2017-11-21 International Business Machines Corporation Determining alternative visualizations for data based on an initial data visualization
US20150113460A1 (en) 2013-10-23 2015-04-23 Wal-Mart Stores, Inc. Data Analytics Animation System and Method
US9734595B2 (en) 2014-09-24 2017-08-15 University of Maribor Method and apparatus for near-lossless compression and decompression of 3D meshes and point clouds
US9665988B2 (en) 2015-09-24 2017-05-30 California Institute Of Technology Systems and methods for data visualization using three-dimensional displays
DE102015221998B4 (de) 2015-11-09 2019-01-17 Siemens Healthcare Gmbh Method for assisting a reporting physician in describing the location of a target structure in a breast, device and computer program
US10977435B2 (en) 2015-12-28 2021-04-13 Informatica Llc Method, apparatus, and computer-readable medium for visualizing relationships between pairs of columns
US10482196B2 (en) 2016-02-26 2019-11-19 Nvidia Corporation Modeling point cloud data using hierarchies of Gaussian mixture models
US10362950B2 (en) 2016-06-24 2019-07-30 Analytics For Life Inc. Non-invasive method and system for measuring myocardial ischemia, stenosis identification, localization and fractional flow reserve estimation
US9953372B1 (en) 2017-05-22 2018-04-24 Insurance Zebra Inc. Dimensionality reduction of multi-attribute consumer profiles

Non-Patent Citations (1)

Title
See also references of EP3353751A4 *

Cited By (7)

Publication number Priority date Publication date Assignee Title
US10417812B2 (en) 2015-09-24 2019-09-17 California Institute Of Technology Systems and methods for data visualization using three-dimensional displays
RU2669716C1 (ru) * 2017-05-12 2018-10-15 Limited Liability Company "ВИЗЕКС ИНФО" System and method for processing and analyzing large volumes of data
JP2019046469A (ja) * 2017-08-31 2019-03-22 Fujitsu Limited Imaging of multivariate data
US10621762B2 (en) 2018-05-14 2020-04-14 Virtualitics, Inc. Systems and methods for high dimensional 3D data visualization
US10872446B2 (en) 2018-05-14 2020-12-22 Virtualitics, Inc. Systems and methods for high dimensional 3D data visualization
US11455759B2 (en) 2018-05-14 2022-09-27 Virtualitics, Inc. Systems and methods for high dimensional 3D data visualization
JP7461940B2 (ja) 2018-10-21 2024-04-04 Oracle International Corporation Interactive data explorer and 3D dashboard environment

Also Published As

Publication number Publication date
EP3353751A1 (en) 2018-08-01
US20170092008A1 (en) 2017-03-30
US9665988B2 (en) 2017-05-30
EP3353751A4 (en) 2019-03-20
WO2017054004A8 (en) 2018-02-08
JP2018533099A (ja) 2018-11-08
US10417812B2 (en) 2019-09-17
US20170193688A1 (en) 2017-07-06

Similar Documents

Publication Publication Date Title
US10417812B2 (en) Systems and methods for data visualization using three-dimensional displays
US10928974B1 (en) System and method for facilitating user interaction with a three-dimensional virtual environment in response to user input into a control device having a graphical interface
US11775080B2 (en) User-defined virtual interaction space and manipulation of virtual cameras with vectors
US9224237B2 (en) Simulating three-dimensional views using planes of content
US9437038B1 (en) Simulating three-dimensional views using depth relationships among planes of content
US9886102B2 (en) Three dimensional display system and use
CN105981076B (zh) Construction of a synthesized augmented reality environment
Capece et al. Graphvr: A virtual reality tool for the exploration of graphs with htc vive system
WO2022218146A1 (en) Devices, methods, systems, and media for an extended screen distributed user interface in augmented reality
CN116097316A (zh) Object recognition neural network for amodal center prediction
McCrae et al. Exploring the design space of multiscale 3D orientation
Yan et al. Multitouching the fourth dimension
CN104820584B (zh) A construction method and system for a 3D gesture interface oriented to natural manipulation of hierarchical information
Zhang et al. Physically interacting with four dimensions
Piumsomboon Natural hand interaction for augmented reality.
Lee et al. A Responsive Multimedia System (RMS): VR Platform for Immersive Multimedia with Stories
Park et al. 3D Gesture-based view manipulator for large scale entity model review
Pavlopoulou et al. A Mixed Reality application for Object detection with audiovisual feedback through MS HoloLenses
Shao Research on the Digital Promotion and Development of the Achang Forging Skills in Yunnan
채한주 Designing Physical-Object-Oriented Interactions in Augmented Reality
Palamidese A virtual reality interface for space planning tasks
Chen Data Visualization, A Survey
Fernando et al. Master of Science (M. Sc.)
CN115997385A (zh) Augmented reality-based interface display method, apparatus, device, medium and product
Huang et al. OpenGL based intuitive interaction technology for 3D graphical system by 2D devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16849900

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2018502734

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016849900

Country of ref document: EP