US20160011742A1 - System And Method For Providing User Access - Google Patents


Info

Publication number
US20160011742A1
Authority
US
United States
Prior art keywords
plurality
objects
user
frequency
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/859,907
Inventor
Robb Fujioka
Daniel Miyahara
Justin Nishiki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mattel Inc
Original Assignee
FUHU Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to U.S. Provisional Application No. 61/209,974
Priority to US 12/722,058 (patent US 9,141,261 B2)
Application filed by FUHU Inc
Priority to US 14/859,907
Publication of US 2016/0011742 A1
Assigned to Mattel, Inc. (assignors: FUHU, Inc.)
Application status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques using icons
    • G06F 3/0482 Interaction techniques involving interaction with lists of selectable items, e.g. menus
    • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/37 Details of the operation on graphic patterns
    • G09G 5/373 Details of the operation on graphic patterns for modifying the size of the graphic pattern
    • G09G 2320/00 Control of display operating conditions
    • G09G 2320/08 Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/04 Changes in size, position or resolution of an image
    • G09G 2340/045 Zooming at least part of an image, i.e. enlarging it or shrinking it
    • G09G 2340/0464 Positioning
    • G09G 2354/00 Aspects of interface with display user

Abstract

The present invention provides a system and method for providing a display, a graphical user interface rendered on the display having associated with it a plurality of objects representative of one or more computing functions, and a processor for varying the position of at least one of the plurality of objects, wherein the processor provides for the continuous repositioning of at least one of the plurality of objects.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 12/722,058 entitled “SYSTEM AND METHOD FOR PROVIDING USER ACCESS” filed on Mar. 11, 2010, that claims the benefit of U.S. Provisional Patent Application No. 61/209,974 entitled “SYSTEM AND METHOD FOR PROVIDING USER ACCESS” filed on Mar. 11, 2009, both of which are hereby incorporated by reference.
  • FIELD
  • The present invention is directed to an icon management system and, more particularly, to a graphical user interface for the representation of and interaction with one or more objects, and a method of making and using same.
  • BACKGROUND
  • Existing operating systems (OS), graphical user interfaces (GUI), toolbars and docks are constrained in many ways, including at least because they have a rigidly structured layout and/or are limited in the number of objects they can represent in the available screen space. Further, the typical layout of icons in a computer environment generally restricts the icons to a certain portion of the display. For example, a typical arrangement of icons on a desktop of an OS is restricted to the left quadrant of the screen, while a dock is restricted to the lower portion of the display area. This restraint limits the number of icons which may be practically represented by the system and limits the ability of the user to easily, visually locate and access specific icons or objects.
  • Thus, there exists a need for an icon management system and, more particularly, for a graphical user interface and/or a graphical presentation of an OS for the representation of one or more objects, and a method of making and using same, which may allow a user to more easily identify, organize and access system icons.
  • SUMMARY
  • The present invention includes at least a graphical user interface for the display and selection of icons and objects associated with system accessible applications and information.
  • Thus, the present invention provides an icon management system and, more particularly, a graphical user interface and/or graphical OS for the representation of and interaction with one or more objects, and a method of making and using same.
  • BRIEF DESCRIPTION OF THE FIGURES
  • The present invention will be described hereinbelow in conjunction with the following figures, in which like numerals represent like items, and wherein:
  • FIG. 1 illustrates an exemplary embodiment of the present invention;
  • FIG. 2 illustrates an exemplary embodiment of the present invention;
  • FIG. 3 illustrates an exemplary embodiment of the present invention;
  • FIG. 4 illustrates an exemplary embodiment of the present invention;
  • FIG. 5 illustrates an exemplary embodiment of the present invention;
  • FIG. 6 illustrates an exemplary embodiment of the present invention;
  • FIG. 7 illustrates an exemplary embodiment of the present invention;
  • FIG. 8 illustrates an exemplary embodiment of the present invention;
  • FIG. 9 illustrates an exemplary embodiment of the present invention; and
  • FIG. 10 illustrates an exemplary embodiment of the present invention.
  • DETAILED DESCRIPTION
  • In the description that follows, the invention is described with reference to acts and symbolic representations of operations that are performed by one or more computers, unless indicated otherwise. As such, it will be understood that such acts and operations, which are at times referred to as being computer-executed, include the manipulation by the processing unit of the computer of electrical signals representing data in a structured form. This manipulation transforms the data or maintains it at locations in the memory system of the computer, which reconfigures or otherwise alters the operation of the computer in a manner well understood by those skilled in the art. The data structures where data are maintained are physical locations of the memory that have particular properties defined by the format of the data. However, while the invention is being described in the foregoing context, it is not meant to be limiting as those of skill in the art will appreciate that several of the acts and operations described hereinafter may also be implemented in hardware.
  • It is also to be understood that the figures and descriptions of the present invention have been simplified to illustrate elements that are relevant for a clear understanding of the present invention, while eliminating, for purposes of clarity, many other elements found in a typical system and method. Those of ordinary skill in the art will recognize that other elements are desirable and/or required in order to implement the present invention. However, because such elements are well known in the art, and because they do not facilitate a better understanding of the present invention, a discussion of such elements is not provided herein. The disclosure herein below is directed to all such variations and modifications to the technologies known, and as will be apparent, to those skilled in the art.
  • The graphical user interface (“GUI”) of the present invention provides a tool for the user of a computer to consolidate and access features provided by the computer, such as, for example: launching and managing applications; opening and managing running applications; accessing control functions between other computers and interfaces; navigating to uniform resource locators (URLs); and providing status and notifications on running processes and applications, including historical information. As such, the GUI of the present invention will be appreciated to provide graphical aspects of an OS in certain exemplary embodiments.
  • The GUI of the present invention is unrestricted as to layout and may accommodate any number of objects. As used herein, the term objects relates to the tiles, or icons, associated with a function, a collection of functions, or at least one piece of information. For example, an object may represent an application that allows a user to make phone calls via the Internet. An object may also represent a downloaded file, such as a movie. By way of further example, an object may be a link to a mail program, either resident on the local system, attached via a USB port, or directly via the Internet, for example. The objects may also be direct links to other sources of information or a conduit for instant communications. For example, an object may be linked such that information regarding the weather may be displayed once the object is activated. Similarly, an object may provide a direct communication link with a loved one, physician's office, and/or a web camera and microphone, for example. Similarly, an object may provide a link to other objects, such as sublevels of a hierarchical file system.
  • In an embodiment of the present invention, each object is represented by a circular “bubble”. Each bubble may be generally rigid and may have an interactive skin whereby the user of the GUI may manipulate the size and characteristics of each bubble. Alternatively, each bubble may be somewhat deformable, such as when “grabbed” by a cursor or when abutted by another bubble or bubbles. The characteristics of the bubble may include its physical characteristics, such as color and texture, a symbol or picture denoting the action of the object, and/or the size of the object, for example. As illustrated in FIG. 1, each bubble within the GUI may be represented by a circular skin of a thickness appropriate to be seen by the user of the GUI and to facilitate the ability of the user to interact with the bubble. Thus, the skin may be between approximately 1/32″-¼″ thick, by way of non-limiting example.
  • The frequency of interaction between the users of the system and a particular bubble may enlarge bubbles with a higher frequency of interaction relative to those bubbles with a lower frequency of interaction. Likewise, those bubbles deemed most “important” to the user may be correspondingly enlarged. In an exemplary embodiment of the present invention, the bubbles may be divided into three size categories based on the frequency of access or use of the bubble. For example, a medium, or middle, sized bubble indicates less usage than the largest bubble, while the smallest bubble indicates even less use. The size of the large, medium and small bubbles may be 1-4 cm, ½-3 cm, and ¼-2.5 cm, respectively, by way of non-limiting example.
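The three-tier sizing just described can be sketched in a few lines. Python is used here purely for illustration (the patent's own listings are ActionScript), and the `Bubble` structure and the frequency thresholds are assumptions, not values from the disclosure; only the centimeter figures echo the ranges given above.

```python
from dataclasses import dataclass

# Illustrative diameters (cm) echoing the upper bounds of the ranges in the text:
# large 1-4 cm, medium 1/2-3 cm, small 1/4-2.5 cm.
SIZE_CM = {"large": 4.0, "medium": 3.0, "small": 2.5}

@dataclass
class Bubble:
    name: str
    interactions: int  # how often this bubble has been accessed or used

def size_tier(bubble: Bubble, high: int = 100, low: int = 10) -> str:
    """Assign one of three size categories based on interaction frequency.

    The thresholds `high` and `low` are assumed values for illustration;
    the disclosure does not specify where the tier boundaries fall.
    """
    if bubble.interactions >= high:
        return "large"
    if bubble.interactions >= low:
        return "medium"
    return "small"

weather = Bubble("weather", interactions=150)
mail = Bubble("mail", interactions=25)
notes = Bubble("notes", interactions=3)
```

A real implementation would feed the tier into the rendered diameter, e.g. `SIZE_CM[size_tier(weather)]`.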
  • The objects of the bubbles may also be of any graphic type and may denote to the user the action assigned to a particular bubble. For example, a bubble with a storm cloud graphic, as illustrated by bubble 103, may allow for the accessing of weather information located on a third-party system, such as weather.com, for example. The bubble 103 may not only allow the user to access the linked information, but may also have associated with it user information such that, for example, the information initially retrieved from weather.com would include a forecast local to the user. This user information may be locally held, for example by a cookie or similar cache resident on the local or semi-local system, or retrieved from a third-party site where a user's preferences and other information are stored. This locally held information may include dynamic bubble content, such as preview content, that is continuously updated. Such continuous updates may occur on a periodic cycle, such as every five minutes or daily, for example, and/or the information may be pushed to the dynamic bubble as the information held by the host provider changes. Such information may include, for example, current weather conditions, forecasted weather, live video streams and content, the current time, and market information. This information may be pushed or pulled from the source of the information. The source of the information may be related, or unrelated, to the underlying content of the bubble, for example. This may include having the current weather conditions displayed on the bubble linking to weather.com, as discussed. This may also include having a metric of web usage displayed on the bubble for downloading video content. Further, the content for display on the bubble may be internally generated via the program associated with the bubble, or otherwise generated for display.
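The push/pull update cycle for dynamic bubble content described above might be sketched as follows. This is an illustrative Python sketch; the class name, the `fetch` callable standing in for any content source, and the five-minute default interval are assumptions modeled on the examples in the text.

```python
class DynamicBubbleContent:
    """Keep a bubble's preview content fresh, either by pulling on a
    periodic cycle or by accepting content pushed from the host provider."""

    def __init__(self, fetch, interval_s: float = 300.0):
        self.fetch = fetch              # stand-in for any source, e.g. a weather service
        self.interval_s = interval_s    # periodic cycle, e.g. five minutes
        self.content = None
        self.last_update = float("-inf")

    def refresh(self, now: float) -> bool:
        """Pull new content if the cycle has elapsed; return True if updated."""
        if now - self.last_update >= self.interval_s:
            self.content = self.fetch()
            self.last_update = now
            return True
        return False

    def push(self, content) -> None:
        """Accept content pushed to the bubble as the host's information changes."""
        self.content = content

preview = DynamicBubbleContent(lambda: "72F, partly cloudy", interval_s=300.0)
```

Either path updates the same preview field, matching the text's point that content may be pushed or pulled from the source.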
  • The actions assigned to each bubble may also vary from initiating executable files to reorganizing certain system-accessible functions and/or programs. For example, the bubble 104 in FIG. 1 may represent a particular user of the system and may provide the function of file organization. Thus, by activating the bubble, a user may be presented with a limited grouping or subgrouping of bubbles related to that user, certain of which bubbles may be presented only upon accessing of a hierarchically superior bubble. As illustrated in FIG. 5, the user who had previously selected bubble 401 may be presented with a set of bubbles associated with the selected bubble. For example, the user denoted by bubble 104 may have associated with it bubbles 103, 106, and 108-111.
  • As discussed above, the frequency of interaction between the users of the system and a particular bubble may enlarge bubbles with a higher frequency of interaction relative to those bubbles with a lower frequency of interaction. Similarly, when a bubble is isolated through sorting or accessing, the bubble's relative size may change in accordance with the interaction with each bubble made by or through the sorted or accessed bubble. For example, as illustrated in FIG. 5, filtering by bubble 104 results in the displayed bubble 106 being larger than bubble 108. This may be contrasted with the relative sizes of bubble 106 and bubble 108 as illustrated in FIG. 2, for example. This may mean that, across all of the users of the system, bubble 108 encountered a higher amount of interaction than bubble 106. In each illustration, however, the relative size of each bubble may be predicated on the number of interactions with each bubble displayed by the GUI, with the relative size of each bubble interdependent on all such interactions.
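One way to realize this interdependence of sizes on the displayed set is a simple linear rescaling over whichever bubbles survive the current filter. The Python function below is an illustrative sketch (the linear scaling rule and the centimeter bounds are assumptions); it shows how filtering down to one user's interaction counts can flip which bubble renders largest.

```python
def relative_sizes(interactions: dict, min_cm: float = 0.25, max_cm: float = 4.0) -> dict:
    """Scale each displayed bubble between the smallest and largest diameters
    in proportion to its interaction count among the bubbles currently shown.
    """
    lo, hi = min(interactions.values()), max(interactions.values())
    span = (hi - lo) or 1  # avoid division by zero when all counts match
    return {name: min_cm + (count - lo) / span * (max_cm - min_cm)
            for name, count in interactions.items()}

# Across all users, bubble 108 saw more interaction, so it renders larger;
# filtered to one user's counts, bubble 106 renders larger instead.
all_users = relative_sizes({"bubble_106": 20, "bubble_108": 80})
one_user = relative_sizes({"bubble_106": 15, "bubble_108": 5})
```

Because sizes are recomputed over only the displayed bubbles, every filter or sort operation changes the relative sizing, as the text describes for FIG. 2 versus FIG. 5.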
  • The associations between certain bubbles may be set by the user and may include factors such as frequency of use, user preferences, location usage, system requirements, system functionality, availability of peripherals, and connectivity, for example. Similarly, a bubble may include, for example, a graphic showing a desk with a light or a kitchen sink, which may, respectively, denote an “office” and a “kitchen.” Each of these spaces may have associated with them a certain collection of bubbles that are desired to be used in those spaces. Regardless of whether linked to a particular user, space, and/or function, the present invention allows for the filtering of system functions based on user and system defined associations.
  • In an exemplary embodiment of the present invention, the bubbles may be in motion, or in a constant state of motion, and may move in a random direction until they come into contact with another bubble, an edge of the screen, or another pre-set boundary. The reaction of the bubbles to a collision with another bubble or edge may be similar to that of billiard balls. Although the mathematics behind the relative action of objects in a “billiards” format is well known to those skilled in the art, a sample of various particulars associated with the present invention is included in the code herein, as will be apparent therefrom to those skilled in the art.
  • A collision check is done whenever the calculations that control the action of the bubbles are recalculated. In an embodiment of the present invention, the calculations are recalculated each time the GUI is refreshed, which may be optimally set at 42 frames per second. Of course, the refresh rate may be reduced or increased depending on the capabilities of the system that is running the GUI and the resulting look and utility of the GUI. The collisions between bubbles and boundaries follow the nature of elastic collisions; the directions of velocity after collision are tangent lines based off the line of collision and the line of intersection of the two objects. This calculation may be done in the following manner, without limitation:
  • package org.cove.ape {
      internal final class CollisionResolver {

        internal static function resolveParticleParticle(
            pa:AbstractParticle, pb:AbstractParticle,
            normal:Vector, depth:Number):void {

          // a collision has occurred. set the current positions to sample locations
          pa.curr.copy(pa.samp);
          pb.curr.copy(pb.samp);

          var mtd:Vector = normal.mult(depth);
          var te:Number = pa.elasticity + pb.elasticity;
          var sumInvMass:Number = pa.invMass + pb.invMass;

          // the total friction in a collision is combined but clamped to [0, 1]
          var tf:Number = clamp(1 - (pa.friction + pb.friction), 0, 1);

          // get the collision components, vn and vt
          var ca:Collision = pa.getComponents(normal);
          var cb:Collision = pb.getComponents(normal);

          // calculate the coefficient of restitution based on the mass, as the normal component
          var vnA:Vector = (cb.vn.mult((te + 1) * pa.invMass).plus(
              ca.vn.mult(pb.invMass - te * pa.invMass))).divEquals(sumInvMass);
          var vnB:Vector = (ca.vn.mult((te + 1) * pb.invMass).plus(
              cb.vn.mult(pa.invMass - te * pb.invMass))).divEquals(sumInvMass);

          // apply friction to the tangential component
          ca.vt.multEquals(tf);
          cb.vt.multEquals(tf);

          // scale the mtd by the ratio of the masses. heavier particles move less
          var mtdA:Vector = mtd.mult(pa.invMass / sumInvMass);
          var mtdB:Vector = mtd.mult(-pb.invMass / sumInvMass);

          // add the tangential component to the normal component for the new velocity
          vnA.plusEquals(ca.vt);
          vnB.plusEquals(cb.vt);

          if (!pa.fixed) pa.resolveCollision(mtdA, vnA, normal, depth, -1, pb);
          if (!pb.fixed) pb.resolveCollision(mtdB, vnB, normal, depth, 1, pa);
        }

        internal static function clamp(input:Number, min:Number, max:Number):Number {
          if (input > max) return max;
          if (input < min) return min;
          return input;
        }
      }
    }
  • In an embodiment of the present invention, the friction constant for the bubbles may be held at zero with an assigned mass of 1, regardless of the size of the object. Using these variables, all objects, regardless of size, may act uniformly under the same conditions. Similarly, when a collision occurs, elasticity may be what affects the resulting “bouncing” velocity on both of the colliding objects. In an embodiment of the present invention, the border or edge elasticity may be approximately 0.5 with the bubble elasticity at approximately 0.3, by way of non-limiting example. The elasticity may be combined during a collision and used to calculate the coefficient of restitution for the resulting velocity. The calculation may be done as illustrated below, without limitation:
  • // ca.vn = vector normal of collision for object A
    // cb.vn = vector normal of collision for object B
    // pa = object A
    // pb = object B
    // te = combined elasticity
    // calculate the coefficient of restitution based on the mass, as the normal component
    var vnA:Vector = (cb.vn.mult(te + 1).plus(ca.vn.mult(1 - te))).divEquals(1);
    var vnB:Vector = (ca.vn.mult(te + 1).plus(cb.vn.mult(1 - te))).divEquals(1);
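Reduced to one dimension with the equal unit masses the text specifies, this restitution calculation corresponds to the standard equal-mass elastic-collision form. The Python sketch below is illustrative only; note that it divides by the summed inverse mass of 2, following the general formula in the earlier listing (this simplified snippet divides by 1).

```python
def resolve_1d(va: float, vb: float, e: float):
    """Post-collision normal velocities for two equal-mass bubbles,
    where e is the coefficient of restitution (the combined elasticity te).
    """
    va_new = ((1 + e) * vb + (1 - e) * va) / 2
    vb_new = ((1 + e) * va + (1 - e) * vb) / 2
    return va_new, vb_new

# Two bubbles with elasticity 0.3 each give a combined te of 0.6, as in the text.
te = 0.3 + 0.3
va_new, vb_new = resolve_1d(2.0, -1.0, te)
```

With this form, momentum is conserved and the separation speed is e times the approach speed, which is the defining property of the coefficient of restitution.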
  • Further still, the present invention may utilize the effect of damping to provide the illusion of “friction to the surface”, although there is no friction applied to the objects. In an embodiment of the present invention, the damping value may be multiplied into every object's velocity during every update and may have a value of 0.97, by way of non-limiting example. Additionally, when a bubble is in motion, damping may be applied to slow the bubble down. However, when the bubble's velocity slows below the pre-set minimum velocity, the object may retain its previous higher velocity. In an embodiment of the present invention, the minimum velocity of the bubbles may be 0.8 in any direction, without limitation. The limit for the minimum velocity may be set in the following way, by way of non-limiting example:
  • public class DragableCircleParticle extends CircleParticle {

      private var mouseIsDown:Boolean;
      private var timestamp:Date;
      private var old_velocity_x:Number = .1;
      private var old_velocity_y:Number = .1;
      public var foreverMove:Boolean = true;
      public var magnetOn:Boolean = false;

      public function DragableCircleParticle(
          x:Number,
          y:Number,
          radius:Number,
          fixed:Boolean = false,
          mass:Number = 1,
          elasticity:Number = 0.0,
          friction:Number = 0) {
        super(x, y, radius, fixed, mass, elasticity, friction);
        alwaysRepaint = fixed;
        mouseIsDown = false;
        sprite.addEventListener(MouseEvent.MOUSE_DOWN, mouseDownHandler);
        sprite.stage.addEventListener(MouseEvent.MOUSE_UP, mouseUpHandler);
        sprite.stage.addEventListener(MouseEvent.MOUSE_MOVE, mouseMoveHandler);
        sprite.doubleClickEnabled = true;
      }

      private function mouseClickHandler(e:MouseEvent):void {
      }

      private function mouseDownHandler(evt:MouseEvent):void {
        if (fixed == false) {
          mouseIsDown = true;                  // on mouse down set mouseIsDown to true
          curr.setTo(evt.stageX, evt.stageY);  // set the current position of the particle to the position of the mouse
          prev.setTo(evt.stageX, evt.stageY);  // set the previous position of the particle to the position of the mouse
        }
      }

      private function mouseUpHandler(evt:MouseEvent):void {
        mouseIsDown = false;                   // on mouse up set mouseIsDown to false
      }

      private function mouseMoveHandler(evt:MouseEvent):void {
        if (mouseIsDown) {                     // on mouse move, if the mouse is down
          prev.copy(curr);                     // set the previous position to the current position of the particle
          curr.setTo(evt.stageX, evt.stageY);  // set the current position to the position of the mouse
        }
      }

      public override function update(dt2:Number):void {
        if (!mouseIsDown) {                    // don't update if the mouse is down
          if (velocity && magnetOn) {
            velocity = new Vector((1024 / 2) - px, (600 / 2) - py).normalize();
            velocity = new Vector(velocity.x * 3, velocity.y * 3);
            super.update(dt2);
            old_velocity_x = px;
            old_velocity_y = py;
          }
          else {
            super.update(dt2);
            if (velocity && foreverMove) {
              if ((velocity.x > .08 || velocity.x < -.08) && (velocity.y > .08 || velocity.y < -.08)) {
                old_velocity_x = velocity.x;
                old_velocity_y = velocity.y;
              }
              else {
                velocity = new Vector(old_velocity_x + (old_velocity_x * .1),
                                      old_velocity_y + (old_velocity_y * .1));
              }
            }
          }
        }
      }
    }
  • Additionally, the present invention, as described above, may allow for the controlled manual movement of the bubbles by a user of the GUI. If a user drags a bubble around the display area, a new velocity may be calculated through each frame until the bubble is released. In accordance with the calculation features above, the speed by which a user drags a bubble around the display area may directly impact the velocity of the bubble upon release. Further, during the dragging process, damping may be deactivated to allow for the smooth movement of the bubble. Once released from the user's control, damping may once again be applied to the bubble.
  • // ca.vt = tangential component for object A
    // cb.vt = tangential component for object B
    // add the tangential component to the normal component for the new velocity
    vnA.plusEquals(ca.vt);
    vnB.plusEquals(cb.vt);
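The drag-and-release behavior described above, where the velocity is recalculated through each frame of the drag, damping is suspended while dragging, and the final frame-to-frame displacement becomes the release velocity, can be sketched as follows. This is an illustrative Python sketch; the class and method names are assumptions, and the 0.97 damping value is taken from the text.

```python
class DraggedBubble:
    """Track a dragged bubble's position each frame; on release, the last
    frame-to-frame displacement becomes the velocity, so a faster drag
    yields a faster release. Damping applies only when not dragging."""

    def __init__(self, x: float, y: float, damping: float = 0.97):
        self.prev = (x, y)
        self.curr = (x, y)
        self.velocity = (0.0, 0.0)
        self.damping = damping
        self.dragging = False

    def drag_to(self, x: float, y: float) -> None:
        """Follow the pointer; velocity is implicitly the prev-to-curr delta."""
        self.dragging = True
        self.prev = self.curr
        self.curr = (x, y)

    def release(self) -> None:
        """On release, the last drag displacement carries over as velocity."""
        self.dragging = False
        self.velocity = (self.curr[0] - self.prev[0], self.curr[1] - self.prev[1])

    def step(self) -> None:
        """Advance one frame; damping is deactivated during the drag itself."""
        if self.dragging:
            return
        vx, vy = self.velocity
        self.curr = (self.curr[0] + vx, self.curr[1] + vy)
        self.velocity = (vx * self.damping, vy * self.damping)

b = DraggedBubble(0.0, 0.0)
b.drag_to(2.0, 0.0)
b.drag_to(6.0, 0.0)  # a fast final frame of the drag...
b.release()          # ...carries over as the release velocity
```

This mirrors the prev/curr bookkeeping in the mouse handlers of the DragableCircleParticle listing above.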
  • In an exemplary embodiment of the present invention, the GUI may be used with a touch-sensitive display device. A touch screen may be of any practical size and may range from large wall-hung displays to personal media devices. The touch activation of the bubbles may not differ from the interaction achieved using a standard cursor. For example, by touching and holding a particular bubble, the bubble may respond and move about the display without launching its associated function. To launch the function associated with the bubble, the user may tap or click, once or twice, in quick succession, as would be known to those skilled in the art. Similarly, a single or double tap or click on or about the area of the skin of the bubble may allow the user to access the characteristics of that individual bubble.
  • Further, the real-time connection between the touch interface and a bubble may be controlled, such as by selecting preferences in the interaction of the bubble via a user input device. The selecting of preferences may include a bubble staying connected to the finger that is moving the bubble, while still interacting with the rest of the bubbles as described hereinthroughout. Gestures of the finger may also be used to enable the movement of a bubble and/or bubbles, such as by flicking the bubble away upon release to provide more velocity and a direction to the movement of the correspondent bubble. Additional interactions with the touch screen may include flicking, stopping, relocating, and changing the size of the bubble, such as by grabbing at opposite edges and pulling outward to expand the size or pushing inward to decrease the size.
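The resize gesture just described, grabbing at opposite edges and pulling outward or pushing inward, is commonly implemented by scaling the bubble by the ratio of the finger separations. The sketch below is an illustrative Python assumption; the clamp bounds reuse the centimeter size range mentioned earlier, which the disclosure does not tie to this gesture.

```python
def pinch_resize(diameter: float, d0: float, d1: float,
                 min_d: float = 0.25, max_d: float = 4.0) -> float:
    """Scale a bubble's diameter by the ratio of finger separation after
    the gesture (d1) to before it (d0), clamped to assumed size bounds.
    Pulling outward (d1 > d0) grows the bubble; pushing inward shrinks it.
    """
    return max(min_d, min(max_d, diameter * (d1 / d0)))
```

For example, doubling the finger separation doubles a 2 cm bubble to the 4 cm cap, while halving it shrinks the bubble to 1 cm.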
  • The number of objects which may be handled by the GUI is limitless. The number of objects which may be presented so as to be reasonably seen and utilized by a user may depend on the user's cognitive abilities, the characteristics of the display, such as size and brightness, and the relative capacity of the system engaged in supporting the GUI. For example, on an average 15″ display, the number of visible bubbles may range from approximately 1-50. Further, the use of split screens, borderless displays, and hierarchical organization may increase the number of objects in the same example. By way of further example, the use of at least two displays for one GUI may allow for at least a doubling of the objects able to be presented. Similarly, the use of at least two displays and at least two GUIs which are interconnected with the ability to share at least one object may provide a multiple-fold increase in the number of objects which may be readily handled.
  • The use of a “borderless” display may also increase the number of usable objects by allowing certain objects to move out of the visible display area. The action of the bubbles out of the viewable area may be controlled in various ways. For example, a border which behaves the same as the standard viewable border, i.e., one which the bubbles ricochet off of, may be placed a certain distance out of the viewable area and may retain the shape of the actual display area or may be varied on at least one side. For example, the off-screen border of a display may be set at 4 inches outside each respective side of the visible display. Alternatively, the top border may be set at 8″ while the remaining sides are left in line with the visible area. As discussed further herein, off-screen bubbles may be recalled through filtering and other recall methods.
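The off-screen border behaves exactly like the visible one, so it can be sketched as an ordinary reflect-off-boundary step with the bounds pushed outside the visible area. Python is used for illustration; the 1024x600 visible area echoes the dimensions appearing in the particle code above, and the 4-unit offsets stand in for the 4-inch example (the unit mapping is an assumption).

```python
def bounce(pos: float, vel: float, lo: float, hi: float):
    """Reflect a coordinate off a boundary, billiard-style: the overshoot
    past the border is mirrored back and the velocity component flips."""
    if pos < lo:
        return 2 * lo - pos, -vel
    if pos > hi:
        return 2 * hi - pos, -vel
    return pos, vel

# A 1024x600 visible area with the ricochet border set 4 units outside
# each side; the 8-unit top-offset variant would just change TOP.
LEFT, RIGHT = -4.0, 1024.0 + 4.0
TOP, BOTTOM = -4.0, 600.0 + 4.0

def step(x: float, y: float, vx: float, vy: float):
    """Advance one frame, reflecting off the extended (off-screen) borders."""
    x, vx = bounce(x + vx, vx, LEFT, RIGHT)
    y, vy = bounce(y + vy, vy, TOP, BOTTOM)
    return x, y, vx, vy
```

A bubble can thus travel past the visible edge and ricochet back into view, rather than bouncing at the screen boundary itself.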
  • The ricochet action of the present invention may be partially or fully disabled. For example, one or more borders may be made neutral and/or sticky regardless of whether or not the ricochet characteristics of the bubbles are changed. Similarly, at least one bubble may be made neutral and/or sticky to allow for organizational preferences and increased control. For example, a sticky border may allow for a user to more easily select a bubble once the bubble is statically attached. Similarly, a bubble may be released from a sticky border by turning off the border stickiness or moving the bubble away from the sticky border, for example. Further still, neutral and/or sticky bubbles may provide increased control by the user over the objects provided by the GUI, in various ways, such as by allowing for the limiting of the ricochet effect.
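  • One reading of the neutral and sticky behaviors above is that each border carries a mode deciding whether a crossing bubble rebounds, is absorbed, or attaches. The mode names in this sketch are assumptions, not terms from the disclosure:

```python
def hit_border(pos, vel, lo, hi, mode="ricochet"):
    """Resolve a 1-D border crossing; returns (pos, vel, attached).

    Assumed modes:
      'ricochet' - reflect, the default billiard behavior
      'neutral'  - absorb the motion without rebound
      'sticky'   - stop and attach until the border stickiness is turned off
    """
    if lo <= pos <= hi:
        return pos, vel, False
    edge = lo if pos < lo else hi
    if mode == "sticky":
        return edge, 0.0, True
    if mode == "neutral":
        return edge, 0.0, False
    return 2 * edge - pos, -vel, False
```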
  • As illustrated in FIG. 3, menus may be provided by the present invention for sorting, filtering, and arranging the bubbles. For example, the drop-down menu may provide the ability to filter the bubbles as they are assigned by the user. In the illustrated instance, the bubbles may be assigned by specific room type, by applications and/or application types, and by popularity, for example. Choosing one of the presented menu options may sort and present a limited number of objects to the user. As illustrated in FIG. 4, selecting the “user” menu offering 402 may provide for display of and interaction with only those objects that identify specific users. For example, bubbles 104 and 105 may represent two user objects. Each user object may have associated therewith at least one other object within the system. For example, by selecting the bubble 104, the objects associated with the user represented through bubble 104 may be presented as illustrated in FIG. 5.
  • As further illustrated in FIG. 5, and as discussed above in greater detail, the bubbles associated with the user depicted in bubble 104 may be sized in correlation with the frequency of use and/or access by that particular user. By way of non-limiting example only, the user-filtered display illustrated in FIG. 5 illustrates a relatively higher frequency of access of bubble 106 by the selected user than by other users of the system, as illustrated by FIG. 1. Although the system may assign a certain subset of objects to be associated with a user, for example, the user may define the criteria and/or specific objects which may be associated. For example, although the user illustrated in FIG. 5 may access more of the system objects than the objects shown, a limitation on the filtering of objects may present only those bubbles associated with the user that the user has accessed a minimum number of times in a given time period. By way of further example, the user illustrated in FIG. 5 may have accessed the displayed objects at least 10 times in the last month.
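  • The FIG. 5 behavior, i.e., user-filtered bubbles sized by that user's access frequency subject to a minimum-access threshold, might be sketched as follows; the logarithmic scaling and all constants are illustrative assumptions:

```python
import math

def filtered_sizes(access_counts, user, min_accesses=10,
                   base_radius=20.0, scale=8.0):
    """Radii for the bubbles a given user sees after filtering.

    `access_counts` maps object -> {user -> accesses in the period};
    objects the user accessed fewer than `min_accesses` times (e.g. 10
    in the last month, per the example above) are dropped, and radii
    grow with the log of the count so heavy use does not swamp the view.
    """
    return {
        obj: base_radius + scale * math.log1p(counts.get(user, 0))
        for obj, counts in access_counts.items()
        if counts.get(user, 0) >= min_accesses
    }
```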
  • The present invention may also allow for users of the system to keep and control personal user information which, as described further herein, may allow for a more tailored experience of the present invention by the user. As illustrated in FIG. 6, the user menu may allow for the accessing of a user's object, such as to change and input various information ranging from system preferences to social networking data to passwords and bookmarked URLs. As would be known by those skilled in the art, any range of information useful in association with a user of the present system may be accessed through the interface illustrated in FIG. 6.
  • As discussed herein, the embodiments may be directed to bubble interaction using billiard ball physics, user defined fixed placement, automatic fixed placement, and/or placement based on an attribute. Attribute based alignment or interaction may include alphabetical, most recently used, most used, and the like, and may have orientations such as from A to Z across the screen, from A to Z from top to bottom, and/or other orientations. Further, the most used bubbles may be placed in the middle of the screen, and may extend therefrom in opposite directions as the amount of use of the bubble decreases. This type of configuration may provide advantages, such as when all of the bubbles fail to fit on the display screen at the same time. Having the most used bubbles in the center of the display may provide user convenience in accessing the bubbles, although lesser used bubbles may need to be shifted into view in order to access them.
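  • The most-used-in-the-middle arrangement described above reduces to an alternating, center-out ordering of display slots. The slot arithmetic below is one plausible realization, not the disclosed method:

```python
def center_out_order(items):
    """Rearrange `items` (already sorted most-used first) so the
    most-used bubble occupies the middle slot and use decreases
    alternately to the right and left of center."""
    n = len(items)
    order = [None] * n
    center = (n - 1) // 2
    for i, item in enumerate(items):
        # offsets run 0, +1, -1, +2, -2, ... from the center slot
        offset = (i + 1) // 2 if i % 2 else -(i // 2)
        order[center + offset] = item
    return order
```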
  • User defined fixed placement may include placement of the bubbles in fixed locations based on user selection. Such a configuration may include dividing the display into a grid, such as a one, two and/or three-dimensional grid having rows and columns, and placing the bubbles one at a time in a row and column location. Once a location is occupied, placement of a subsequent bubble may be limited or prevented. Bubbles may be placed in diagonals as well. Further, users may define a subset region of the display for placement of a bubble, for example. In such a configuration, a bubble may be confined to the upper left quadrant of the display, for example. Such confining may be combined with user-defined fixed placement, automatic fixed placement, and/or billiard ball physics, for example. Additionally, areas of the display may be reserved in certain configurations or embodiments for predefined types and/or sizes of bubbles, and changes in orientation may result in placement of icons in corresponding reserved areas based on the bubble attribute, for example.
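  • User defined fixed placement on a grid of rows and columns, with an occupied cell blocking a subsequent bubble, might look as follows; rejecting the placement outright is just one of the "limited or prevented" options mentioned above:

```python
class Grid:
    """A rows x cols grid for user-defined fixed placement of bubbles."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.cells = {}  # (row, col) -> bubble

    def place(self, bubble, row, col):
        """Place `bubble` at (row, col); return False if the cell is taken."""
        if not (0 <= row < self.rows and 0 <= col < self.cols):
            raise ValueError("cell outside the grid")
        if (row, col) in self.cells:
            return False  # occupied: subsequent placement prevented
        self.cells[(row, col)] = bubble
        return True
```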
  • Further, the automatic fixed placement on a grid, such as a one, two and/or three-dimensional grid, may be based, similarly to user defined fixed placement discussed above, on the nearest location of the bubble to a grid position and/or based on an attribute of the bubble, or on a zonal placement of certain types of bubbles, for example. Automatic placement may lock the bubble to a predefined configuration.
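  • The nearest-location rule for automatic fixed placement is essentially a snap-to-grid rounding; the cell dimensions in this sketch are assumed:

```python
def snap_to_grid(x, y, cell_w=100.0, cell_h=100.0):
    """Lock a free position to the nearest grid location by rounding
    each coordinate to the closest multiple of the cell size."""
    return round(x / cell_w) * cell_w, round(y / cell_h) * cell_h
```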
  • In an embodiment of the present invention, the bubbles and objects may be organized and held static in various views, including that illustrated in FIG. 7. In this illustration, the bubbles are organized in a linear view and may be moved side to side by dragging any individual or series of bubbles from either left to right or right to left across the display. The bubbles may also be selected and activated as described more fully herein.
  • Exemplary embodiments of the present invention may be implemented on various electronic systems including computers, dedicated computers or computing appliances, handhelds and cell phones, for example. However, it will be readily appreciated by those skilled in the art that a GUI according to the present invention may be used in combination with any system having a processor and a display. In general, such systems, as illustrated in exemplary block diagram form in FIG. 8, comprise a bus 800 for communicating information, a processor 801 coupled with the bus for processing information and instructions, a random access memory 802 coupled with the bus 800 for storing information and instructions for the processor 801, a read only memory 803 coupled with the bus 800 for storing static information and instructions for the processor 801, a data storage device 804 such as a magnetic disk and disk drive, flash memory or CD ROM drive coupled with the bus 800 for storing information and instructions, a display device 805 coupled to the bus 800 for displaying information to the computer user, a keyboard device 806 including alphanumeric and function keys coupled to the bus 800 for communicating information and command selections to the processor 801, a cursor control device 807 coupled to the bus for communicating information and command selections to the processor 801, and a signal generating device 808 coupled to the bus 800 for communicating command selections to the processor 801.
  • The display device 805 utilized with the computer system of the present invention may be a liquid crystal device, touchscreen, cathode ray tube, or other display device suitable for creating images and alphanumeric characters (and ideographic character sets) recognizable to the user. The cursor control device 807 allows the computer user to dynamically signal the two-dimensional movement of a visible symbol (cursor) on a display screen of the display device 805. Many implementations of the cursor control device are known in the art, including a trackball, mouse, joystick, or special keys on the keyboard device 806 capable of signaling movement in a given direction or manner of displacement. It is to be appreciated that the cursor also may be directed and/or activated via input from the keyboard using special keys and key sequence commands. Alternatively, the cursor may be directed and/or activated via input from a number of specially adapted cursor directing devices, including those uniquely developed for the disabled. In the discussions regarding cursor movement and/or activation within the preferred embodiment, it is to be assumed that the input cursor directing device or push button may consist of any of those described above and specifically is not limited to the mouse cursor device.
  • Further, it will be appreciated by those skilled in the art that neither the cursor control device 807 nor the keyboard device 806 is necessary if the display device 805 is a touch-screen sensitive device. In an embodiment of the present invention, the graphical user interface is operated through user interactions facilitated by a touch-screen enabled display device 805. Such functionality may allow for the use of the present invention using only a display and without the need for peripherals, such as a mouse or keyboard.
  • As discussed in some measure above, the present invention may be used on a local and/or semi-local system. Systems that are not strictly local may include systems interconnected with use of at least one additional GUI and related peripherals. With reference to FIGS. 9 and 10, the present invention provides a GUI capable of driving the hardware, software, appliances, and peripherals desired to be driven by a user of any particular computing platform. For example, the present invention provides a GUI for a modular computing solution that may provide a computing solution that can be developed uniquely by each user for that user's life style profile. For example, modules may be added for use with a computing platform for which the present invention may provide access to the accessible applications and peripherals, such as internet computing, word processing computing, a set top box, a recipe system for use in a kitchen, or an iPod/MP3 player/radio for use by a swimming pool, by way of non-limiting example. Such myriad capabilities may be modularly provided to a base computing system and may be automatically served by the GUI.
  • For example, a base computing system may preferably provide an open platform for use with the present invention. The base computing system may be, for example, a Windows or Linux based machine, and/or may employ open source software and platforming, and/or may thus provide an openware solution. Such an “open format” computer may provide only that functionality desired by the user in a base model computer. Thus, for example, as illustrated in FIG. 10, the computing device for use with the present invention at the base level may be provided with a plurality of USB, Ethernet, RCA, proprietary, optical, and like inputs, into which may be connected any number of appliances, peripherals, software modules, and the like. Such input ports may include, for example, a hardware and software port at the top of the unit, in the center and facing upward, and/or one or more ports on the front, sides, bottom, back, and even on the stand for the unit, and/or a power port, for example. Any mix of types of such ports, and in any locations, may be provided. Further, an application may, in certain embodiments, entail provision of application software at one port of the present invention, and operation of application specific hardware corresponding to the application software at one or more other ports. Ports may, for the purposes discussed herein, be interchangeable as to what is applied to any given port in a particular application, and may not be interchangeable for the purposes of other application modules. Additionally, the unit may be enabled to operate for periods of time, or permanently, on one or more batteries, rather than a plug-in power supply.
  • More specifically, in an exemplary embodiment of the present invention, a user may purchase, inexpensively, such as for ninety-nine dollars, a base computing system. Resident on that base computing system may be the present invention functioning as, or along with, limited operating system software, such as Linux or Windows based operating system software, such as a thin client or thick client operating system. The operating system and/or GUI of the present invention may provide, upon implementation and/or purchase of the computer, no, or very limited, application functionality. However, the user may be enabled to add on application hardware/software modules to provide functional computing aspects for execution by the open operating system and for consolidation and access by the GUI.
  • The disclosure herein is directed to the variations and modifications of the elements and methods of the invention disclosed that will be apparent to those skilled in the art in light of the disclosure herein. Thus, it is intended that the present invention covers the modifications and variations of this invention, provided those modifications and variations come within the scope of the appended claims and the equivalents thereof.

Claims (24)

We claim:
1. A computer system, comprising:
a display;
a graphical user interface rendered on the display, wherein the graphical user interface displays a plurality of objects representative of one or more computing functions; and
a processor that varies a size of at least one of the plurality of objects based on determining a sum total of a frequency of user interaction of each of a plurality of users with the at least one of the plurality of objects.
2. The computer system of claim 1, wherein one of the plurality of objects further includes a user object that is representative of a user.
3. The computer system of claim 2, wherein the processor filters the plurality of objects based on a selection of the user object.
4. The computer system of claim 3, wherein the processor varies the size of the at least one of the plurality of objects based on the frequency of user interaction of the user with the at least one of the plurality of objects.
5. The computer system of claim 1, wherein the processor varies a position of the at least one of the plurality of objects based on the frequency of user interaction of a user with the at least one of the plurality of objects.
6. The computer system of claim 1, wherein the processor varies a position of a first object of the plurality of objects that is proximate to one or more of a boundary of the graphical user interface and a second object of the plurality of objects.
7. The computer system of claim 1, further comprising a control device that facilitates the frequency of user interaction, and wherein the control device is one of a cursor and a touch-screen.
8. The computer system of claim 1, further comprising a skin that encompasses the at least one of the plurality of objects.
9. The computer system of claim 8, wherein the skin facilitates the frequency of user interaction with the at least one of the plurality of objects.
10. The computer system of claim 1,
wherein the processor increases a first size of a first object of the plurality of objects relative to a second size of a second object of the plurality of objects based on a first sum total of the frequency of user interaction of each of the plurality of users with the first object being higher than a second sum total of the frequency of user interaction of each of the plurality of users with the second object, and
wherein the processor decreases the first size relative to the second size based on the first sum total of the frequency of user interaction being lower than the second sum total of the frequency of user interaction.
11. The computer system of claim 1, wherein the processor varies the size of the at least one of the plurality of objects based on an importance of the at least one of the plurality of objects to the user.
12. The computer system of claim 1, wherein the processor varies a position of the at least one of the plurality of objects based on one or more of billiard ball physics, a user defined fixed placement, and an automatic fixed placement.
13. A computer-implemented method, comprising:
rendering, by a display, a graphical user interface;
displaying, by the graphical user interface, a plurality of objects representative of one or more computing functions; and
varying, by a processor, a size of at least one of the plurality of objects based on determining a sum total of a frequency of user interaction of each of a plurality of users with the at least one of the plurality of objects.
14. The method of claim 13, wherein one of the plurality of objects further includes a user object that is representative of a user.
15. The method of claim 14, further comprising filtering, by the processor, the plurality of objects based on a selection of the user object.
16. The method of claim 15, further comprising varying, by the processor, the size of the at least one of the plurality of objects based on the frequency of user interaction of the user with the at least one of the plurality of objects.
17. The method of claim 13, further comprising varying, by the processor, a position of the at least one of the plurality of objects based on the frequency of user interaction of a user with the at least one of the plurality of objects.
18. The method of claim 13, further comprising varying, by the processor, a position of a first object of the plurality of objects that is proximate to one or more of a boundary of the graphical user interface and a second object of the plurality of objects.
19. The method of claim 13, further comprising facilitating, by a control device, the frequency of user interaction, wherein the control device is one of a cursor and a touch-screen.
20. The method of claim 13, further comprising encompassing, by a skin, the at least one of the plurality of objects.
21. The method of claim 20, further comprising facilitating, by the skin, the frequency of user interaction with the at least one of the plurality of objects.
22. The method of claim 13, further comprising:
increasing, by the processor, a first size of a first object of the plurality of objects relative to a second size of a second object of the plurality of objects based on a first sum total of the frequency of user interaction of each of the plurality of users with the first object being higher than a second sum total of the frequency of user interaction of each of the plurality of users with the second object, and
decreasing, by the processor, the first size relative to the second size based on the first sum total of the frequency of user interaction being lower than the second sum total of the frequency of user interaction.
23. The method of claim 13, further comprising varying, by the processor, the size of the at least one of the plurality of objects based on an importance of the at least one of the plurality of objects to the user.
24. The method of claim 13, further comprising varying, by the processor, a position of the at least one of the plurality of objects based on one or more of billiard ball physics, a user defined fixed placement, and an automatic fixed placement.
US14/859,907 2009-03-11 2015-09-21 System And Method For Providing User Access Abandoned US20160011742A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US20997409P true 2009-03-11 2009-03-11
US12/722,058 US9141261B2 (en) 2009-03-11 2010-03-11 System and method for providing user access
US14/859,907 US20160011742A1 (en) 2009-03-11 2015-09-21 System And Method For Providing User Access

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/859,907 US20160011742A1 (en) 2009-03-11 2015-09-21 System And Method For Providing User Access

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/722,058 Continuation US9141261B2 (en) 2009-03-11 2010-03-11 System and method for providing user access

Publications (1)

Publication Number Publication Date
US20160011742A1 true US20160011742A1 (en) 2016-01-14

Family

ID=42728796

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/722,058 Active 2033-05-18 US9141261B2 (en) 2009-03-11 2010-03-11 System and method for providing user access
US14/859,907 Abandoned US20160011742A1 (en) 2009-03-11 2015-09-21 System And Method For Providing User Access

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/722,058 Active 2033-05-18 US9141261B2 (en) 2009-03-11 2010-03-11 System and method for providing user access

Country Status (3)

Country Link
US (2) US9141261B2 (en)
EP (1) EP2406708A1 (en)
WO (1) WO2010105084A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD766313S1 (en) * 2015-01-20 2016-09-13 Microsoft Corporation Display screen with animated graphical user interface
USD776673S1 (en) * 2015-01-20 2017-01-17 Microsoft Corporation Display screen with animated graphical user interface
USD776674S1 (en) * 2015-01-20 2017-01-17 Microsoft Corporation Display screen with animated graphical user interface
USD776672S1 (en) * 2015-01-20 2017-01-17 Microsoft Corporation Display screen with animated graphical user interface

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9235341B2 (en) 2010-01-20 2016-01-12 Nokia Technologies Oy User input
US8914732B2 (en) * 2010-01-22 2014-12-16 Lg Electronics Inc. Displaying home screen profiles on a mobile terminal
US20120117510A1 (en) * 2010-11-05 2012-05-10 Xerox Corporation System and method for automatically establishing a concurrent data connection with respect to the voice dial features of a communications device
US8949720B1 (en) * 2011-05-09 2015-02-03 Symantec Corporation Systems and methods for managing access-control settings
AU345903S (en) * 2012-03-05 2012-12-05 Apple Inc Display screen for an electronic device
US9965130B2 (en) * 2012-05-11 2018-05-08 Empire Technology Development Llc Input error remediation
US8823667B1 (en) * 2012-05-23 2014-09-02 Amazon Technologies, Inc. Touch target optimization system
KR20140068410A (en) * 2012-11-28 2014-06-09 삼성전자주식회사 Method for providing user interface based on physical engine and an electronic device thereof
USD757104S1 (en) * 2013-01-04 2016-05-24 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US9671868B2 (en) 2013-06-11 2017-06-06 Honeywell International Inc. System and method for volumetric computing
US20150058140A1 (en) * 2013-08-21 2015-02-26 Electronic Arts, Inc. Systems and methods for in-application offers
US9608869B2 (en) * 2013-09-20 2017-03-28 Oracle International Corporation Enterprise applications navigation using tile characteristics that change with applications data
USD744528S1 (en) * 2013-12-18 2015-12-01 Aliphcom Display screen or portion thereof with animated graphical user interface
USD769930S1 (en) * 2013-12-18 2016-10-25 Aliphcom Display screen or portion thereof with animated graphical user interface
US9785243B2 (en) 2014-01-30 2017-10-10 Honeywell International Inc. System and method for providing an ergonomic three-dimensional, gesture based, multimodal interface for use in flight deck applications
USD781320S1 (en) * 2014-09-08 2017-03-14 Salesforce.Com, Inc. Display screen or portion thereof with graphical user interface
USD765099S1 (en) * 2014-10-15 2016-08-30 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
US10042538B2 (en) * 2014-11-26 2018-08-07 International Business Machines Corporation Enumeration and modification of cognitive interface elements in an ambient computing environment
USD761301S1 (en) * 2014-12-11 2016-07-12 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional graphical user interface
USD774089S1 (en) * 2015-02-25 2016-12-13 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD779552S1 (en) * 2015-02-27 2017-02-21 Samsung Electronics Co., Ltd. Display screen or portion thereof with icon
USD798320S1 (en) 2015-03-18 2017-09-26 Adp, Llc Display screen with graphical user interface
USD772246S1 (en) * 2015-03-18 2016-11-22 Adp, Llc Display screen or portion thereof with animated graphical user interface
USD805090S1 (en) 2015-03-18 2017-12-12 Adp, Llc Display screen with graphical user interface
USD827667S1 (en) * 2015-10-08 2018-09-04 Your Voice Usa Corp. Display screen with graphical user interface
USD801384S1 (en) * 2015-12-24 2017-10-31 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional icon
USD802012S1 (en) * 2015-12-24 2017-11-07 Samsung Electronics Co., Ltd. Display screen or portion thereof with transitional icon
USD778942S1 (en) * 2016-01-11 2017-02-14 Apple Inc. Display screen or portion thereof with graphical user interface
USD798895S1 (en) * 2016-08-12 2017-10-03 Salesforce.Com, Inc. Display screen or portion thereof with animated graphical user interface
USD810778S1 (en) * 2016-09-06 2018-02-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with icon
USD824943S1 (en) * 2017-01-24 2018-08-07 Virtual Diamond Boutique Inc. Display screen with a graphical user interface
USD863333S1 (en) * 2017-12-28 2019-10-15 Facebook, Inc. Display panel of a programmed computer system with a graphical user interface

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040017376A1 (en) * 2002-07-29 2004-01-29 Roberto Tagliabue Graphic entries for interactive directory

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7216304B1 (en) * 2000-01-05 2007-05-08 Apple Inc. Graphical user interface for computers having variable size icons
US7404147B2 (en) 2000-04-24 2008-07-22 The Trustees Of Columbia University In The City Of New York System and method for dynamic space management of a display space
KR100703690B1 (en) * 2004-11-19 2007-04-05 삼성전자주식회사 User interface and method for managing icon by grouping using skin image
US20070064004A1 (en) * 2005-09-21 2007-03-22 Hewlett-Packard Development Company, L.P. Moving a graphic element
TWI327308B (en) * 2006-10-05 2010-07-11 Acer Inc Handheld electronic apparatus with functions of intelligent remote control


Also Published As

Publication number Publication date
WO2010105084A1 (en) 2010-09-16
US20100281408A1 (en) 2010-11-04
US9141261B2 (en) 2015-09-22
EP2406708A1 (en) 2012-01-18


Legal Events

Date Code Title Description
AS Assignment

Owner name: MATTEL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FUHU, INC.;REEL/FRAME:037917/0349

Effective date: 20160129

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION