US20130103446A1 - Information sharing democratization for co-located group meetings - Google Patents
- Publication number
- US20130103446A1
- Authority
- US
- United States
- Prior art keywords
- computing device
- posture
- meeting
- display device
- whenever
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/109—Time management, e.g. calendars, reminders, meetings or time accounting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/109—Time management, e.g. calendars, reminders, meetings or time accounting
- G06Q10/1093—Calendar-based scheduling for persons or groups
- G06Q10/1095—Meeting or appointment
Abstract
Information sharing between meeting attendees during a co-located group meeting in a meeting space is democratized using a computer that is operating cooperatively with one or more object sensing devices in the meeting space to identify postures formed by the meeting attendees.
Description
- Project teams routinely hold group meetings to discuss the projects they are working on. During these group meetings the meeting attendees may discuss the status of tasks associated with a given project, they may assign and prioritize the tasks, and they may make decisions on the project, among other things. These group meetings are often very collaborative and interactive. Additionally, these group meetings can involve frequent information sharing between two or more of the meeting attendees. These group meetings can also involve “on-the-fly” information manipulation and/or annotation by a given meeting attendee, where the manipulation/annotation is intended to be publicly viewable by the other meeting attendees. Due to ongoing technology advances in areas such as portable personal computing devices, mobile computing applications, data communication, and computer networking, the meeting attendees will often each bring one or more portable personal computing devices to these group meetings.
- This Summary is provided to introduce a selection of concepts, in a simplified form, that are further described hereafter in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Information sharing democratization technique embodiments described herein generally involve democratizing information sharing during a co-located group meeting in a meeting space. In one exemplary embodiment, whenever a condition occurs which includes a meeting attendee who is not within a prescribed distance of a public display device in the meeting space forming a remote location indicator posture and remotely pointing this posture at the public display device, a computer will operate cooperatively with an audience-oriented object sensing device in the meeting space to identify this condition. The computer will also enable a remote location indicator mode for as long as this condition continues, and will display a location indicator on the public display device in a position thereon corresponding to where the remote location indicator posture is currently being pointed.
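By way of illustration only, the position mapping just described (placing the location indicator where the remote location indicator posture is currently pointed) can be sketched as a ray-plane intersection. This is an assumed geometry, not the patent's method: it intersects a ray through two tracked arm joints with the display plane, and all names here are hypothetical.

```python
from typing import Optional, Tuple

Point3 = Tuple[float, float, float]

def indicator_position(shoulder: Point3, hand: Point3,
                       display_z: float) -> Optional[Tuple[float, float]]:
    """Intersect the shoulder->hand ray with the display plane z == display_z.

    Joint positions are in room coordinates (e.g., meters). Returns the
    (x, y) point on the display plane, or None when the arm is parallel
    to the plane or pointing away from it.
    """
    sx, sy, sz = shoulder
    hx, hy, hz = hand
    dz = hz - sz
    if dz == 0:
        return None                      # arm parallel to the display plane
    t = (display_z - sz) / dz
    if t <= 0:
        return None                      # pointing away from the display
    return (sx + t * (hx - sx), sy + t * (hy - sy))

# Arm extended toward a display plane 2 m in front of the shoulder:
print(indicator_position((0.0, 1.5, 0.0), (0.3, 1.5, 0.5), 2.0))  # (1.2, 1.5)
```

As the posture moves, re-evaluating this intersection per tracking frame yields the continuously updated indicator position the summary describes.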
- In another exemplary embodiment, whenever another condition occurs which includes a first meeting attendee who is within a prescribed distance of a public display device having a touch-enabled display screen in the meeting space forming the remote location indicator posture and remotely pointing this posture at a second meeting attendee who is not within the prescribed distance of the public display device, a computer will operate cooperatively with both an audience-oriented object sensing device in the meeting space, and a presenter-oriented object sensing device in the meeting space, to identify this condition, identify the second meeting attendee, and identify a personal computing device that is associated with the second meeting attendee. The computer will also enable a presenter-to-audience-member transfer mode for as long as this condition continues. Whenever the presenter-to-audience-member transfer mode is enabled and the first meeting attendee performs an information-push touch gesture on an information object that is displayed on the display screen, the computer will transmit a copy of the information object to the personal computing device.
- In yet another exemplary embodiment, whenever a first condition occurs which includes a first meeting attendee who is using a touch-enabled non-handheld computing device forming the remote location indicator posture and remotely pointing this posture at a second meeting attendee, a computer will operate cooperatively with an object sensing device in the meeting space to identify the first condition, identify the second meeting attendee, and identify a personal computing device that is associated with the second meeting attendee. The computer will also enable a peer-to-peer transfer mode for as long as the first condition continues. Whenever the peer-to-peer transfer mode is enabled and the first meeting attendee performs the information-push touch gesture on a first information object that is displayed on a display screen of the non-handheld computing device, the computer will receive a copy of the first information object from the non-handheld computing device, and will forward the copy of the first information object to the personal computing device. Whenever a second condition occurs which includes the peer-to-peer transfer mode being disabled and a third meeting attendee who is using a touch-enabled handheld computing device forming a device pointing posture and remotely pointing this posture at the second meeting attendee, the computer will operate cooperatively with the object sensing device to identify the second condition, identify the second meeting attendee, and identify a personal computing device that is associated with the second meeting attendee. The computer will also enable the peer-to-peer transfer mode for as long as the second condition continues. 
Whenever the peer-to-peer transfer mode is enabled and the third meeting attendee performs the information-push touch gesture on a second information object that is displayed on a display screen of the handheld computing device, the computer will receive a copy of the second information object from the handheld computing device, and will forward the copy of the second information object to the personal computing device.
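The transfer modes described above share one pattern: a pointing condition enables a mode on the central computer for as long as the condition holds, and an information-push touch gesture forwards a copy only while the mode is enabled. A minimal sketch of that pattern follows; all names (Relay, point_at, release, push) are illustrative assumptions, not terms from the claims.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Relay:
    """Central computer relaying information-object copies between devices."""
    mode_enabled: bool = False
    target_device: Optional[str] = None          # device of the attendee pointed at
    delivered: List[Tuple[str, str]] = field(default_factory=list)

    def point_at(self, target_device: str) -> None:
        # Condition identified: the sender forms the pointing posture at a
        # peer, so the transfer mode is enabled for as long as it holds.
        self.mode_enabled = True
        self.target_device = target_device

    def release(self) -> None:
        # Posture no longer held: the transfer mode is disabled with it.
        self.mode_enabled = False
        self.target_device = None

    def push(self, info_object: str) -> bool:
        # Information-push touch gesture: forward a copy only while enabled.
        if not self.mode_enabled or self.target_device is None:
            return False
        self.delivered.append((self.target_device, info_object))
        return True

relay = Relay()
relay.point_at("attendee2-device")
assert relay.push("task-list") is True    # copy forwarded while posture held
relay.release()
assert relay.push("budget") is False      # ignored once the mode is disabled
```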
- The specific features, aspects, and advantages of the information sharing democratization technique embodiments described herein will become better understood with regard to the following description, appended claims, and accompanying drawings where:
- FIG. 1 is a diagram illustrating an exemplary embodiment, in simplified form, of a meeting space framework for implementing the information sharing democratization technique embodiments described herein.
- FIG. 2 is a diagram illustrating an exemplary embodiment, in simplified form, of a system framework for implementing the information sharing democratization technique embodiments described herein.
- FIGS. 3A-3C are diagrams illustrating one embodiment, in simplified form, of information sharing democratization techniques for allowing a meeting audience member to remotely point at a public display device and remotely manipulate an information object that is displayed on this device.
- FIGS. 4A-4C are diagrams illustrating another embodiment, in simplified form, of information sharing democratization techniques for allowing a meeting audience member to remotely point at the public display device and remotely manipulate an information object that is displayed on this device.
- FIGS. 5A-5C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting audience member to remotely execute actions on the public display device or remotely execute commands in an application that is running on a central computing device.
- FIGS. 6A-6C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting audience member who is using a touch-enabled handheld computing device to remotely transfer a copy of an information object from the handheld computing device to the public display device.
- FIGS. 7A-7C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting presenter to locally interact with information objects that were previously transferred to a touch-enabled public display device by a meeting audience member.
- FIGS. 8A-8C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting audience member who is using either a tablet computer or a touch-enabled laptop computer to remotely transfer a copy of an information object from the public display device to the tablet/laptop computer.
- FIGS. 9A and 9B are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting audience member who is using a handheld computing device to transiently and publicly share the current contents of its display screen with all the other meeting attendees.
- FIG. 10 is a diagram illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing two or more meeting presenters to locally, independently and concurrently manipulate the information that is displayed on the touch-enabled public display device.
- FIGS. 11A-11C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting presenter to transfer an information object from the touch-enabled public display device to a tertiary display region.
- FIGS. 12A-12C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting presenter to add information content to the touch-enabled public display device using a posture palette.
- FIG. 13 is a diagram illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting presenter to control an explicit touch gesture mode on the touch-enabled public display device by using a prescribed posture.
- FIG. 14 is a diagram illustrating exemplary embodiments, in simplified form, of workflow templates which can be displayed on the public display device and used by meeting attendees to organize a set of information objects which are also displayed on this device.
- FIGS. 15A-15C are diagrams illustrating an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a meeting presenter to use a three buckets template to organize the set of information objects.
- FIG. 16 is a flow diagram illustrating one embodiment, in simplified form, of a process for democratizing information sharing during a co-located group meeting in a meeting space.
- FIG. 17 is a flow diagram illustrating another embodiment, in simplified form, of a process for democratizing information sharing during a co-located group meeting in a meeting space.
- FIGS. 18A and 18B are flow diagrams illustrating yet another embodiment, in simplified form, of a process for democratizing information sharing during a co-located group meeting in a meeting space.
- FIG. 19 is a diagram illustrating a simplified example of a general-purpose computer system on which various embodiments and elements of the information sharing democratization technique, as described herein, may be implemented.
- In the following description of information sharing democratization technique embodiments reference is made to the accompanying drawings which form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the information sharing democratization technique can be practiced. It is understood that other embodiments can be utilized and structural changes can be made without departing from the scope of the information sharing democratization technique embodiments.
- The term “portable personal computing device” is used herein to refer to a networking-enabled computing device that is carried by a meeting attendee and can be utilized by the meeting attendee during a group meeting that is being held with other meeting attendees. The term “touch-enabled” when applied to a device is used herein to indicate that the device includes a touch-sensitive display screen which can detect the presence and location of one or more touches or touch-based movements thereon, where these touches/movements can be made with hands, passive objects (such as a stylus or the like), and the like. The information sharing democratization technique embodiments described herein are operational with any type of touch-enabled portable personal computing device, examples of which include, but are not limited to, touch-enabled handheld computing devices (such as smartphones, personal digital assistants, and the like), tablet computers (which by definition are touch-enabled), and touch-enabled laptop computers (also known as notebook computers). The term “information object” is used herein to refer to a particular item of online (e.g., digital) information content, or various types of iconic representations thereof, which can include one or more different types of digital information such as text, images, animations, audio, video, web hyperlinks, and the like. The term “copy” when applied to an information object (e.g., a copy of an information object) is used herein to refer to either a bit-for-bit replication of the information object (e.g., a file copy), or a user-selectable link to the information object (e.g., a Uniform Resource Locator (URL)).
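By way of illustration only, the two “copy” forms just defined (a bit-for-bit replication, or a user-selectable link) can be sketched as follows, assuming a simple dictionary representation of an information object; the names and fields here are illustrative, not drawn from the patent.

```python
def make_copy(info_object: dict, as_link: bool) -> dict:
    """Produce either a user-selectable link to the information object
    or a bit-for-bit replica of its content."""
    if as_link:
        return {"kind": "link", "url": info_object["url"]}
    return {"kind": "replica", "bytes": bytes(info_object["bytes"])}

doc = {"url": "http://example.com/spec.pdf", "bytes": b"%PDF-1.4"}
print(make_copy(doc, as_link=True))   # {'kind': 'link', 'url': 'http://example.com/spec.pdf'}
```

A link copy is cheap to transmit to another attendee's device, while a replica remains usable if the original object is later removed from the central computing device.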
- Generally speaking, the information sharing democratization technique embodiments described herein involve supporting co-located, collaborative group meetings in a meeting space by democratizing access to, control of, manipulation of, and sharing of information objects across one or more touch-enabled portable personal computing devices and a public display device. Such meetings can be attended by any group of two or more meeting attendees who are physically co-located together in the meeting space and wish to discuss and share information in a collaborative and interactive manner. One example of such a group of meeting attendees is the aforementioned project team that routinely holds a group meeting to discuss a project they are working on. Exemplary types of project teams include a software development team and a marketing program development team, among others.
- During project team group meetings the meeting attendees generally work together to sort, filter, edit and categorize collections of information objects. The formal structure of these meetings is often punctuated by moments of open-ended discussion and white-boarding (sometimes collectively referred to as “brainstorming”). These meetings are commonly held in a meeting space (such as a conference room or the like) having a public display device which is viewable by all the meeting attendees. The information objects are commonly stored on a central computing device which is connected to and operates cooperatively with the public display device, where the central computing device renders the objects and displays them on the public display device.
- Generally speaking and as will be appreciated from the more detailed description that follows, the information sharing democratization technique embodiments described herein are based on a skeletal tracking user interface which allows the meeting attendees to employ various multi-modal user interaction methods which seamlessly span different modalities and devices, and which accomplish the aforementioned sharing of information objects. These multi-modal user interaction methods thus create cross-device interactions. The information sharing democratization technique embodiments employ various types of in-air gestures performed by the meeting attendees, along with various types of postures formed by the meeting attendees, for social disclosure of commands, targeting and mode setting. The information sharing democratization technique embodiments also employ various types of direct-touch input (herein also referred to as “touch gestures”) performed by the meeting attendees for command selection/execution and precise gestures.
- The information sharing democratization technique embodiments described herein are advantageous for various reasons including, but not limited to, the following. The information sharing democratization technique embodiments provide for a collaborative, three-dimensional design space which supports proxemic interactions (i.e., proximity-aware, physically situated experiences) amongst a plurality of meeting attendees in a meeting space framework that includes the public display device and a plurality of heterogeneous, touch-enabled portable personal computing devices. The information sharing democratization technique embodiments are simple, fluid, intuitive, robust, and make common information sharing tasks more democratically accessible. The information sharing democratization technique embodiments allow each meeting attendee to interact with the public display device from anywhere in the meeting space and with any touch-enabled portable personal computing device they bring to the meeting. The information sharing democratization technique embodiments are socially acceptable in a group meeting context and do not cause embarrassment of or distraction to the meeting attendees. More particularly and by way of example but not limitation, the information sharing democratization technique embodiments do not employ gestures based on large hand/arm/body motions (such as arm/hand waving, among others), or unusual gestures, or uncomfortable gestures, or gestures that could interfere with communication between the meeting attendees.
- The information sharing democratization technique embodiments described herein manifest to the meeting attendees which particular attendee is interacting with the public display device at any given point in time, and which particular attendees are interacting with each other at any given point in time, thus creating a group awareness of such interactions. It will be appreciated that without such manifestations, it could be unsettling to the meeting attendees (and thus disruptive to the meeting) to see information objects being remotely manipulated on the public display device without knowing which meeting attendee is doing the manipulation. The information sharing democratization technique embodiments do not rely upon the use of speech which can be ambiguous and can disrupt the natural conversations which take place between the meeting attendees. It is noted however that alternate embodiments of the information sharing democratization technique are possible which can selectively incorporate the use of speech. The information sharing democratization technique embodiments utilize simple grammars and each modality that is employed in these embodiments has a separate use, thus making these embodiments easy to learn and reducing the potential for errors.
- In contrast to the conventional group meeting paradigm in which one of the meeting attendees serves as a presenter who centrally controls the dissemination of the information objects during a group meeting that is being held in a meeting space (i.e., the presenter controls which information objects are displayed on the public display device at each point in time), the information sharing democratization technique embodiments described herein provide democratic access to the information objects. In other words, the information sharing democratization technique embodiments generally allow the meeting attendees to remotely share relevant information objects with each other, and also remotely interact with and share relevant information objects with the public display device, all in a very low-overhead manner which does not interrupt the meeting's flow or distract from the meeting's subject matter. The information sharing democratization technique embodiments also effectively manage contention for the public display device by using touch gestures, in-air gestures and postures together in hybrid interactions. More particularly, the information sharing technique embodiments use skeletal tracking of simple and familiar motions to specify modes and operands, and use touch input to confirm and complete actions and commands.
- More particularly and by way of example but not limitation, during a group meeting the information sharing democratization technique embodiments allow any meeting attendee to freely disseminate one or more relevant information objects, which are stored on a touch-enabled portable personal computing device they bring to the meeting, either by transmitting the information objects to the public display device so they can be publicly viewed by all of the other meeting attendees, or by transmitting the information objects to the personal computing device of another meeting attendee so they can be privately viewed by such attendee. The information sharing democratization technique embodiments also allow any meeting attendee to freely download desired information objects from the public display device in the meeting space to a touch-enabled portable personal computing device they bring to the meeting. Such transmissions of information objects to, and downloads of information objects from, the public display device can be accomplished without the meeting attendees having to switch which computing device is connected to the public display device.
- As will also be appreciated from the more detailed description that follows, rather than employing specialized remote controllers which can be expensive and can get lost, the information sharing democratization technique embodiments described herein employ controller-less in-air gestures and postures which are performed by the meeting attendees. The information sharing democratization technique embodiments also allow the meeting attendees to optionally use any touch-enabled portable personal computing device they bring to the meeting as a remote controller.
-
FIG. 1 illustrates an exemplary embodiment, in simplified form, of a meeting space framework for implementing the information sharing democratization technique embodiments described herein. As exemplified in FIG. 1, a plurality of meeting attendees 102/104/106/108 are physically co-located together in a meeting space 100 for the purpose of holding a group meeting. One or more of the meeting attendees 102 may act as a presenter during the meeting, and this role of presenter may be assumed by different meeting attendees during the course of the meeting. Generally speaking and as described heretofore, the meeting attendees may bring various types of touch-enabled portable personal computing devices to the meeting. More particularly and as exemplified in FIG. 1, one meeting attendee 104 may bring a tablet computer 116 to the meeting. Another meeting attendee 108 may bring a touch-enabled handheld computing device 118 to the meeting. Yet another meeting attendee 106 may bring a touch-enabled laptop computer 120 to the meeting. A gesture identification application runs on each touch-enabled portable personal computing device that is brought to the meeting, where this application serves to identify any touch gestures that are performed on the display screen of the device.
- Referring again to FIG. 1, the meeting space 100 includes a public display device 110 which is connected to and operates cooperatively with a central computing device (not shown). A collection of information objects being discussed during the meeting are stored on the central computing device. A selected one or ones of these stored information objects 122 are rendered by the central computing device and displayed on the public display device 110 for public viewing by all of the meeting attendees 102/104/106/108. In the meeting space framework embodiment exemplified in FIG. 1 the public display device 110 is mounted on a wall of the meeting space 100. It is noted that alternate embodiments of the meeting space framework (not shown) are possible where the public display device can be mounted and positioned in the meeting space in any other manner which makes it publicly viewable by all of the meeting attendees.
- The public display device can optionally be touch-enabled. In other words, the public display device can optionally include a touch-sensitive display screen which can detect the presence and location of one or more touches or touch gestures thereon, where these touch gestures can be made with either hands or passive objects (such as a stylus or the like), among other things. Whenever the central computing device is connected to a touch-enabled public display device, a gesture identification application running on the central computing device will operate cooperatively with the public display device to identify any touch gestures that are performed on its display screen.
- Generally speaking, the meeting space also includes a plurality of object sensing devices each of which also is connected to and operates cooperatively with the central computing device. More particularly, in the meeting space framework embodiment exemplified in FIG. 1 the meeting space 100 also includes a presenter-oriented object sensing device 114 which is physically located in the meeting space such that this sensing device 114 operates cooperatively with a skeletal tracking application running on the central computing device to identify the current physical location of any meeting attendees who are within a prescribed distance (e.g., ten feet) of the public display device 110 (e.g., attendee 102, hereafter simply referred to as “presenters”), and also identify the in-air gestures and postures performed by such attendees, among other things. The meeting space 100 also includes an audience-oriented object sensing device 112 which is physically located in the meeting space 100 such that this sensing device 112 operates cooperatively with the skeletal tracking application running on the central computing device to identify the current physical location of any other meeting attendees who are not within the prescribed distance of the public display device 110 (e.g., attendees 104/106/108, hereafter simply referred to as “audience members”), and also identify the in-air gestures and postures performed by such attendees, among other things. In the case where a given gesture or posture performed by a given meeting attendee involves a portable personal computing device (various examples of which are provided hereafter), the audience-oriented object sensing device 112 also operates cooperatively with the skeletal tracking application to identify the device and associate it with the gesture/posture. It will be appreciated that the portable personal computing device can be identified using various methods such as either visual tagging, or infrared beacons which are transmitted from the device, or low-power radio signaling (e.g., Bluetooth) which emanates from the device, among others.
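The presenter/audience split just described can be illustrated with a small sketch. The ten-foot threshold comes from the example in the text; the 2-D floor coordinates, display position, and function names are assumptions made purely for illustration.

```python
import math

PRESCRIBED_DISTANCE_FT = 10.0        # prescribed distance from the text (e.g., ten feet)
DISPLAY_POSITION = (0.0, 0.0)        # assumed floor coordinates of the public display

def classify_attendees(positions: dict) -> tuple:
    """Split tracked attendees into presenters and audience members.

    positions: {attendee_id: (x, y)} floor coordinates in feet.
    Returns (presenters, audience) lists of attendee ids.
    """
    presenters, audience = [], []
    for attendee, (x, y) in positions.items():
        dist = math.hypot(x - DISPLAY_POSITION[0], y - DISPLAY_POSITION[1])
        (presenters if dist <= PRESCRIBED_DISTANCE_FT else audience).append(attendee)
    return presenters, audience

presenters, audience = classify_attendees(
    {"alice": (3.0, 4.0), "bob": (12.0, 9.0), "carol": (6.0, 8.0)}
)
print(presenters, audience)  # ['alice', 'carol'] ['bob']  (5 ft, 10 ft vs. 15 ft)
```

Re-running this classification per tracking frame lets the role of “presenter” migrate between attendees as they move around the meeting space, as the framework intends.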
- In an exemplary embodiment of the information sharing democratization technique described herein, the skeletal tracking application performs hand posture recognition using a conventional heuristic method that computes the average number of radial gaps between fingers on a hand. Alternate embodiments of the information sharing democratization technique are also possible which use other methods such as optical flow and statistical inference, among others. Alternate embodiments of the information sharing democratization technique are also possible which recognize other types of postures.
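A hedged sketch of a radial-gap heuristic of the kind mentioned above follows. The angle threshold, the fingertip-angle input representation, and the per-frame averaging are illustrative assumptions, not details taken from the patent.

```python
def radial_gaps(fingertip_angles_deg: list, min_gap_deg: float = 25.0) -> int:
    """Count angular gaps between adjacent fingertips wider than min_gap_deg.

    fingertip_angles_deg: angles (degrees) of detected fingertips measured
    radially around the palm center.
    """
    angles = sorted(fingertip_angles_deg)
    return sum(1 for a, b in zip(angles, angles[1:]) if b - a >= min_gap_deg)

def average_gaps(frames: list, min_gap_deg: float = 25.0) -> float:
    """Average the gap count over several tracking frames to reduce noise."""
    return sum(radial_gaps(f, min_gap_deg) for f in frames) / len(frames)

# A spread hand shows wide gaps between all five fingers; a pointing hand
# with a single extended finger cluster shows almost none.
spread_hand = [0, 30, 60, 90, 120]
pointing_hand = [85, 90]
print(radial_gaps(spread_hand), radial_gaps(pointing_hand))  # 4 0
```

Averaging over frames makes the recognized posture stable against single-frame tracking noise, which matters when a posture must be held to keep a mode enabled.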
- Referring again to FIG. 1, the object sensing devices 112 and 114 in the meeting space 100 can be implemented in various ways. In one embodiment of the information sharing democratization technique each object sensing device is an infrared projector combined with an infrared camera that is matched to the infrared projector. In another embodiment of the information sharing democratization technique each object sensing device is an ultrasound transmitter combined with an ultrasound receiver that is matched to the ultrasound transmitter. In yet another embodiment of the information sharing democratization technique each object sensing device is a pair of visible light video cameras (also known as RGB (red/green/blue) video cameras) which operate together as a stereo video camera. In yet another embodiment of the information sharing democratization technique each object sensing device is just a single visible light video camera. Additional embodiments of the information sharing democratization technique are also possible where each object sensing device can include various combinations of the infrared projector and matching infrared camera, the ultrasound transmitter and matching ultrasound receiver, the pair of visible light video cameras, and the single visible light video camera. Additional embodiments of the information sharing democratization technique are also possible where the different object sensing devices are implemented in different ways.
- Referring again to FIG. 1, the meeting space 100 can optionally also include an optical projection device 124 which also is connected to and operates cooperatively with the central computing device. The optical projection device 124 is physically located in the meeting space such that this projection device operates cooperatively with a tertiary display application running on the central computing device to optically project a tertiary display region (not shown) onto a prescribed location on an optically-reflective surface in the meeting space that is near the public display device (hereafter simply referred to as a “tertiary surface”). The tertiary display region can thus be viewed by all of the meeting attendees. In the meeting space framework embodiment exemplified in FIG. 1 the optical projection device 124 is mounted on the ceiling of the meeting space 100, the tertiary surface is a wall in the meeting space upon which the public display device is mounted, and the prescribed location is immediately above the public display device. In an alternate embodiment of the information sharing democratization technique the tertiary surface can be a portable projector screen (or the like) and the prescribed location can be either immediately to the right or to the left of the public display device, among other places.
-
FIG. 2 illustrates an exemplary embodiment, in simplified form, of a system framework for implementing the information sharing democratization technique embodiments described herein. As exemplified in FIG. 2, the public display device 200, the presenter-oriented object sensing device 202, the audience-oriented object sensing device 204, the central computing device 206, and the touch-enabled portable personal computing devices 208 are interconnected via a data communications network 212. This network can be implemented as either a conventional wired local area network (such as Ethernet, or the like), or a conventional wireless local area network (such as Wi-Fi, or the like), or a combination thereof. The optical projection device 210 is connected directly to the central computing device 206 via a video connection 214 such as either a composite video connection, or an S-video connection, or an RGB video connection, or the like. - 1.3 Remotely Interacting with Public Display Device
- This section describes exemplary embodiments of information sharing democratization techniques for allowing the audience members to remotely interact with the public display device.
- 1.3.1 Pointing and Dragging with Arm
-
FIGS. 3A-3C illustrate one embodiment, in simplified form, of information sharing democratization techniques for allowing an audience member to remotely point at the public display device and remotely manipulate an information object that is displayed on this device. As exemplified in FIG. 3A, whenever a condition occurs where an audience member forms a remote location indicator posture 300 and remotely points 302 this posture at the public display device 304, the skeletal tracking application running on the central computing device will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will enable a remote location indicator mode and display a location indicator 306 on the public display device, where the position of the location indicator on the public display device corresponds to where the remote location indicator posture is currently being pointed. Generally speaking, the remote location indicator posture 300 can be implemented using any type of posture that is recognizable by the combination of the audience-oriented object sensing device and skeletal tracking application, and can be differentiated from the other postures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIG. 3A the remote location indicator posture 300 is implemented as an arm 318 of the audience member being extended away from their body and one or more fingers 320 of the arm being pointed away from their body in the same general direction as the arm. - In the particular information sharing democratization technique embodiment exemplified in
FIG. 3A the location indicator 306 is implemented as a cursor. Alternate information sharing democratization technique embodiments (not shown) are also possible where the location indicator is implemented in other ways. By way of example but not limitation, the location indicator can be implemented as a static spotlight that indicates a general area of the public display device's screen. The location indicator can also be implemented as an overlay that indicates a prescribed portion (e.g., a quadrant) of the public display device's screen. The location indicator can also be implemented as a colored border around such a prescribed portion of the public display device's screen. - Referring again to
FIG. 3A, the central computing device will remain in the remote location indicator mode and continue to display the location indicator 306 on the public display device 304 for as long as the audience member maintains the remote location indicator posture 300 and points 302 it at the public display device. The audience member can thus use the remote location indicator posture 300 to remotely and precisely point at either a desired location on the public display device 304 or a particular information object 314 that is displayed on the public display device. - Referring again to
FIG. 3A, whenever the remote location indicator mode is enabled and the audience member changes where on the public display device 304 their remote location indicator posture 300 is currently being pointed 302, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to track these changes, and the central computing device will move the location indicator 306 on the public display device accordingly (thus allowing the audience member to remotely move the location indicator on the public display device). Whenever the remote location indicator mode is enabled and another condition occurs where the audience member either stops forming the remote location indicator posture 300 or points it away from the public display device 304, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the remote location indicator mode and remove the location indicator 306 from the public display device. - As exemplified in
FIGS. 3B and 3C, whenever another condition occurs where an audience member forms a remote dragging posture 308 and remotely points 310 this posture at the public display device 304, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will enable a remote dragging mode and display a dragging cursor 312 on the public display device, where the position of the dragging cursor on the public display device corresponds to where this posture is currently being pointed. Generally speaking, the remote dragging posture 308 can be implemented using any type of posture that is recognizable by the combination of the audience-oriented object sensing device and skeletal tracking application, and can be differentiated from the other postures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIGS. 3B and 3C the remote dragging posture 308 is implemented as an arm 318 of the audience member being extended away from their body with all of the fingers 322 of the arm being bunched together and the palm of the hand being approximately perpendicular to the floor of the meeting space. The central computing device will remain in the remote dragging mode and continue to display the dragging cursor 312 on the public display device 304 for as long as the audience member maintains the remote dragging posture 308 and points 310/316 it at the public display device. - Referring again to
FIGS. 3B and 3C, whenever the remote dragging mode is enabled, and the audience member points 310 their remote dragging posture 308 at a particular information object 314 that is displayed on the public display device 304 and then changes where on the public display device this posture is being pointed 316, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to track these changes, and the central computing device will move the information object on the public display device accordingly (thus allowing the audience member to remotely move the information object on the public display device). Whenever the remote dragging mode is enabled and another condition occurs where the audience member either stops forming the remote dragging posture 308 or points it away from the public display device 304, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the remote dragging mode and remove the dragging cursor 312 from the public display device. - In the case where the audience-oriented and presenter-oriented object sensing devices have a low sensing resolution, the center point of the public display device can be calibrated to be aligned in absolute coordinates using conventional methods. Outward from this center point a gain factor having a value of less than one can be applied by the central computing device to create greater pointing precision. In the context of the information sharing democratization technique embodiments described herein the term “gain factor” refers to a ratio of physical movement in real space to how far an object being displayed on a display device (such as a location indicator or the like) will move in relation to the physical movement.
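By way of further illustration only (not part of the embodiments above), the calibrated-center-plus-gain mapping just defined could be sketched as follows. The pixel scale, function and parameter names, and the precision-oriented reading of the gain (a gain below one makes the on-screen indicator move less than the hand does) are illustrative assumptions.

```python
def indicator_position(hand_offset_m, center_px, gain=0.5, px_per_m=2000.0):
    """Map a hand offset in meters, measured relative to the calibrated
    center ray of the public display, to indicator pixel coordinates.

    With a low-resolution sensor a gain below one scales the on-screen
    motion down relative to the physical motion, trading reach for pointing
    precision; a gain of 1.0 corresponds to absolute pointing.  The
    px_per_m scale relating physical movement to pixels is an assumption.
    """
    dx_m, dy_m = hand_offset_m
    cx, cy = center_px
    return (cx + dx_m * px_per_m * gain,
            cy - dy_m * px_per_m * gain)  # screen y grows downward
```

For example, with the default gain of 0.5, moving the hand 0.1 m to the right of the calibrated center moves the indicator only 100 pixels from the screen center rather than 200.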
In the case where the audience-oriented and presenter-oriented object sensing devices have a greater sensing resolution, absolute pointing (e.g., a gain factor of one) can be employed by the central computing device. In the case where the audience-oriented and presenter-oriented object sensing devices are implemented as an infrared projector combined with an infrared camera that is matched to the infrared projector in order to produce an ongoing series of depth maps as described heretofore, and these sensing devices are intended to recognize hand postures, the skeletal tracking application running on the central computing device can segment the hand, and then track any movement of the hand by averaging the depth maps in a prescribed radius surrounding the segmented hand.
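The hand-tracking step just described, averaging the depth maps in a prescribed radius surrounding the segmented hand, might look like the following sketch. The plain list-of-rows depth map representation, the radius value, and the function name are assumptions for illustration.

```python
def track_hand(depth_map, hand_pixels, radius=2):
    """Given the (x, y) pixel coordinates of the segmented hand, average
    the depth map within a prescribed radius around the hand's centroid.

    Averaging over the window (rather than reading a single pixel) smooths
    per-pixel sensor noise from frame to frame, so the tracked hand depth
    changes steadily as the hand moves.  depth_map is a list of rows of
    depth values; hand_pixels comes from a prior hand-segmentation step.
    """
    cx = round(sum(x for x, _ in hand_pixels) / len(hand_pixels))
    cy = round(sum(y for _, y in hand_pixels) / len(hand_pixels))
    window = [depth_map[y][x]
              for y in range(max(0, cy - radius),
                             min(len(depth_map), cy + radius + 1))
              for x in range(max(0, cx - radius),
                             min(len(depth_map[0]), cx + radius + 1))]
    return (cx, cy), sum(window) / len(window)
```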
- 1.3.2 Pointing and Dragging with Arm and Hand-Held Computing Device
-
FIGS. 4A-4C illustrate another embodiment, in simplified form, of information sharing democratization techniques for allowing an audience member to remotely point at the public display device and remotely manipulate an information object that is displayed on this device. As exemplified in FIG. 4A, whenever a condition occurs where an audience member who is using a touch-enabled handheld computing device 400 forms a device pointing posture 402 with this device and remotely points 404 this posture at the public display device 406, the skeletal tracking application running on the central computing device will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will enable a device pointing mode and display the location indicator 412 on the public display device, where the position of the location indicator on the public display device corresponds to where this posture is currently being pointed. Generally speaking, the device pointing posture 402 can be implemented using any type of posture that is recognizable by the combination of the audience-oriented object sensing device and skeletal tracking application, and can be differentiated from the other postures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIG. 4A the device pointing posture 402 is implemented as an arm 408 of the audience member being extended away from their body with the handheld computing device 400 being held in the hand 410 of the arm and being pointed away from their body in the same general direction as the arm. - Referring again to
FIG. 4A, the central computing device will remain in the device pointing mode and continue to display the location indicator 412 on the public display device 406 for as long as the audience member maintains the device pointing posture 402 and points 404 it at the public display device. The audience member can thus use the device pointing posture 402 to remotely and precisely point at either a particular location on the public display device 406 or a particular information object 414 that is displayed on the public display device. - Referring again to
FIG. 4A, whenever the device pointing mode is enabled and the audience member changes where on the public display device 406 their device pointing posture 402 is being pointed 404, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to track these changes and the central computing device will move the location indicator 412 on the public display device accordingly (thus allowing the audience member to remotely move the location indicator on the public display device). Whenever the device pointing mode is enabled and another condition occurs where the audience member either stops forming the device pointing posture 402 or points it away from the public display device 406 (e.g., whenever the audience member puts the handheld computing device 400 down, among other things), the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the device pointing mode and remove the location indicator 412 from the public display device. - As exemplified in
FIGS. 4B and 4C, whenever another condition occurs where an audience member who is using a touch-enabled handheld computing device 400 forms a device dragging posture 416 with this device and remotely points 418 this posture at the public display device 406, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will enable a device dragging mode and display the dragging cursor 424 on the public display device, where the position of the dragging cursor on the public display device corresponds to where this posture is being pointed. Generally speaking, the device dragging posture 416 can be implemented using any type of posture that is recognizable by the combination of the audience-oriented object sensing device and skeletal tracking application, and can be differentiated from the other postures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIGS. 4B and 4C the device dragging posture 416 is implemented as an arm 408 of the audience member being extended away from their body with the handheld computing device 400 being held in the hand of the arm and being pointed away from their body in the same general direction as the arm, and a finger 420 of the arm/hand pressing a dragging icon (not shown) that is being displayed on the display screen 422 of the handheld computing device. The central computing device will remain in the device dragging mode and continue to display the dragging cursor 424 on the public display device 406 for as long as the audience member maintains the device dragging posture 416 and points 418/426 it at the public display device 406. - Referring again to
FIGS. 4B and 4C, whenever the device dragging mode is enabled, and the audience member points 418 their device dragging posture 416 at a particular information object 414 that is displayed on the public display device 406 and then changes where on the public display device this posture is being pointed 426, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to track these changes and the central computing device will move the information object on the public display device accordingly (thus allowing the audience member to remotely move the information object on the public display device). Whenever the device dragging mode is enabled and another condition occurs where the audience member either stops forming the device dragging posture 416 or points it away from the public display device 406 (e.g., whenever the audience member puts the handheld computing device 400 down, or stops pressing their finger 420 on the dragging icon, among other things), the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the device dragging mode and remove the dragging cursor 424 from the public display device. - It is noted that rather than the device dragging posture being formed uni-manually (i.e., with the same hand that is holding the touch-enabled handheld computing device being used to press the dragging icon) as just described, an alternate embodiment of the information sharing democratization technique described herein is also possible where the device dragging posture is formed bi-manually (i.e., with a different hand than the one that is holding the handheld computing device being used to press the dragging icon).
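The arm-only modes of FIGS. 3A-3C and the device-held modes of FIGS. 4A-4C all follow the same posture-driven pattern: the mode turns on while the posture is formed and pointed at the public display device, its cursor follows the pointed-at position, and the mode turns off (and the cursor is removed) the moment either condition stops holding. A minimal sketch of that shared pattern, with illustrative callback names standing in for the central computing device's display updates, could look like this:

```python
class PostureDrivenMode:
    """Shared enable/track/disable logic for the remote location indicator,
    remote dragging, device pointing, and device dragging modes.

    on_show is called when the mode is first enabled (display the cursor),
    on_move while it stays enabled (follow the pointed-at position), and
    on_hide when the posture is dropped or pointed away (remove the cursor).
    """

    def __init__(self, on_show, on_move, on_hide):
        self.on_show, self.on_move, self.on_hide = on_show, on_move, on_hide
        self.enabled = False

    def update(self, posture_formed, pointed_at_display, screen_pos=None):
        """Call once per skeletal-tracking frame."""
        if posture_formed and pointed_at_display:
            if not self.enabled:
                self.enabled = True
                self.on_show(screen_pos)   # enable mode, display cursor
            else:
                self.on_move(screen_pos)   # track pointing changes
        elif self.enabled:
            self.enabled = False
            self.on_hide()                 # disable mode, remove cursor
```

Each concrete mode would differ only in which posture test feeds `posture_formed` and in what its callbacks draw (a location indicator, a dragging cursor, or a dragged information object).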
Additionally, it will be appreciated that the use of the touch-enabled handheld computing device to remotely manipulate information on the public display device is advantageous since it is both direct and precise.
- 1.3.3 Annotating Temporarily with Arm and Hand-Held Computing Device
- This section describes an exemplary embodiment of an information sharing democratization technique for allowing an audience member to remotely draw one or more temporary annotation marks on the public display device. As will be appreciated from the more detailed description that follows, this technique embodiment is advantageous since it allows any audience member to visually accentuate features remotely on the public display device.
- Whenever a condition occurs where an audience member who is using a touch-enabled handheld computing device forms a device annotating posture with this device and remotely points this posture at the public display device, the skeletal tracking application running on the central computing device will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will enable a remote annotation mode. Generally speaking, the device annotating posture can be implemented using any type of posture that is recognizable by the combination of the audience-oriented object sensing device and skeletal tracking application, and can be differentiated from the other postures described herein. By way of example but not limitation, in an exemplary embodiment of the information sharing democratization technique described herein the device annotating posture is implemented as an arm of the audience member being extended away from their body with the handheld computing device being held in the hand of the arm and being pointed away from their body in the same general direction as the arm, and a finger of the arm/hand pressing an ink icon that is being displayed on the display screen of the handheld computing device.
- The central computing device will remain in the remote annotation mode for as long as the audience member maintains the device annotating posture and points it at the public display device. Whenever the remote annotation mode is enabled and the audience member changes where on the public display device their device annotating posture is being pointed, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to track these changes and the central computing device will draw one or more annotation marks (such as arrows, lassos, and underlines, among others) on the public display device according to these changes (thus allowing the audience member to remotely annotate the public display device). Whenever the remote annotation mode is enabled and another condition occurs where the audience member either stops forming the device annotating posture or points it away from the public display device (e.g., whenever the audience member either puts the handheld computing device down, or stops pressing their finger on the ink icon, among other things), the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the remote annotation mode and remove the annotation marks from the public display device.
- It is noted that rather than the device annotating posture being formed uni-manually (i.e., with the same hand that is holding the touch-enabled handheld computing device being used to press the ink icon) as just described, an alternate embodiment of the information sharing democratization technique described herein is also possible where the device annotating posture is formed bi-manually (i.e., with a different hand than the one that is holding the handheld computing device being used to press the ink icon).
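The temporary nature of the remote annotation mode (marks accumulate while the posture is held and are all removed when the mode is disabled) can be sketched as follows; the stroke representation and class name are illustrative assumptions.

```python
class RemoteAnnotationMode:
    """Collects annotation strokes while the device annotating posture is
    held and pointed at the public display; because the marks are
    temporary, they are all removed as soon as the mode is disabled."""

    def __init__(self):
        self.enabled = False
        self.strokes = []  # each stroke is a list of (x, y) screen points

    def update(self, posture_held, pointed_at_display, screen_pos=None):
        """Call once per tracking frame with the current posture state."""
        active = posture_held and pointed_at_display
        if active and not self.enabled:
            self.enabled = True
            self.strokes.append([screen_pos])    # mode enabled: new stroke
        elif active:
            self.strokes[-1].append(screen_pos)  # draw along pointing path
        elif self.enabled:
            self.enabled = False
            self.strokes.clear()                 # temporary marks disappear
```

A renderer on the central computing device would redraw `strokes` over the public display each frame, interpreting them as arrows, lassos, underlines, or freeform ink.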
- 1.3.4 Gesturing with Pointing and Touch
-
FIGS. 5A-5C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing an audience member to remotely execute actions on the public display device or remotely execute commands in an application that is running on the central computing device. As will be appreciated from the more detailed description that follows, this particular technique embodiment is advantageous since it prevents accidental command execution due to misinterpreted in-air gestures, and provides tactile response to the audience member. - As exemplified in
FIG. 5A, whenever a condition occurs where an audience member who is using a touch-enabled handheld computing device 508 forms the device pointing posture 500 with this device and remotely points 502 this posture at the public display device 504, the skeletal tracking application running on the central computing device will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will enable the device pointing mode and display the location indicator 506 on the public display device, where the position of the location indicator on the public display device corresponds to where this posture is being pointed. The central computing device will also instruct the handheld computing device 508 being held in the audience member's hand 510 to enable the device pointing mode. The central computing device and handheld computing device 508 will remain in the device pointing mode, and the central computing device will continue to display the location indicator 506 on the public display device 504, for as long as the audience member maintains the device pointing posture 500 and points 502 it at the public display device. - Referring again to
FIG. 5A, whenever the device pointing mode is enabled and another condition occurs where the audience member maintains the device pointing posture 500 and continues to remotely point 502 it at the public display device 504 for a prescribed period of time (e.g., two seconds), the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will transmit a command to the handheld computing device 508 instructing it to display a touch gesture disclosure overlay 512 on its display screen 514. It will be appreciated that this overlay, and the other overlays described hereafter, can be implemented in various ways. In an exemplary embodiment of the information sharing democratization technique described herein these overlays are implemented as semi-transparent overlays. Upon receiving this command from the central computing device, the handheld computing device 508 will display this overlay 512 on its display screen 514. - Referring again to
FIG. 5A, the touch gesture disclosure overlay 512 includes one or more different graphical icons 516 each of which discloses a different rectilinear, mark-based touch gesture that is allowed to be performed by the audience member on the display screen 514 of the touch-enabled handheld computing device 508. The overlay 512 can also include text 518 that is displayed alongside each graphical icon 516, where this text discloses a particular action or command that is associated with each touch gesture. The overlay 512 can optionally also disclose a first spatial operand (not shown) which allows the audience member to select a particular spatial location on the public display device. - It will be appreciated that many different rectilinear, mark-based touch gestures and associated actions/commands are possible. Generally speaking, whenever the device pointing mode is enabled and the audience member performs an allowed touch gesture on the display screen of the handheld computing device, the handheld computing device will transmit a command to the central computing device that is associated with this touch gesture. Upon receiving this command from the handheld computing device the central computing device will execute the command. By way of example but not limitation and as exemplified in
FIGS. 5B and 5C, whenever the audience member performs a pan-right touch gesture 520 on the display screen 514 of the handheld computing device 508, the handheld computing device will transmit a pan-right command to the central computing device. Upon receiving the pan-right command, the central computing device will pan 522 the information that is displayed on the public display device 504 to the right. Similarly, whenever the audience member performs a pan-left touch gesture (not shown) on the display screen of the handheld computing device, the handheld computing device will transmit a pan-left command to the central computing device. Upon receiving the pan-left command, the central computing device will pan the information that is displayed on the public display device to the left. - Generally speaking, the pan-right and pan-left touch gestures can be implemented using any type of rectilinear, mark-based touch gesture that is recognizable by the gesture identification application, and can be differentiated from the other touch gestures described herein. By way of example but not limitation, in the particular information sharing democratization technique embodiment exemplified in
FIG. 5B the pan-right touch gesture 520 is implemented as a flick-right touch gesture (and correspondingly the pan-left touch gesture would be implemented as a flick-left touch gesture). - Whenever the device pointing mode is enabled, the audience member can employ other rectilinear, mark-based touch gestures to execute a variety of commands in an application that is running on the central computing device. It is noted that the rectilinear, mark-based touch gestures (such as the pan-right touch gesture, or the pan-left touch gesture, among others) can be performed either uni-manually (i.e., with the same hand that is holding the touch-enabled handheld computing device) or bi-manually (i.e., with a different hand than the one that is holding the handheld computing device).
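A rectilinear flick classifier of the kind the gesture identification application would need, together with a gesture-to-command table of the sort the disclosure overlay describes, could be sketched as follows. The travel threshold, direction names, and command strings are illustrative assumptions; the flick-to-command pairings follow the examples given in this description (flick-right for pan-right, flick-upward for information-push, flick-downward for information-pull).

```python
def classify_flick(points, min_travel=80):
    """Classify a rectilinear, mark-based touch gesture from a list of
    (x, y) touch samples into one of four flick directions, or None when
    the finger did not travel far enough to count as a deliberate gesture.
    Screen y grows downward, so a flick-upward has a negative dy."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return None  # too small: prevents accidental command execution
    if abs(dx) >= abs(dy):
        return "flick-right" if dx > 0 else "flick-left"
    return "flick-up" if dy < 0 else "flick-down"

# Illustrative mapping from recognized flicks to the commands the handheld
# computing device transmits to the central computing device.
GESTURE_COMMANDS = {
    "flick-right": "pan-right",
    "flick-left": "pan-left",
    "flick-up": "information-push",
    "flick-down": "information-pull",
}
```

Because the central computing device acts only on explicit commands received over the network, a misread in-air posture alone can never trigger one of these actions, which is the tactile-confirmation property noted above.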
- 1.4 Transferring Information Objects
- This section describes exemplary embodiments of information sharing democratization techniques for allowing the meeting attendees to either permanently or temporarily transfer copies of information objects between the public display device and a given touch-enabled portable personal computing device, and between two different touch-enabled portable personal computing devices.
- 1.4.1 Transferring Information Objects with Pointing and Touch
- In an exemplary embodiment of the information sharing democratization technique described herein an audience member who is using a touch-enabled handheld computing device can remotely transfer (i.e., “pull”) a copy of an information object from the public display device to the handheld computing device in the following manner. Whenever the device pointing mode is enabled on the central computing device and handheld computing device as described heretofore, and the audience member remotely points their device pointing posture at a particular information object that is displayed on the public display device, and whenever they then perform an information-pull touch gesture on the display screen of their handheld computing device, the handheld computing device will transmit an information-pull command to the central computing device, where this command requests that a copy of the particular information object be transferred to the handheld computing device. Upon receiving this command from the handheld computing device, the central computing device will transmit a copy of the particular information object to the audience member's handheld computing device. Generally speaking, the information-pull touch gesture can be implemented using any type of rectilinear, mark-based touch gesture that is recognizable by the gesture identification application, and can be differentiated from the other touch gestures described herein. By way of example but not limitation, in an exemplary embodiment of the information sharing democratization technique described herein the information-pull touch gesture is implemented as a flick-downward (i.e., away from the public display device) touch gesture. It is noted that the flick-downward touch gesture can be performed either uni-manually or bi-manually.
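The information-pull exchange just described (the handheld device asks for the object under the location indicator, and the central computing device transmits a copy back) could be sketched on the central-device side as follows; the message fields, function names, and callback are illustrative assumptions.

```python
def handle_information_pull(command, displayed_objects, send_to_device):
    """Central-device handler for an information-pull command.

    command is the message received from the handheld computing device,
    carrying the id of the information object the device pointing posture
    was aimed at and the id of the requesting device.  displayed_objects
    maps object ids to the objects currently shown on the public display.
    send_to_device(device_id, obj) stands in for the network transmission
    back to the handheld device.  Returns True when a copy was sent.
    """
    obj_id = command["object_id"]   # object under the location indicator
    if obj_id not in displayed_objects:
        return False                # nothing was pointed at; ignore
    copy = dict(displayed_objects[obj_id])  # send a copy, not the original
    send_to_device(command["device_id"], copy)
    return True
```

The object on the public display is left untouched; only a copy travels to the handheld device, matching the "pull a copy" behavior described above.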
- After the copy of the particular information object has been received by the audience member's touch-enabled handheld computing device, a scaled-down version of the information object will be displayed on the handheld computing device's display screen at the approximate location where the information-pull touch gesture was performed. At this point, the audience member can stop forming the device pointing posture and can locally interact with the copy of the information object on their handheld computing device at their leisure. A context menu can optionally also be displayed on the handheld computing device's display screen, where this menu informs the audience member of the various operations they can perform on the information object. Exemplary operations can include editing the information object, saving it, and deleting it, among others. The audience member can also locally manipulate the information object on their handheld computing device in other ways. By way of example but not limitation, the audience member can perform conventional pinch touch gestures to resize the information object. It will be appreciated that the context menu can be implemented in various ways. In an exemplary embodiment of the information sharing democratization technique described herein the context menu is implemented as a semi-transparent overlay.
-
FIGS. 6A-6C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing an audience member who is using a touch-enabled handheld computing device to remotely transfer (i.e., “push”) a copy of an information object from the handheld computing device to the public display device. As exemplified in FIG. 6A and described heretofore, whenever the audience member forms the device pointing posture 600 with the handheld computing device 614 and remotely points 602 this posture at the public display device 604, the device pointing mode will be enabled on the central computing device and handheld computing device, and the central computing device will display the location indicator 606 on the public display device, where the position of the location indicator on the public display device corresponds to where this posture is being pointed. As exemplified in FIG. 6B, whenever the device pointing mode is enabled and the audience member remotely points their device pointing posture at a desired location on the public display device, and they then perform an information-push touch gesture 608 on a particular information object 610 that is displayed on the display screen 612 of their handheld computing device 614, the handheld computing device will transmit a copy of the information object to the central computing device. - Generally speaking and referring again to
FIG. 6B, the information-push touch gesture 608 can be implemented using any type of rectilinear, mark-based touch gesture that is recognizable by the gesture identification application and can be differentiated from the other touch gestures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIG. 6B the information-push touch gesture 608 is implemented as a flick-upward (i.e., towards the public display device) touch gesture. It is noted that the information-push touch gesture can be performed either uni-manually or bi-manually. - After the copy of the information object has been received from the handheld computing device by the central computing device, the following things can happen. In one embodiment of the information sharing democratization technique described herein the central computing device will display a scaled-down version of the information object on the public display device, where the information object will be positioned on the public display device at the location of the location indicator at the time the audience member performed the information-push touch gesture. At this point the audience member can stop forming the device pointing posture, and any presenter can locally interact with this information object on the public display device at their leisure. This particular embodiment generally suffices in small group meeting situations, or in situations where the meeting attendees already know each other, since conventional social protocol will dictate permissions for when it is acceptable for a given audience member to transfer a copy of an information object to the public display device.
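The push-receiving behavior just described (display a scaled-down copy of the received object at the position of the location indicator when the gesture was performed) can be sketched as follows; the object representation, scale factor, and the place_on_display callback are illustrative assumptions.

```python
def handle_information_push(info_object, indicator_pos, place_on_display,
                            scale=0.25):
    """Central-device handling of an information-push: place a scaled-down
    copy of the received object on the public display at the location the
    indicator occupied when the push gesture was performed.

    info_object is the object received from the handheld computing device;
    place_on_display(obj, pos) stands in for the public-display rendering.
    The original object dict is left unchanged; only the copy is scaled.
    """
    copy = dict(info_object)
    copy["width"] = info_object["width"] * scale
    copy["height"] = info_object["height"] * scale
    place_on_display(copy, indicator_pos)
    return copy
```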
- In large group meeting situations, or in situations where the meeting attendees don't know each other, the following alternate embodiment of the information sharing democratization technique described herein is possible which is based on a “package metaphor” mode that can be selectively enabled and subsequently disabled on the central computing device by a responsible person. As exemplified in
FIGS. 6B and 6C, whenever the package metaphor mode is enabled, after the copy of the information object 610 has been received from the handheld computing device by the central computing device, the central computing device will display an icon 616 on the public display device 604 at the location of the location indicator 606 at the time the audience member performed the information-push touch gesture 608, where this icon serves as a surrogate for the information object. This icon 616 is hereafter simply referred to as a “package icon.” The package icon 616 can optionally include a first text label (not shown) that specifies which audience member transferred the copy of the information object to the public display device. Whenever the audience member transfers copies of a plurality of information objects in sequence from their touch-enabled handheld computing device to a public display device, the plurality of information objects will be represented by a single package icon (i.e., the plurality of information objects will be grouped into a single “package”) which will include a second text label that specifies the number of different information objects that were transferred. Any presenter can locally interact with the package icon 616 on the public display device at their leisure in various ways including, but not limited to, the following. -
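The package metaphor's grouping behavior — sequential pushes from one audience member collapse into a single package whose icon carries a first label naming the sender and a second label giving the object count — can be sketched as follows; the class and method names are illustrative assumptions:

```python
from collections import defaultdict


class PackageRegistry:
    """Groups the information objects pushed by each audience member into
    a single 'package' and produces the text labels for its package icon."""

    def __init__(self):
        self._packages = defaultdict(list)  # sender name -> pushed objects

    def push(self, sender: str, info_object: str) -> None:
        # Each sequential push from the same sender joins the same package.
        self._packages[sender].append(info_object)

    def icon_labels(self, sender: str) -> tuple:
        objects = self._packages[sender]
        # First label: who transferred the objects; second: how many.
        return (sender, len(objects))


registry = PackageRegistry()
registry.push("John", "chart.png")
registry.push("John", "notes.txt")
registry.push("John", "budget.xlsx")
print(registry.icon_labels("John"))  # ('John', 3)
```

This mirrors the example of FIGS. 7A-7C, where the package icon for "John" carries the count 3.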
FIGS. 7A-7C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a presenter to locally interact with information objects that were previously transferred to the public display device by an audience member. As exemplified in FIG. 7A, a package icon 700 is displayed on the public display device 702, where the package icon includes a first text label 704 that specifies which audience member (in this case John) transferred the information objects to the public display device, and a second text label 706 that specifies the number of different information objects (in this case 3) that were transferred. - As exemplified in
FIG. 7B, whenever a condition occurs where the presenter hovers their hand 708 over the package icon 700 for a prescribed period of time (e.g., two seconds), the skeletal tracking application running on the central computing device will operate cooperatively with the presenter-oriented object sensing device to identify this condition, and a scaled-down preview version 710 of the one or more information objects that were transferred to the public display device 702 will be revealed to the presenter. Whenever another condition occurs where the presenter stops hovering their hand 708 over the package icon 700, the scaled-down preview version 710 of the information objects will stop being revealed. As exemplified in FIG. 7C, whenever the presenter taps 712 on the package icon, the one or more information objects 714 represented by the icon will be displayed on the public display device so these information objects can be publicly viewed by the audience members. - In another exemplary embodiment of the information sharing democratization technique described herein an audience member who is using either a tablet computer or touch-enabled laptop computer (hereafter collectively simply referred to as a touch-enabled non-handheld computing device) can remotely transfer (i.e., push) a copy of an information object from the non-handheld computing device to the public display device in the following manner. Whenever another condition occurs where the audience member forms the remote location indicator posture using one of their arms and remotely points this posture at a desired location on the public display device, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition and also identify that the audience member who is forming this posture is using the non-handheld computing device.
The central computing device will then enable the remote location indicator mode and display the location indicator on the public display device, where the position of the location indicator on the public display device corresponds to where this posture is being pointed. The audience member can thus use the remote location indicator posture to remotely and precisely point at either a desired location on the public display device or a particular information object that is being displayed on the public display device.
- The central computing device will also transmit a command to the non-handheld computing device being used by the audience member instructing it to enable the remote location indicator mode. Upon receiving this command from the central computing device, the non-handheld computing device can optionally display the touch gesture disclosure overlay on its display screen (which serves as a visual indication to the audience member that their non-handheld computing device is now in the remote location indicator mode). The central computing device and non-handheld computing device will remain in the remote location indicator mode, and the central computing device will continue to display the location indicator on the public display device, for as long as the audience member maintains the remote location indicator posture and points it at the public display device.
- Generally speaking, whenever the remote location indicator mode is enabled and the audience member performs an allowed touch gesture on the display screen of their non-handheld computing device, the non-handheld computing device will transmit a command to the central computing device that is associated with this touch gesture. Upon receiving this command from the non-handheld computing device, the central computing device will execute the command. By way of example but not limitation, whenever the audience member uses their other arm to perform the information-push touch gesture on a particular information object that is displayed on the display screen of their non-handheld computing device, the non-handheld computing device will transmit a copy of the information object to the central computing device. After the copy of the information object has been received from the non-handheld computing device by the central computing device, in one embodiment of the information sharing democratization technique described herein the central computing device will display a scaled-down version of the information object on the public display device, and any presenter can interact with this information object as described heretofore. In an alternate embodiment of the information sharing democratization technique where the package metaphor mode has been enabled, the central computing device will display the package icon representing the information object on the public display device, and any presenter can interact with this package icon as also described heretofore.
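One hedged way to picture the remote location indicator mode's gesture handling just described — each allowed touch gesture maps to a command that the personal computing device transmits and the central computing device then executes — is a simple dispatch table. The gesture and command names below are placeholders for illustration, not terms defined by the specification:

```python
def make_dispatcher():
    """Maps recognized touch gestures to the commands a personal computing
    device would transmit to the central computing device, and records the
    central computing device's execution of each command."""
    executed = []  # stand-in for actions taken on the central computing device

    handlers = {
        # information-push: the device sends a copy of the object upstream.
        "information_push": lambda obj: executed.append(("display_copy", obj)),
        # information-pull: the device requests a copy of a displayed object.
        "information_pull": lambda obj: executed.append(("send_copy", obj)),
    }

    def dispatch(gesture: str, info_object: str):
        handler = handlers.get(gesture)
        if handler is None:
            return None  # not an allowed gesture while this mode is enabled
        handler(info_object)
        return executed[-1]

    return dispatch


dispatch = make_dispatcher()
print(dispatch("information_push", "slide-3"))  # ('display_copy', 'slide-3')
print(dispatch("unknown_gesture", "slide-3"))   # None
```

The indirection matters: the personal computing device only names the gesture and the target object, while interpretation and execution remain centralized, which is consistent with the central computing device mediating every transfer in this section.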
-
FIGS. 8A-8C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing an audience member who is using a touch-enabled non-handheld computing device to remotely transfer (i.e., pull) a copy of an information object from the public display device to the non-handheld computing device. As exemplified in FIG. 8A, whenever another condition occurs where the audience member forms the remote location indicator posture 800 and remotely points 804 this posture at the public display device 810, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition and also identify that the audience member who is forming this posture is using the non-handheld computing device 812. The central computing device will then enable the remote location indicator mode and display the location indicator 806 on the information object 808 as described heretofore. The central computing device will also transmit a command to the non-handheld computing device 812 being used by the audience member instructing it to enable the remote location indicator mode. Upon receiving this command from the central computing device, the non-handheld computing device 812 can optionally display the touch gesture disclosure overlay 814 on its display screen 816. - As exemplified in
FIG. 8B, whenever the remote location indicator mode is enabled and the audience member remotely points 804 the remote location indicator posture 800 at a particular information object 808 that is displayed on the public display device 810, and they then perform the information-pull touch gesture 822 on the display screen 816 of their non-handheld computing device 812, the non-handheld computing device will transmit an information-pull command to the central computing device, where this command requests that a copy of the particular information object be transferred to the non-handheld computing device. Upon receiving this command from the non-handheld computing device 812, the central computing device will transmit a copy of the particular information object 808 to the non-handheld computing device. As exemplified in FIG. 8C, after the copy of the information object 808 has been received by the non-handheld computing device 812, a scaled-down version of the information object 824 will be displayed on the non-handheld computing device's display screen 816 at the approximate location where the information-pull touch gesture was performed. At this point, the audience member can stop forming the remote location indicator posture 800 and can locally interact with the copy of the information object on their non-handheld computing device 812 at their leisure. The context menu (not shown) can optionally also be displayed on the non-handheld computing device's display screen 816, where this menu informs the audience member of the various operations they can perform on the information object 824. The audience member can also locally manipulate the information object 824 in other ways including, but not limited to, performing conventional pinch touch gestures (not shown) to resize the information object. - 1.4.2 Transient Sharing with Pointing and Accelerometers
- This section describes exemplary embodiments of information sharing democratization techniques for allowing an audience member to briefly (i.e., transiently) show an information object to the rest of the meeting attendees (e.g., the equivalent of the audience member briefly holding up a piece of paper to the rest of the meeting attendees). This capability can be useful in various group meeting scenarios including, but not limited to, the following. Consider a scenario where a project team is holding a group meeting to discuss a project they are working on and a question or problem arises during the meeting. One of the audience members may use their touch-enabled portable personal computing device during the meeting to work on producing an answer or solution. Once they have produced an answer or solution, they can publicly share it with the rest of the meeting attendees.
-
FIGS. 9A and 9B illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing an audience member who is using a handheld computing device to transiently and publicly share the current contents of its display screen with all the other meeting attendees. As exemplified in FIG. 9A, whenever the audience member is “normally” using the handheld computing device 900 the contents of its display screen 902 are kept private to the audience member (i.e., these contents are not displayed on the public display device 904). As exemplified in FIG. 9B, whenever a condition occurs where the audience member forms a device sharing posture 906 with the handheld computing device 900 and remotely points this posture at the public display device 904, the following things will happen. The skeletal tracking application running on the central computing device will operate cooperatively with the audience-oriented object sensing device to identify this condition, a physical orientation sensing device included within the handheld computing device 900 (e.g., an accelerometer, or the combination of an accelerometer and a gyroscope, or the combination of an accelerometer and a gyroscope and a magnetometer) will sense that the handheld computing device is in a prescribed spatial orientation, both the central computing device and handheld computing device will enable a transient sharing mode, and the handheld computing device will transmit a copy of the current contents of its display screen 912 to the central computing device. - Generally speaking and referring again to
FIG. 9B, the device sharing posture 906 can be implemented using any type of posture that involves the audience member holding the handheld computing device 900, and is recognizable by the combination of the audience-oriented object sensing device and skeletal tracking application, and is differentiable from the other postures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIG. 9B the device sharing posture 906 is implemented as an arm 908 of the audience member being extended away from their body with the handheld computing device 900 being held in the hand 910 of the arm such that the handheld computing device is approximately perpendicular to the floor of the meeting space. As such, in this particular technique embodiment the prescribed spatial orientation is the handheld computing device being approximately perpendicular to the floor of the meeting space. The handheld computing device 900 can be held in either a portrait screen orientation (as exemplified in FIG. 9B) or landscape screen orientation (not shown). Alternate embodiments of the information sharing democratization technique described herein are possible where the screen orientation of the handheld computing device can be used for different purposes such as determining a default action or a particular mode of use. - Referring again to
FIG. 9B, after the copy of the current contents of the handheld computing device's display screen 912 has been received from the handheld computing device by the central computing device, the central computing device will display these contents in the form of an overlay 914 on the public display device 904. The central computing device and handheld computing device 900 will remain in the transient sharing mode, and the central computing device will continue to display the overlay 914 on the public display device 904, for as long as the audience member maintains the device sharing posture 906 and points it at the public display device. Whenever the central computing device and handheld computing device are in the transient sharing mode, the handheld computing device 900 will transmit any changes in the contents of its display screen 912 to the central computing device, and the central computing device will accordingly update the overlay 914 with these changes. Thus, the current contents of the handheld computing device's display screen 912 will be shared transiently with all of the meeting attendees for as long as the audience member continues to form the device sharing posture 906. - Referring again to
FIG. 9B, whenever the transient sharing mode is enabled and another condition occurs where the audience member either stops forming the device sharing posture 906 or points it away from the public display device 904 (e.g., whenever the audience member puts the handheld computing device down, among other things), the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, the central computing device and handheld computing device 900 will disable the transient sharing mode, and the central computing device will remove the overlay 914 from the public display device 904. During the time the overlay 914 is displayed on the public display device 904, a presenter (not shown) can make the overlay a permanent part of the public display device in the following manner. Whenever this presenter touches the overlay and touch-drags it in any direction along the public display device, the gesture identification application that is running on the central computing device will identify this touch-dragging activity, replace the overlay with a scaled-down version thereof, and move the location of this scaled-down version on the public display device based on this touch-dragging activity. - In another exemplary embodiment of the information sharing democratization technique described herein an audience member who is using a touch-enabled non-handheld computing device can transiently and publicly share the current contents of its display screen with all the other meeting attendees in the following manner. 
Whenever another condition occurs where the audience member forms the remote dragging posture and remotely points this posture at the public display device, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition and also identify that the audience member who is forming this posture is using the non-handheld computing device. The central computing device will then enable the transient sharing mode, and will also transmit a command to the non-handheld computing device being used by the audience member instructing it to enable the transient sharing mode. Whenever the transient sharing mode is enabled and the audience member touches the display screen of their non-handheld computing device, the non-handheld computing device will transmit a copy of the current contents of its display screen to the central computing device. After the copy of the current contents of the non-handheld computing device's display screen has been received from the non-handheld computing device by the central computing device, the central computing device will display these contents in the form of an overlay on the public display device.
- The central computing device and touch-enabled non-handheld computing device will remain in the transient sharing mode, and the central computing device will continue to display the overlay on the public display device, for as long as the audience member maintains the remote dragging posture and points it at the public display device. Whenever the transient sharing mode is enabled, the non-handheld computing device will transmit any changes in the contents of its display screen to the central computing device, and the central computing device will accordingly update the overlay with these changes. Thus, the current contents of the non-handheld computing device's display screen will be shared transiently with all of the meeting attendees for as long as the audience member continues to form the remote dragging posture and points it at the public display device. Whenever the transient sharing mode is enabled and another condition occurs where the audience member either stops forming the remote dragging posture or points it away from the public display device, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, the central computing device and non-handheld computing device will disable the transient sharing mode, and the central computing device will remove the overlay from the public display device. During the time the overlay is displayed on the public display device, a presenter can make the overlay a permanent part of the public display device in the manner just described.
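A minimal sketch of the transient sharing behavior described in this section, under stated assumptions: the handheld device's accelerometer reports gravity in m/s² along the device axes, the prescribed spatial orientation (approximately perpendicular to the floor) is tested from that gravity vector, and the public overlay tracks the screen contents only while the sharing posture is held and pointed at the public display device. The tolerance value and all names are illustrative, not from the specification:

```python
import math

TILT_TOLERANCE_DEG = 15.0  # assumed tolerance for "approximately perpendicular"


def perpendicular_to_floor(ax: float, ay: float, az: float) -> bool:
    """True when the accelerometer's gravity vector shows the device held
    roughly perpendicular to the floor: gravity then lies in the screen
    plane (x/y axes) and the screen-normal (z) component is near zero."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        return False
    tilt = math.degrees(math.asin(abs(az) / g))  # angle out of the screen plane
    return tilt <= TILT_TOLERANCE_DEG


class TransientSharing:
    """Mirrors the device's screen onto the public display only while the
    sharing posture is held, pointed at the display, and the device is in
    the prescribed spatial orientation."""

    def __init__(self):
        self.overlay = None  # contents currently overlaid on the public display

    def update(self, posture_held: bool, pointed_at_display: bool,
               accel: tuple, screen_contents: str) -> None:
        if posture_held and pointed_at_display and perpendicular_to_floor(*accel):
            self.overlay = screen_contents  # keep the overlay in sync
        else:
            self.overlay = None             # posture dropped: remove overlay


sharing = TransientSharing()
upright = (0.3, -9.7, 0.8)   # held vertically in portrait
flat = (0.1, 0.2, 9.8)       # lying on a table: gravity along the screen normal
sharing.update(True, True, upright, "worksheet v1")
print(sharing.overlay)  # worksheet v1
sharing.update(True, True, upright, "worksheet v2")  # screen changed
print(sharing.overlay)  # worksheet v2
sharing.update(True, True, flat, "worksheet v2")     # device put down
print(sharing.overlay)  # None
```

The same update loop, minus the orientation test, would also describe the non-handheld variant, where holding the remote dragging posture and touching the display screen drives the sharing state instead.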
- This section describes exemplary embodiments of information sharing democratization techniques for allowing the meeting attendees to transfer a copy of an information object from the personal computing device of one meeting attendee to the personal computing device of another meeting attendee. It will be appreciated that this information transfer can be made either from one audience member to another, or from an audience member to a presenter, or vice versa.
- In one embodiment of the information sharing democratization technique described herein a first audience member who is using a touch-enabled non-handheld computing device can remotely transfer (i.e., push) a copy of an information object from the non-handheld computing device to a personal computing device that is associated with a second audience member in the following manner. Whenever a condition occurs where the first audience member forms the remote location indicator posture and remotely points this posture at the second audience member, the skeletal tracking application running on the central computing device will operate cooperatively with the audience-oriented object sensing device to identify this condition, identify that the first audience member who is forming this posture is using the non-handheld computing device, identify the second audience member, and identify the personal computing device that is associated with the second audience member. This association between the personal computing device and second audience member can be established in various ways. By way of example but not limitation, the association can be programmed into a user profile for the second audience member on the central computing device. The association can also be established by sensing the physical location of each of the personal computing devices and each of the meeting attendees in the meeting space, and then associating a given meeting attendee with the personal computing device(s) that is within a prescribed distance (e.g., three feet) thereof. The central computing device will also enable a peer-to-peer transfer mode and will remain in this mode for as long as the first audience member maintains the remote location indicator posture and points it at the second audience member.
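The proximity-based association described above — each meeting attendee is linked to the personal computing device(s) sensed within a prescribed distance (e.g., three feet) of them — could be sketched as follows, with positions treated as sensed floor coordinates in feet; the function and variable names are assumptions:

```python
import math

ASSOCIATION_RADIUS_FEET = 3.0  # the prescribed distance from the description


def associate_devices(attendees, devices):
    """Associates each meeting attendee with the personal computing
    device(s) sensed within the prescribed distance of their position.
    Both inputs map names/ids to (x, y) floor coordinates in feet."""
    associations = {}
    for name, (px, py) in attendees.items():
        nearby = [dev for dev, (dx, dy) in devices.items()
                  if math.hypot(px - dx, py - dy) <= ASSOCIATION_RADIUS_FEET]
        associations[name] = nearby
    return associations


attendees = {"Ann": (0.0, 0.0), "Bo": (10.0, 0.0)}
devices = {"tablet-1": (1.0, 1.0), "phone-7": (9.5, 0.5)}
print(associate_devices(attendees, devices))
# {'Ann': ['tablet-1'], 'Bo': ['phone-7']}
```

As the text notes, this sensed association is one option; the alternative is a static association programmed into each attendee's user profile on the central computing device, which would replace the distance test with a simple lookup.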
- Whenever the peer-to-peer transfer mode is enabled, the first audience member can perform the information-push touch gesture on a particular information object that is displayed on the display screen of their touch-enabled non-handheld computing device, which will cause the non-handheld computing device to transmit a copy of the information object to the central computing device. Upon receiving the copy of the information object from the non-handheld computing device, the central computing device will forward it to the personal computing device that is associated with the second audience member. After the copy of the information object has been received by this personal computing device, it can display a package icon on its display screen which serves as a surrogate for the information object. The package icon can optionally include a text label which specifies that the information object was received from the first audience member. The second audience member can then tap on the package icon at their leisure to display the information object represented by the icon. Whenever the peer-to-peer transfer mode is enabled and another condition occurs where the first audience member either stops forming the remote location indicator posture or points it away from the second audience member, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the peer-to-peer transfer mode.
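The peer-to-peer forwarding path just described — the pushed object travels from the sender's device to the central computing device, which relays it to the device associated with the targeted audience member and labels the resulting package icon with the sender — might look like this minimal sketch; class and field names are illustrative:

```python
class CentralRelay:
    """Sketch of the central computing device forwarding a pushed
    information object to the personal computing device associated
    with the targeted audience member."""

    def __init__(self, associations):
        # audience member name -> associated personal computing device id
        self._associations = associations
        self.inboxes = {dev: [] for dev in associations.values()}

    def receive_push(self, sender: str, target: str, info_object: str):
        device = self._associations.get(target)
        if device is None:
            return None  # no personal computing device associated with target
        # The receiving device's package icon labels who sent the object.
        self.inboxes[device].append({"from": sender, "object": info_object})
        return device


relay = CentralRelay({"second-member": "tablet-2"})
delivered_to = relay.receive_push("first-member", "second-member", "sketch.pdf")
print(delivered_to)                           # tablet-2
print(relay.inboxes["tablet-2"][0]["from"])   # first-member
```

Note that the sender's device never addresses the recipient's device directly; the remote location indicator posture selects the target person, and the central computing device resolves that person to a device and completes the delivery.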
- In another embodiment of the information sharing democratization technique described herein a third audience member who is using a touch-enabled handheld computing device can remotely transfer (i.e., push) a copy of an information object from the handheld computing device to the personal computing device that is associated with the second audience member in the following manner. Whenever another condition occurs where the third audience member forms the device pointing posture with the handheld computing device and remotely points this posture at the second audience member, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, identify that the third audience member who is forming this posture is using the handheld computing device, identify the second audience member, and identify the personal computing device that is associated with the second audience member. The central computing device will also enable the peer-to-peer transfer mode and will remain in this mode for as long as the third audience member maintains the device pointing posture and points it at the second audience member.
- Whenever the peer-to-peer transfer mode is enabled, the third audience member can perform the information-push touch gesture on a particular information object that is displayed on the display screen of the handheld computing device, which will cause the handheld computing device to transmit a copy of the information object to the central computing device. Upon receiving the copy of the information object from the handheld computing device, the central computing device will forward it to the personal computing device that is associated with the second audience member. After the copy of the information object has been received by this personal computing device, it will display the package icon on its display screen which serves as a surrogate for the information object. The second audience member can then tap on the package icon at their leisure to display the information object represented by the icon. Whenever the peer-to-peer transfer mode is enabled and another condition occurs where the third audience member either stops forming the device pointing posture or points it away from the second audience member, the skeletal tracking application will operate cooperatively with the audience-oriented object sensing device to identify this condition, and the central computing device will disable the peer-to-peer transfer mode. It is noted that the information-push touch gesture can be performed either uni-manually (e.g., with the same hand that is holding the handheld computing device) or bi-manually (e.g., with a different hand than the one that is holding the handheld computing device).
- In addition to the two different information sharing democratization technique embodiments for allowing one audience member to transfer a copy of an information object to another audience member that were just described, an alternate embodiment of the information sharing democratization technique described herein is also possible where the information transfer can take place between an audience member and a presenter. In this case the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to perform the various identification operations just described. Another alternate embodiment of the information sharing democratization technique is also possible where the information transfer can take place between two different presenters. In this case the skeletal tracking application will operate cooperatively with just the presenter-oriented object sensing device to perform the various identification operations just described. In a situation where the two meeting attendees involved in the information transfer are directly next to one another, it will be appreciated that it may be infeasible for one meeting attendee to point at the other meeting attendee. In this situation conventional methods which are intended to support very short-range sharing of information objects can be used to allow one meeting attendee to transfer a copy of an information object to the other meeting attendee. Examples of such conventional methods include the Bump application for smartphones, the stitching method which uses pen gestures that span a plurality of displays, and the pick-and-drop method, among others.
- 1.4.4 Transferring Information Objects from Public Display Device to Audience
- This section describes exemplary embodiments of information sharing democratization techniques for allowing a presenter to transfer a copy of an information object that is displayed on the public display device to one or more audience members.
- In one embodiment of the information sharing democratization technique described herein a presenter can remotely transfer (i.e., push) a copy of an information object that is displayed on the public display device to a personal computing device that is associated with a particular audience member in the following manner. Whenever a condition occurs where the presenter forms the remote location indicator posture using one of their hands and remotely points this posture at the particular audience member, the skeletal tracking application running on the central computing device will operate cooperatively with both the audience-oriented object sensing device and presenter-oriented object sensing device to identify this condition, identify the particular audience member, and identify the personal computing device that is associated with the audience member, where this association has been previously programmed into a user profile for the audience member on the central computing device. The central computing device will also enable a presenter-to-audience-member transfer mode and will remain in this mode for as long as the presenter maintains the remote location indicator posture and points it at the audience member. Whenever the presenter-to-audience-member transfer mode is enabled and another condition occurs where the presenter either stops forming the remote location indicator posture or points it away from the audience member, the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, and the central computing device will disable the presenter-to-audience-member transfer mode.
- Whenever the presenter-to-audience-member transfer mode is enabled and another condition occurs where the presenter maintains the remote location indicator posture which is being pointed at the particular audience member for a prescribed period of time (e.g., two seconds), and during this period of time the presenter's other hand is within a prescribed distance (e.g., two meters) of the public display device's display screen, the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, and the central computing device will display a first touch gesture disclosure overlay around the presenter on the public display device, where this overlay includes one or more different graphical icons each of which discloses a different rectilinear, mark-based touch gesture that is allowed to be performed by the presenter on the public display device's display screen. This overlay can also include text that is displayed alongside each graphical icon, where this text discloses a particular action or command that is associated with each touch gesture. Whenever the presenter-to-audience-member transfer mode is enabled, the presenter can perform the information-push touch gesture on an information object that is displayed on the display screen of the public display device, which will cause the central computing device to transmit a copy of the information object to the personal computing device that is associated with the audience member.
- In another embodiment of the information sharing democratization technique described herein the presenter can remotely transfer (i.e., push) a copy of an information object that is displayed on the public display device to the personal computing devices that are associated with all of the audience members in the following manner. Whenever the presenter-to-audience-member transfer mode is disabled and another condition occurs where the presenter forms the remote location indicator posture using one of their hands and remotely points this posture away from the public display device and at the floor of the meeting space, the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, identify the audience members, and identify the personal computing device that is associated with each of the audience members. The central computing device will also enable a presenter-to-entire-audience transfer mode and will remain in this mode for as long as the presenter maintains the remote location indicator posture and points it away from the public display device and at the floor. Whenever the presenter-to-entire-audience transfer mode is enabled and another condition occurs where the presenter either stops forming the remote location indicator posture or stops pointing it away from the public display device and at the floor, the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, and the central computing device will disable the presenter-to-entire-audience transfer mode.
- Whenever the presenter-to-entire-audience transfer mode is enabled and another condition occurs where the presenter maintains the remote location indicator posture which is being pointed away from the public display device and at the floor for a prescribed period of time (e.g., two seconds), and during this period of time the presenter's other hand is within the prescribed distance of the public display device's display screen, the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, and the central computing device will display a second touch gesture disclosure overlay around the presenter on the public display device, where this overlay includes the aforementioned one or more different graphical icons and a visual cue indicating that any information object that is transferred from the public display device will be broadcast to all of the audience members. This overlay can also include the aforementioned text that is displayed alongside each graphical icon. Whenever the presenter-to-entire-audience transfer mode is enabled the presenter can perform the information-push touch gesture on an information object that is displayed on the public display device's display screen, which will cause the central computing device to transmit a copy of the information object to the personal computing device that is associated with each of the audience members.
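The presenter-to-audience-member and presenter-to-entire-audience transfer modes described above behave as a small state machine driven by the tracked posture and its pointing target. The following Python sketch is purely illustrative (the patent specifies behavior, not code); the class, method, and posture names are all assumptions:

```python
class TransferModeTracker:
    """Illustrative tracker for the two presenter transfer modes."""

    def __init__(self):
        self.mode = None    # None, "to-audience-member", or "to-entire-audience"
        self.target = None  # an audience member id, or "all"

    def update(self, posture, pointing_at):
        """Called with each skeletal-tracking result.

        posture: the detected hand posture name.
        pointing_at: "audience:<id>", "floor", or something else.
        """
        if posture != "remote-location-indicator":
            self.mode, self.target = None, None
        elif pointing_at == "floor":
            # Pointing away from the display and at the floor enables
            # broadcast to the entire audience.
            self.mode, self.target = "to-entire-audience", "all"
        elif isinstance(pointing_at, str) and pointing_at.startswith("audience:"):
            self.mode = "to-audience-member"
            self.target = pointing_at.split(":", 1)[1]
        else:
            self.mode, self.target = None, None

    def push_targets(self, audience_devices):
        """Device ids that would receive a copy of a pushed information object."""
        if self.mode == "to-entire-audience":
            return list(audience_devices.values())
        if self.mode == "to-audience-member":
            return [audience_devices[self.target]]
        return []
```

As in the text, dropping the posture or pointing it elsewhere disables whichever transfer mode was active.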
- 1.5 Enriching Use of Public Display Device with Skeletal Tracking
- This section describes exemplary embodiments of information sharing democratization techniques which enrich a presenter's experience of using the public display device.
- In an exemplary embodiment of the information sharing democratization technique described herein the skeletal tracking application running on the central computing device can operate cooperatively with the presenter-oriented and audience-oriented object sensing devices to identify how many presenters and how many audience members are in the meeting space at any given point in time. As will now be described in more detail, the central computing device can use this information to detect various social contexts and then assign a prescribed mode of operation based on the particular social context that is detected at a given point in time. Exemplary modes of operation include, but are not limited to, the following.
- Whenever a condition occurs where there are no meeting attendees in the meeting space (i.e., there are no presenters and no audience members), the skeletal tracking application running on the central computing device will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, the central computing device will enable an ambient display mode, and the central computing device will display one or more default information objects on the public display device. Exemplary default information objects include a calendar for the group meetings that are scheduled in the meeting space for the current day or week, and a list of current software bug counts for each software development team member, among others. The central computing device will remain in the ambient display mode for as long as there are no meeting attendees in the meeting space. Whenever the ambient display mode is enabled and a meeting attendee enters the meeting space, the meeting attendee can transfer (i.e., pull) a copy of a default information object from the public display device to their touch-enabled portable personal computing device in the various manners described heretofore.
- Whenever the ambient display mode is enabled and another condition occurs where one or more meeting attendees enter and remain in the meeting space for a prescribed period of time (e.g., two minutes), the skeletal tracking application will operate cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify this condition, the central computing device will disable the ambient display mode, and the central computing device will remove the default information objects from the public display device. Whenever the ambient display mode is enabled and one of the one or more meeting attendees either locally performs an allowed touch gesture on the display screen of the public display device, or remotely performs an allowed in-air gesture on this display screen, the central computing device will also disable the ambient display mode and remove the default information objects from the public display device.
- Whenever another condition occurs where there is just one presenter and they are facing toward the audience (i.e., the presenter is facing away from the public display device), the skeletal tracking application will operate cooperatively with the presenter-oriented object sensing device to identify this condition, the central computing device will enable a single-speaking-presenter mode, and the central computing device will hide all user interface elements that specifically support the presenter that are currently being displayed on the public display device (such as the posture palettes described hereafter, among others). This is advantageous since the presenter is not looking at these user interface elements and they can obscure the audience's view of other information objects being displayed on the public display device. The central computing device will remain in the single-speaking-presenter mode for as long as there is just one presenter and they are facing toward the audience.
- Whenever another condition occurs where there is just one presenter and they are facing toward the public display device enough to see its contents, the skeletal tracking application will operate cooperatively with the presenter-oriented object sensing device to identify this condition, the central computing device will enable a single-working-presenter mode, and the central computing device will display one or more of the aforementioned user interface elements that support the presenter. The central computing device will remain in the single-working-presenter mode for as long as there is just one presenter and they are facing toward the public display device.
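The social-context-to-mode assignment described above (together with the multi-working presenters mode described below) amounts to a simple dispatch on the detected attendee counts and presenter orientation. This Python sketch is illustrative only; the mode names follow the text, while the function name and signature are hypothetical:

```python
def select_mode(num_presenters, num_audience, presenter_facing=None):
    """Map a detected social context to a display mode.

    presenter_facing: "audience" or "display"; only meaningful when
    exactly one presenter is present.
    """
    if num_presenters == 0 and num_audience == 0:
        return "ambient-display"            # show default information objects
    if num_presenters == 0:
        return "audience-only"              # auto-display pushed objects
    if num_presenters == 1:
        if presenter_facing == "audience":
            return "single-speaking-presenter"  # hide presenter UI elements
        return "single-working-presenter"       # show presenter UI elements
    return "multi-working-presenters"           # one screen region per presenter
```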
-
FIG. 10 illustrates an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing two or more presenters to locally, independently and concurrently manipulate the information that is displayed on the public display device. As exemplified in FIG. 10, whenever another condition occurs where there are two or more presenters that are facing toward the public display device 1004 enough to see its contents, the skeletal tracking application will operate cooperatively with the presenter-oriented object sensing device to identify this condition, the central computing device will enable a multi-working presenters mode, and the central computing device will divide the display screen of the public display device 1004 into a number of different regions corresponding to how many presenters are facing toward the public display device, and will display a graphical element 1010 between adjacent regions to visually indicate that the public display device 1004 is operating in the multi-working presenters mode. - Referring again to
FIG. 10, each of the different regions that is displayed on the public display device 1004 provides the presenter that is facing toward the region the ability to independently manipulate (via touch gestures on the region) the underlying information content that was being displayed on the public display device at the time the central computing device enabled the multi-working presenters mode. As such, whenever the multi-working presenters mode is enabled a given one of the two or more presenters (e.g., presenter 1000) can perform touch gestures on the region 1008 that is in front of them to independently pan the region across the underlying information content, or open an information object that is displayed in the region, among other things. It will be appreciated that the multi-working presenters mode is advantageous since it eliminates the contention issues that can arise whenever two or more presenters want to manipulate the public display device at the same time. Whenever the multi-working presenters mode is enabled and another condition occurs where there are no longer two or more presenters that are facing toward the public display device 1004 enough to see its contents, the skeletal tracking application will operate cooperatively with the presenter-oriented object sensing device to identify this condition, the central computing device will disable the multi-working presenters mode, and the central computing device will remove the different regions and the graphical element 1010 from the public display device. - Whenever another condition occurs where there are no presenters and there are one or more audience members, the skeletal tracking application will operate cooperatively with the audience-oriented and presenter-oriented object sensing devices to identify this condition, and the central computing device will enter an audience-only mode which operates as follows.
Whenever the audience-only mode is enabled, and the package metaphor mode is also enabled, and the central computing device displays a package icon on the public display device (which indicates that an audience member remotely transferred (i.e., pushed) copies of one or more information objects from their portable personal computing device to the public display device as described heretofore), the central computing device will automatically display the information objects that were transferred on the public display device. This is advantageous since there is currently no presenter at the public display device who can hover their hand over or tap on the package icon as described heretofore. Whenever the audience-only mode is enabled and a presenter arrives at the public display device, the skeletal tracking application will operate cooperatively with the audience-oriented and presenter-oriented object sensing devices to identify this condition, and the central computing device will disable the audience-only mode.
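Returning to the multi-working presenters mode of FIG. 10, the division of the display screen into one region per presenter can be sketched as follows. The equal-width split and the ordering of presenters by horizontal position are illustrative assumptions; the patent does not prescribe a partitioning algorithm:

```python
def split_regions(screen_width, presenter_xs):
    """Return the (left, right) bounds of one equal-width region per
    presenter, so that each presenter's region is the one in front of them.

    presenter_xs: each presenter's horizontal position along the display.
    """
    n = len(presenter_xs)
    if n == 0:
        return []
    width = screen_width / n
    # The leftmost presenter gets the leftmost region, and so on.
    order = sorted(range(n), key=lambda i: presenter_xs[i])
    regions = [None] * n
    for slot, idx in enumerate(order):
        regions[idx] = (slot * width, (slot + 1) * width)
    return regions
```

A graphical element such as element 1010 would then be drawn at each boundary between adjacent regions.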
-
FIGS. 11A-11C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a presenter to transfer an information object from the public display device to a tertiary display region which is optically projected via an optical projection device onto a prescribed location on a tertiary surface in the meeting space that is near the public display device as described heretofore. As will be appreciated from the more detailed description that follows, this technique embodiment is advantageous since it allows the size of the public display device to be virtually expanded in an easy, low cost and flexible manner. - As exemplified in
FIG. 11A, whenever a condition occurs where a presenter forms the remote location indicator posture 1102 and points 1106 this posture at the tertiary display region 1110, the skeletal tracking application running on the central computing device will operate cooperatively with the presenter-oriented object sensing device to identify this condition, and the central computing device will enable a tertiary display mode. The tertiary display application running on the central computing device will then cause the optical projection device in the meeting space to optically project the location indicator 1108 within the tertiary display region 1110, where the position of the location indicator within the tertiary display region corresponds to where the remote location indicator posture 1102 is currently being pointed 1106. The central computing device will remain in the tertiary display mode and continue to cause the optical projection device to optically project the location indicator 1108 within the tertiary display region 1110 for as long as the presenter maintains the remote location indicator posture 1102 and points 1106 it at the tertiary display region. Whenever the tertiary display mode is enabled and another condition occurs where the presenter either stops forming the remote location indicator posture 1102 or points it away from the tertiary display region 1110, the skeletal tracking application will operate cooperatively with the presenter-oriented object sensing device to identify this condition, and the central computing device will disable the tertiary display mode and cause the optical projection device to remove the location indicator from the tertiary display region. - As exemplified in
FIGS. 11B and 11C, whenever the tertiary display mode is enabled and the presenter performs the information-push touch gesture 1116 on an information object 1118 that is displayed on the display screen of the public display device 1100, the tertiary display application running on the central computing device will cause the optical projection device to optically project the information object 1120 within the tertiary display region 1110 and behind the location indicator 1108, and the central computing device will remove the information object 1118 from the public display device 1100. Whenever the presenter subsequently performs the information-pull touch gesture (not shown) on the display screen 1122 of the public display device 1100, the tertiary display application will cause the projection device to remove the information object from the display region, and the central computing device will re-display the information object on the public display device. - 1.5.3 Adding Content to Public Display Device with Posture Palette
-
FIGS. 12A-12C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a presenter to add information content to the public display device by using a posture palette. As exemplified in FIGS. 12A and 12B, the term “posture palette” is used herein to refer to a palette of graphical icons 1206 and an optional region of supporting text 1208 which are displayed on a touch-enabled public display device 1200 whenever the presenter forms a palette generating posture 1202 and hovers this posture over the display screen 1204 of the public display device for a prescribed period of time (e.g., two seconds). As will now be described in more detail, the posture palette 1206/1208 can be used by the presenter to add various types of information content to the display screen of the public display device. As will be appreciated from the more detailed description that follows, this technique embodiment is advantageous since it eliminates having to use a toolbar on the public display device, which can result in the presenter having to walk in front of the public display device to reach distant commands. - As is exemplified in
FIG. 12A, whenever a condition occurs where a presenter forms the palette generating posture 1202 and hovers this posture over the display screen 1204 of the public display device 1200 for the prescribed period of time, the skeletal tracking application running on the central computing device will operate cooperatively with the presenter-oriented object sensing device to identify this condition, and the central computing device will enable a palette mode and will also display the posture palette 1206/1208 on the display screen of the public display device in a prescribed position thereon that is adjacent to the current location of this posture. In order to visually indicate to all the meeting attendees that the palette mode is active, the central computing device can optionally also display a colored (e.g., green) overlay (not shown) on the public display device 1200 in an area immediately surrounding the presenter, and can also optionally display a colored (e.g., red) shadow 1224 behind the palette generating posture 1202. The shadow 1224 can be implemented in various ways. In an exemplary embodiment of the information sharing democratization technique described herein the shadow 1224 is implemented as a semi-transparent overlay. - Generally speaking and referring again to
FIG. 12A, the palette generating posture 1202 can be implemented using any type of posture that is recognizable by the combination of the presenter-oriented object sensing device and skeletal tracking application, and is differentiable from the other postures described herein. By way of example but not limitation, in the particular technique embodiment exemplified in FIG. 12A the palette generating posture 1202 is implemented as a hand 1210 of the presenter having its palm open and all of its fingers 1212 spread apart. In the implementation of the information sharing democratization technique embodiment that is exemplified in FIG. 12A, the prescribed position is to the right of the palette generating posture 1202. It will be appreciated that other implementations (not shown) of this technique embodiment are also possible where the prescribed position is either above or to the left of the palette generating posture, among other places. - Referring again to
FIG. 12A, the central computing device will remain in the palette mode and continue to display the posture palette 1206/1208 on the display screen 1204 of the public display device 1200 for as long as the presenter maintains the palette generating posture 1202 and hovers it over the display screen. Whenever the palette mode is enabled, the skeletal tracking application will track the position of the presenter and their palette generating posture 1202, and whenever the presenter or this posture moves, the skeletal tracking application will correspondingly move the posture palette 1206/1208, the colored overlay immediately surrounding the presenter, and the colored shadow 1224 behind the palette generating posture 1202 on the public display device. This allows the presenter to reposition the posture palette 1206/1208 on the public display device 1200 as needed without having to touch the public display device (i.e., the presenter can move the posture palette closer to their other hand 1214 when needed and then move the posture palette away from their other hand when needed). - As exemplified in
FIG. 12B, the presenter can use the posture palette 1206/1208 to add various items of information content to the public display device 1200 in the following manner. Each graphical icon in the posture palette 1206/1208 represents a different item of information content that the presenter may choose to add to the display screen 1204 of the public display device 1200. Whenever the palette mode is enabled and the presenter touches a particular graphical icon 1216 in the posture palette 1206 and then touch-drags this icon out of the posture palette and onto the display screen 1204 by performing a touch-dragging movement 1218 on the display screen, the central computing device will track the touch-dragging movement and display the particular information object 1220 that is associated with the particular graphical icon on the display screen behind the current location of the touch-dragging movement. As exemplified in FIG. 12C, whenever the palette mode is enabled and another condition occurs where the presenter either stops forming the palette generating posture or stops hovering it over the display screen 1204 of the public display device 1200, the skeletal tracking application will operate cooperatively with the presenter-oriented object sensing device to identify this condition, and the central computing device will disable the palette mode and remove the posture palette from the display screen. Whenever the presenter stops performing the touch-dragging movement 1218 and removes their touch from the display screen 1204, the information object 1220 will continue to be displayed on the display screen.
- It is noted that rather than the presenter bi-manually using the posture palette to add information content to the public display device as just described, an alternate embodiment of the information sharing democratization technique described herein is also possible where the presenter can uni-manually use the posture palette (i.e., with the same hand that is forming the palette generating posture). More particularly, after forming the palette generating posture using one hand and hovering this posture over the display screen as described heretofore, the presenter can then tap on the display screen using this same hand, which will cause the posture palette to be pinned in a fixed position on the display screen where the tap occurred. Once the posture palette is pinned, the presenter can then use this same hand to touch a particular graphical icon in the posture palette and touch-drag this icon out of the posture palette and onto the display screen as just described.
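The palette-mode behavior described in this section (hover to summon, follow the posture as it moves, tap to pin for uni-manual use) can be sketched as a small tracker. This is purely illustrative; the posture name, method names, and the exact point at which the hover timer resets are all assumptions:

```python
HOVER_SECONDS = 2.0  # the prescribed hover period from the text

class PosturePalette:
    """Illustrative palette-mode tracker: hover to summon, tap to pin."""

    def __init__(self):
        self.visible = False
        self.pinned = False
        self.position = None
        self._hover_start = None

    def update(self, posture, position, now):
        """Called with each tracked posture sample; now is in seconds."""
        if posture != "open-palm-fingers-spread":
            # Posture dropped: the palette disappears unless it was pinned.
            self._hover_start = None
            if not self.pinned:
                self.visible = False
            return
        if self._hover_start is None:
            self._hover_start = now
        if now - self._hover_start >= HOVER_SECONDS:
            self.visible = True
        if self.visible and not self.pinned:
            self.position = position  # the palette follows the posture

    def tap(self):
        """A tap by the posture hand pins the palette for uni-manual use."""
        if self.visible:
            self.pinned = True
```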
-
FIG. 13 illustrates an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a presenter to control an explicit touch gesture mode on the public display device by using a prescribed posture. As exemplified in FIG. 13, whenever a condition occurs where a presenter forms the palette generating posture 1302 and hovers this posture over the display screen 1304 of the public display device 1300 for the prescribed period of time and the central computing device enables the palette mode as described in the immediately preceding section, and whenever the presenter then touches 1308 a region of the display screen where nothing is currently being displayed, the following things will happen. The central computing device will remove the posture palette (not shown) from the public display device 1300 (after which the presenter can stop forming the palette generating posture 1302 or stop hovering it over the display screen 1304 of the public display device 1300). The central computing device will also disable the palette mode whenever the presenter stops touching the display screen 1304. - Referring again to
FIG. 13, whenever the presenter touches 1308 the region of the display screen 1304 where nothing is currently being displayed and holds this touch without moving it on the display screen for a prescribed period of time (e.g., two seconds), the central computing device will display a conventional hierarchic marking menu 1310 which serves to guide the presenter through the various touch gestures that are allowed to be used on the display screen. Additionally, in the case where the presenter is already familiar with the various touch gestures that are allowed and wants to manipulate the information content that is currently displayed on the public display device 1300, rather than holding their touch 1308 on the display screen 1304 they can immediately continue to execute one of the allowed touch gestures on the display screen. - It is noted that rather than the presenter bi-manually initiating the palette mode and executing an allowed touch gesture on the display screen of the public display device as just described, an alternate embodiment of the information sharing democratization technique described herein is also possible where the presenter can uni-manually (i.e., with just a single hand) initiate the palette mode and execute an allowed touch gesture. More particularly, after forming the palette generating posture using one hand and hovering this posture over the display screen as described heretofore, the presenter can then use one or more fingers of the same hand to touch a region of the display screen where nothing is currently being displayed to either display the hierarchic marking menu or execute one of the allowed touch gestures on the display screen.
- This section describes an exemplary embodiment of an information sharing democratization technique which allows meeting attendees to organize (e.g., sort, or categorize, or compare, among other activities) a set of information objects. As will be appreciated from the more detailed description that follows, this technique embodiment is advantageous in various group meeting scenarios such as the aforementioned project team group meetings which often involve assigning priorities to tasks, scheduling tasks, partitioning tasks among workers, and the like.
-
FIG. 14 illustrates exemplary embodiments, in simplified form, of workflow templates which can be displayed on the public display device and used by the meeting attendees to organize a set of information objects which are also displayed on this device. As exemplified in FIG. 14, the workflow templates can include a three buckets template 1400, a peripheral buckets template 1402, a Venn diagram template 1404, a planning grid template 1406, a pipeline template 1408, and a parallel pipeline template 1410, among others. Each of these templates includes a plurality of visual buckets (e.g., 1412, 1414, 1416, 1418, 1420, 1422 and 1424) which are arranged in a different prescribed structure, where each bucket serves as a container into which a given information object can be placed using a touch-dragging gesture on the display screen of the public display device. -
FIGS. 15A-15C illustrate an exemplary embodiment, in simplified form, of an information sharing democratization technique for allowing a presenter to use the three buckets template to organize a set of information objects which are displayed on the public display device. More particularly, FIG. 15A illustrates a situation where the three buckets template 1500 is displayed on the public display device (not shown) and the presenter has already moved a plurality of information objects into each of the three buckets (i.e., the presenter has already moved information objects 1508-1510 into bucket 1504, information objects 1511 and 1512 into bucket 1505, and information objects 1513-1515 into bucket 1502). As exemplified in FIGS. 15A-15C, whenever the presenter touches 1518 an information object 1516 that is displayed on the public display device but is not currently in a bucket, and the presenter then touch-drags 1520 this information object into a desired bucket 1506, the gesture identification application that is running on the central computing device will identify this touch-dragging activity, and a screen layout application that is also running on the central computing device will automatically resize this information object and any information objects that were previously in the desired bucket 1506 so that they all fit within the bucket. - It will be appreciated that the presenter can also use touch-dragging gestures on the display screen of the public display device to move a given information object from one bucket to another, or move a given information object out of a bucket it is currently in and back onto the display screen workspace (at which point the information object will return to its original size). The presenter can also use touch gestures to zoom in on a specific bucket to examine its contents in greater detail. 
It will also be appreciated that in addition to the presenter being the one who performs the organization and other information object manipulation activities described in this section, an audience member can also remotely perform these organization and manipulation activities.
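The bucket behavior described above (resize on drop, restore the original size on removal) can be sketched as follows. The uniform one-row layout is an assumption made for illustration only, since the patent does not specify how the screen layout application resizes objects to fit:

```python
class Bucket:
    """Illustrative container from the workflow templates of FIG. 14."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        self.contents = []  # (object id, original size) pairs

    def add(self, obj_id, original_size):
        """Drop an object into the bucket; it will be shown resized."""
        self.contents.append((obj_id, original_size))

    def remove(self, obj_id):
        """Drag an object back out; returns its original size, which the
        object reverts to on the display screen workspace."""
        for i, (oid, size) in enumerate(self.contents):
            if oid == obj_id:
                del self.contents[i]
                return size
        return None

    def thumbnail_size(self):
        """Uniform size at which every object in the bucket is displayed,
        here simply one equal share of a single row."""
        n = max(len(self.contents), 1)
        return (self.width / n, self.height)
```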
-
FIG. 16 illustrates one embodiment, in simplified form, of a process for democratizing information sharing during a co-located group meeting in a meeting space. As exemplified in FIG. 16, the process starts in block 1600 with a condition occurring where an audience member forms the remote location indicator posture and remotely points it at the public display device. The central computing device then operates cooperatively with the audience-oriented object sensing device to identify this condition (block 1602). The central computing device then enables the remote location indicator mode for as long as this condition continues (block 1604), and displays the location indicator on the public display device in a position thereon corresponding to where the remote location indicator posture is currently being pointed. -
FIG. 17 illustrates another embodiment, in simplified form, of a process for democratizing information sharing during a co-located group meeting in a meeting space. As exemplified in FIG. 17, the process starts in block 1700 with a condition occurring where a presenter forms the remote location indicator posture and remotely points it at an audience member. The central computing device then operates cooperatively with both the audience-oriented object sensing device and the presenter-oriented object sensing device to identify this condition, identify the audience member, and identify a personal computing device that is associated with the audience member (block 1702). The central computing device then enables the presenter-to-audience-member transfer mode for as long as this condition continues (block 1704). Then, whenever the presenter-to-audience-member transfer mode is enabled and the presenter performs the information-push touch gesture on an information object that is displayed on the display screen of the public display device, the central computing device will transmit a copy of the information object to the personal computing device that is associated with the audience member (block 1706). -
FIGS. 18A and 18B illustrate yet another embodiment, in simplified form, of a process for democratizing information sharing during a co-located group meeting in a meeting space. As exemplified in FIG. 18A, the process starts in block 1800 with a first condition occurring where a first meeting attendee who is using a touch-enabled non-handheld computing device forms the remote location indicator posture and remotely points it at a second meeting attendee. The central computing device then operates cooperatively with an object sensing device in the meeting space to identify this first condition, identify the second meeting attendee, and identify the personal computing device that is associated with the second meeting attendee (block 1802). The central computing device then enables the peer-to-peer transfer mode for as long as this first condition continues (block 1804). Then, whenever the peer-to-peer transfer mode is enabled and the first meeting attendee performs the information-push touch gesture on a first information object that is displayed on the display screen of the non-handheld computing device, the central computing device will receive a copy of the first information object from the non-handheld computing device and forward this copy to the personal computing device (block 1806). - As exemplified in
FIG. 18B, whenever a second condition occurs where the peer-to-peer transfer mode is disabled and a third meeting attendee who is using a touch-enabled handheld computing device forms the device pointing posture and remotely points it at the second meeting attendee (block 1808), the central computing device will operate cooperatively with the object sensing device to identify this second condition, identify the second meeting attendee, and identify the personal computing device that is associated with the second meeting attendee (block 1810). The central computing device then enables the peer-to-peer transfer mode for as long as this second condition continues (block 1812). Then, whenever the peer-to-peer transfer mode is enabled and the third meeting attendee performs the information-push touch gesture on a second information object that is displayed on the display screen of the handheld computing device, the central computing device will receive a copy of the second information object from the handheld computing device and forward this copy to the personal computing device (block 1814). - While the information sharing democratization technique has been described by specific reference to embodiments thereof, it is understood that variations and modifications thereof can be made without departing from the true spirit and scope of the information sharing democratization technique. By way of example but not limitation, in addition to the various touch gesture disclosure overlays described heretofore, an alternate embodiment of the information sharing democratization technique described herein is also possible where an online learning system (such as the conventional GestureBar user interface for learning gestural interactions, among others) can be implemented to disclose the various touch gestures and in-air gestures that are allowed to be performed.
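The peer-to-peer relay of FIGS. 18A and 18B, in which the central computing device receives a copy of the pushed information object and forwards it to the target attendee's personal computing device, can be sketched as follows. Names are hypothetical, and the network transport is abstracted into a dictionary of per-device inboxes:

```python
def relay_push(transfer_mode_enabled, info_object, target_device, inboxes):
    """Forward a copy of info_object to target_device's inbox via the hub.

    inboxes: dict mapping device ids to lists of delivered objects, a
    stand-in for the actual network transport.
    """
    if not transfer_mode_enabled:
        return False
    # The central computing device receives a copy from the source device
    # and forwards that copy; the original stays on the source's display.
    inboxes.setdefault(target_device, []).append(dict(info_object))
    return True
```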
A given presenter or audience member who initiates one of the information sharing operations described herein can also be provided with haptic feedback when the sharing operation is successfully completed, where the nature of this feedback is adapted to the characteristics of the particular device the presenter or audience member initiated the sharing operation from. The object sensing devices can also be implemented in ways other than those described heretofore. By way of example but not limitation, the object sensing devices can be implemented using the “Peanut” ultra-low-power, short-range wireless radio technology configured to operate in a triangulation mode. This particular implementation is advantageous in that the radio signals generally won't pass through walls or other obstructions, so that the space being sensed by the object sensing devices will correspond to the physical dimensions of the meeting space.
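A triangulation-mode implementation of the object sensing devices would locate an attendee from measured distances to several fixed radio anchors. The following 2-D trilateration sketch illustrates that general technique only; it is not drawn from any specific “Peanut” radio API:

```python
def trilaterate(p1, p2, p3, r1, r2, r3):
    """Locate a point in 2-D from its distances r1..r3 to three fixed
    anchors p1..p3 (each an (x, y) pair)."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Subtracting the three circle equations pairwise leaves two linear
    # equations a*x + b*y = c in the unknown position.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("anchors must not be collinear")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

With noisy radio ranging, a practical system would use more than three anchors and a least-squares fit rather than this exact solution.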
- Another alternate embodiment of the information sharing democratization technique described herein is also possible where a large meeting space includes a plurality of public display devices. In this technique embodiment a different presenter-oriented object sensing device can be paired with each public display device so that the current physical location of any presenters who are adjacent to each public display device can be identified, and the in-air gestures performed by and postures formed by such presenters can also be identified. The touch gesture disclosure overlay that is displayed on the display screen of the touch-enabled handheld computing device can disclose a second spatial operand which allows the audience member that is operating this device to select which of the public display devices to control. A different optical projection device can also be paired with each public display device. Yet another alternate embodiment of the information sharing democratization technique described herein is also possible where a large meeting space includes a plurality of audience-oriented object sensing devices in order to support the identification of the current physical location of the audience members, the identification of the in-air gestures performed by the audience members, and the identification of the postures formed by the audience members, among other things.
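One plausible way to resolve which of several public display devices a spatial operand selects is to compare the attendee's pointing direction against the bearing to each display's center and pick the closest match within an angular threshold. This is a hedged sketch of that geometry only; the function name, 2D simplification, and 15-degree threshold are all assumptions, not details from the patent.

```python
import math


def select_display(origin, direction, display_centers, max_angle_deg=15.0):
    """origin: attendee position (x, y); direction: unit pointing vector.

    Returns the index of the display whose center lies closest in angle
    to the pointing direction, or None if no display falls within the
    angular threshold (i.e. the attendee is not pointing at any display).
    """
    best, best_angle = None, max_angle_deg
    for i, (cx, cy) in enumerate(display_centers):
        vx, vy = cx - origin[0], cy - origin[1]
        norm = math.hypot(vx, vy)
        if norm == 0:
            continue  # attendee is standing at the display center
        # Angle between the pointing vector and the bearing to display i.
        cos_a = max(-1.0, min(1.0, (vx * direction[0] + vy * direction[1]) / norm))
        angle = math.degrees(math.acos(cos_a))
        if angle < best_angle:
            best, best_angle = i, angle
    return best
```

A 3D variant would use the same dot-product test with (x, y, z) vectors; the threshold trades off selection reliability against how precisely attendees must point.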
- It is also noted that any or all of the aforementioned embodiments can be used in any combination desired to form additional hybrid embodiments. Although the information sharing democratization technique embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described heretofore. Rather, the specific features and acts described heretofore are disclosed as example forms of implementing the claims.
- The information sharing democratization technique embodiments described herein are operational within numerous types of general purpose or special purpose computing system environments or configurations.
FIG. 19 illustrates a simplified example of a general-purpose computer system on which various embodiments and elements of the information sharing democratization technique, as described herein, may be implemented. It should be noted that any boxes that are represented by broken or dashed lines in FIG. 19 represent alternate embodiments of the simplified computing device, and that any or all of these alternate embodiments, as described below, may be used in combination with other alternate embodiments that are described throughout this document.
- For example,
FIG. 19 shows a general system diagram showing a simplified computing device 1900. Such computing devices can typically be found in devices having at least some minimum computational capability, including, but not limited to, personal computers (PCs), server computers, handheld computing devices, laptop or mobile computers, communications devices such as cell phones and personal digital assistants (PDAs), multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, and audio or video media players.
- To allow a device to implement the information sharing democratization technique embodiments described herein, the device should have sufficient computational capability and system memory to enable basic computational operations. In particular, as illustrated by
FIG. 19 , the computational capability is generally illustrated by one or more processing unit(s) 1910, and may also include one or more graphics processing units (GPUs) 1915, either or both in communication with system memory 1920. Note that the processing unit(s) 1910 may be specialized microprocessors (such as a digital signal processor (DSP), a very long instruction word (VLIW) processor, or other micro-controller) or can be conventional central processing units (CPUs) having one or more processing cores including, but not limited to, specialized GPU-based cores in a multi-core CPU.
- In addition, the simplified computing device of
FIG. 19 may also include other components, such as, for example, a communications interface 1930. The simplified computing device of FIG. 19 may also include one or more conventional computer input devices 1940 (e.g., pointing devices, keyboards, audio input devices, video input devices, haptic input devices, devices for receiving wired or wireless data transmissions, and the like). The simplified computing device of FIG. 19 may also include other optional components, such as, for example, one or more conventional computer output devices 1950 (e.g., display device(s) 1955, audio output devices, video output devices, devices for transmitting wired or wireless data transmissions, and the like). Note that typical communications interfaces 1930, input devices 1940, output devices 1950, and storage devices 1960 for general-purpose computers are well known to those skilled in the art, and will not be described in detail herein.
- The simplified computing device of
FIG. 19 may also include a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 1900 via storage devices 1960, and includes both volatile and nonvolatile media that is either removable 1970 and/or non-removable 1980, for storage of information such as computer-readable or computer-executable instructions, data structures, program modules, or other data. By way of example but not limitation, computer readable media may include computer storage media and communication media. Computer storage media includes, but is not limited to, computer or machine readable media or storage devices such as digital versatile disks (DVDs), compact discs (CDs), floppy disks, tape drives, hard drives, optical drives, solid state memory devices, random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, magnetic cassettes, magnetic tapes, magnetic disk storage, or other magnetic storage devices, or any other device which can be used to store the desired information and which can be accessed by one or more computing devices.
- Storage of information such as computer-readable or computer-executable instructions, data structures, program modules, and the like, can also be accomplished by using any of a variety of the aforementioned communication media to encode one or more modulated data signals or carrier waves, or other transport mechanisms or communications protocols, and includes any wired or wireless information delivery mechanism. Note that the terms “modulated data signal” or “carrier wave” generally refer to a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
For example, communication media includes wired media such as a wired network or direct-wired connection carrying one or more modulated data signals, and wireless media such as acoustic, radio frequency (RF), infrared, laser, and other wireless media for transmitting and/or receiving one or more modulated data signals or carrier waves. Combinations of any of the above should also be included within the scope of communication media.
- Furthermore, software, programs, and/or computer program products embodying some or all of the various embodiments of the information sharing democratization technique described herein, or portions thereof, may be stored, received, transmitted, or read from any desired combination of computer or machine readable media or storage devices and communication media in the form of computer executable instructions or other data structures.
- Finally, the information sharing democratization technique embodiments described herein may be further described in the general context of computer-executable instructions, such as program modules, being executed by a computing device. Generally, program modules include routines, programs, objects, components, data structures, and the like, that perform particular tasks or implement particular abstract data types. The information sharing democratization technique embodiments may also be practiced in distributed computing environments where tasks are performed by one or more remote processing devices, or within a cloud of one or more devices, that are linked through one or more communications networks. In a distributed computing environment, program modules may be located in both local and remote computer storage media including media storage devices. Additionally, the aforementioned instructions may be implemented, in part or in whole, as hardware logic circuits, which may or may not include a processor.
Claims (20)
1. A computer-implemented process for democratizing information sharing during a co-located group meeting in a meeting space, comprising:
using a computer to perform the following process actions:
whenever a first condition occurs comprising a meeting attendee who is not within a prescribed distance of a public display device in the meeting space forming a remote location indicator posture and remotely pointing said posture at the public display device,
operating cooperatively with an audience-oriented object sensing device in the meeting space to identify the first condition,
enabling a remote location indicator mode for as long as the first condition continues, and
displaying a location indicator on the public display device in a position thereon corresponding to where said posture is currently being pointed.
2. The process of claim 1 , further comprising the actions of:
whenever a second condition occurs comprising the meeting attendee forming a remote dragging posture and remotely pointing said posture at the public display device,
operating cooperatively with the audience-oriented object sensing device to identify the second condition,
enabling a remote dragging mode for as long as the second condition continues, and
displaying a dragging cursor on the public display device in a position thereon corresponding to where said posture is currently being pointed; and
whenever the remote dragging mode is enabled and the meeting attendee points said posture at an information object that is displayed on the public display device and then changes where on the public display device said posture is being pointed, operating cooperatively with the object sensing device to track said changes and moving the information object on the public display device accordingly.
3. The process of claim 1 , wherein the meeting attendee is using a touch-enabled handheld computing device, further comprising the actions of:
whenever a second condition occurs comprising the meeting attendee forming a device pointing posture with the handheld computing device and remotely pointing said posture at the public display device,
operating cooperatively with the audience-oriented object sensing device to identify the second condition,
enabling a device pointing mode for as long as the second condition continues, and
displaying the location indicator on the public display device in a position thereon corresponding to where said posture is currently being pointed; and
whenever the device pointing mode is enabled and the meeting attendee performs an allowed touch gesture on a display screen of the handheld computing device, receiving a command from the handheld computing device that is associated with said touch gesture and executing the command.
4. The process of claim 3 , wherein the process action of whenever the device pointing mode is enabled and the meeting attendee performs an allowed touch gesture on a display screen of the handheld computing device, receiving a command from the handheld computing device that is associated with said touch gesture and executing the command comprises the actions of:
whenever the meeting attendee remotely points the device pointing posture at an information object that is displayed on the public display device, and the meeting attendee then performs an information-pull touch gesture on said display screen,
receiving an information-pull command from the handheld computing device, said command requesting that a copy of the information object be transferred to the handheld computing device, and
transmitting a copy of the information object to the handheld computing device.
5. The process of claim 3 , wherein the process action of whenever the device pointing mode is enabled and the meeting attendee performs an allowed touch gesture on a display screen of the handheld computing device, receiving a command from the handheld computing device that is associated with said touch gesture and executing the command comprises the actions of:
whenever the meeting attendee remotely points the device pointing posture at a desired location on the public display device, and the meeting attendee then performs an information-push touch gesture on an information object that is displayed on said display screen,
receiving a copy of said information object from the handheld computing device, and either,
displaying a scaled-down version of said information object at the desired location on the public display device, or
whenever a package metaphor mode is enabled, displaying an icon at the desired location on the public display device, wherein the icon serves as a surrogate for said information object.
6. The process of claim 3 , further comprising the actions of:
whenever the device pointing mode is enabled and a third condition occurs comprising the second condition continuing for a prescribed period of time,
operating cooperatively with the audience-oriented object sensing device to identify the third condition, and
transmitting a command to the handheld computing device instructing it to display a touch gesture disclosure overlay on its display screen, said overlay comprising one or more different graphical icons each of which discloses a different rectilinear, mark-based touch gesture that is allowed to be performed by the meeting attendee on said display screen.
7. The process of claim 1 , wherein the meeting attendee is using a touch-enabled handheld computing device, further comprising the actions of:
whenever a second condition occurs comprising the meeting attendee forming a device dragging posture with the handheld computing device and remotely pointing said posture at the public display device,
operating cooperatively with the audience-oriented object sensing device to identify the second condition,
enabling a device dragging mode for as long as the second condition continues, and
displaying a dragging cursor on the public display device in a position thereon corresponding to where said posture is being pointed; and
whenever the device dragging mode is enabled and the meeting attendee points said posture at an information object that is displayed on the public display device and then changes where on the public display device said posture is being pointed, operating cooperatively with the object sensing device to track said changes and moving the information object on the public display device accordingly.
8. The process of claim 1 , wherein the meeting attendee is using a touch-enabled handheld computing device, further comprising the actions of:
whenever a second condition occurs comprising the meeting attendee forming a device annotating posture with the handheld computing device and remotely pointing said posture at the public display device,
operating cooperatively with the audience-oriented object sensing device to identify the second condition, and
enabling a remote annotation mode for as long as the second condition continues;
whenever the remote annotation mode is enabled and the meeting attendee changes where on the public display device said posture is being pointed, operating cooperatively with the object sensing device to track said changes and drawing one or more annotation marks on the public display device according to said changes.
9. The process of claim 1 , wherein the meeting attendee is using a touch-enabled non-handheld computing device, further comprising the action of, whenever the remote location indicator mode is enabled and the meeting attendee performs an allowed touch gesture on a display screen of the non-handheld computing device, receiving a command from the non-handheld computing device that is associated with said touch gesture and executing the command.
10. The process of claim 9 , wherein the process action of whenever the remote location indicator mode is enabled and the meeting attendee performs an allowed touch gesture on a display screen of the non-handheld computing device, receiving a command from the non-handheld computing device that is associated with said touch gesture and executing the command comprises the actions of:
whenever the meeting attendee remotely points the remote location indicator posture at a desired location on the public display device, and the meeting attendee then performs an information-push touch gesture on an information object that is displayed on said display screen,
receiving a copy of said information object from the non-handheld computing device, and either,
displaying a scaled-down version of said information object at the desired location on the public display device, or
whenever a package metaphor mode is enabled, displaying an icon at the desired location on the public display device, wherein the icon serves as a surrogate for said information object.
11. The process of claim 9 , wherein the process action of whenever the remote location indicator mode is enabled and the meeting attendee performs an allowed touch gesture on a display screen of the non-handheld computing device, receiving a command from the non-handheld computing device that is associated with said touch gesture and executing the command comprises the actions of:
whenever the meeting attendee remotely points the remote location indicator posture at an information object that is displayed on the public display device, and the meeting attendee then performs an information-pull touch gesture on said display screen,
receiving an information-pull command from the non-handheld computing device, said command requesting that a copy of said information object be transferred to the non-handheld computing device, and
transmitting a copy of said information object to the non-handheld computing device.
12. The process of claim 1 , wherein the meeting attendee is using a handheld computing device, further comprising the actions of:
whenever a second condition occurs comprising the meeting attendee forming a device sharing posture with the handheld computing device and remotely pointing said posture at the public display device,
operating cooperatively with the audience-oriented object sensing device to identify the second condition,
enabling a transient sharing mode for as long as the second condition continues,
receiving a copy of the current contents of a display screen of the handheld computing device from the handheld computing device, and
displaying said current contents in the form of an overlay on the public display device.
13. The process of claim 1 , wherein the meeting attendee is using a touch-enabled non-handheld computing device, further comprising the actions of:
whenever a second condition occurs comprising the meeting attendee forming a remote dragging posture and remotely pointing said posture at the public display device,
operating cooperatively with the audience-oriented object sensing device to identify the second condition, and
enabling a transient sharing mode for as long as the second condition continues; and
whenever the transient sharing mode is enabled and the meeting attendee touches a display screen of the non-handheld computing device,
receiving a copy of the current contents of said display screen from the non-handheld computing device, and
displaying said current contents in the form of an overlay on the public display device.
14. A computer-implemented process for democratizing information sharing during a co-located group meeting in a meeting space, comprising:
using a computer to perform the following process actions:
whenever a first condition occurs comprising a first meeting attendee who is within a first prescribed distance of a public display device comprising a touch-enabled display screen in the meeting space forming a remote location indicator posture and remotely pointing said posture at a second meeting attendee who is not within the prescribed distance of the public display device,
operating cooperatively with both an audience-oriented object sensing device in the meeting space, and a presenter-oriented object sensing device in the meeting space, to identify the first condition, identify the second meeting attendee, and identify a personal computing device that is associated with the second meeting attendee, and
enabling a presenter-to-audience-member transfer mode for as long as the first condition continues; and
whenever the presenter-to-audience-member transfer mode is enabled and the first meeting attendee performs an information-push touch gesture on an information object that is displayed on the display screen, transmitting a copy of said information object to said personal computing device.
15. The process of claim 14 , wherein the first meeting attendee forms the remote location indicator posture using one of their hands, further comprising the actions of:
whenever the presenter-to-audience-member transfer mode is enabled and a second condition occurs comprising the first condition continuing for a prescribed period of time and the other hand of the first meeting attendee being within a second prescribed distance of the display screen,
operating cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify the second condition, and
displaying a touch gesture disclosure overlay around the first meeting attendee on the public display device, said overlay comprising one or more different graphical icons each of which discloses a different rectilinear, mark-based touch gesture that is allowed to be performed by the first meeting attendee on the display screen;
whenever a third condition occurs comprising the first meeting attendee forming said posture and remotely pointing it away from the public display device and at a floor of the meeting space,
operating cooperatively with both the audience-oriented and presenter-oriented object sensing devices to identify the third condition, identify any meeting attendees who are not within the first prescribed distance of the public display device, and identify the personal computing device that is associated with each of said attendees, and
enabling a presenter-to-entire-audience transfer mode for as long as the third condition continues; and
whenever the presenter-to-entire-audience transfer mode is enabled and the first meeting attendee performs the information-push touch gesture on the information object that is displayed on the display screen, transmitting a copy of said information object to the personal computing device that is associated with each of the meeting attendees who are not within the first prescribed distance of the public display device.
16. The process of claim 14 , further comprising the actions of:
whenever a second condition occurs comprising there being no meeting attendees in the meeting space,
operating cooperatively with the audience-oriented and presenter-oriented object sensing devices to identify the second condition,
enabling an ambient display mode for as long as the second condition continues, and
displaying one or more default information objects on the public display device; and
whenever the ambient display mode is enabled and a third condition occurs comprising one or more meeting attendees entering and remaining in the meeting space for a prescribed period of time,
operating cooperatively with the audience-oriented and presenter-oriented object sensing devices to identify the third condition,
disabling the ambient display mode, and
removing the default information objects from the public display device.
17. The process of claim 14 , further comprising the actions of:
whenever a second condition occurs comprising there being two or more meeting attendees within the first prescribed distance of the public display device and said attendees facing toward the public display device enough to see its contents,
operating cooperatively with the presenter-oriented object sensing device to identify the second condition,
enabling a multi-working presenters mode, and
segmenting the display screen into a number of different regions, wherein said number equals a current count of the two or more meeting attendees, each different region is positioned in front of a different one of the two or more meeting attendees, and a splitter graphical element is displayed between adjacent regions.
18. The process of claim 14 , further comprising the actions of:
whenever a second condition occurs comprising the first meeting attendee forming the remote location indicator posture and pointing said posture at a tertiary display region which is optically projected onto a prescribed location on a tertiary surface in the meeting space that is located near the public display device,
operating cooperatively with the presenter-oriented object sensing device to identify the second condition,
enabling a tertiary display mode for as long as the second condition continues, and
causing an optical projection device in the meeting space to optically project a location indicator within said region in a position there-within corresponding to where said posture is currently being pointed; and
whenever the tertiary display mode is enabled and the first meeting attendee performs the information-push touch gesture on the information object that is displayed on the display screen,
causing the optical projection device to optically project said information object within the tertiary display region, and
removing said information object from the public display device.
19. The process of claim 14 , further comprising the actions of:
whenever a second condition occurs comprising the first meeting attendee forming a palette generating posture and hovering said posture over the display screen for a prescribed period of time,
operating cooperatively with the presenter-oriented object sensing device to identify the second condition,
enabling a palette mode for as long as the second condition continues, and
displaying a posture palette on the display screen in a position thereon that is adjacent to the current location of said posture, wherein the posture palette comprises a palette of graphical icons and each of said icons represents a different item of information content that the first meeting attendee may choose to add to the display screen.
20. A computer-implemented process for democratizing information sharing during a co-located group meeting in a meeting space, comprising:
using a computer to perform the following process actions:
whenever a first condition occurs comprising a first meeting attendee who is using a touch-enabled non-handheld computing device forming a remote location indicator posture and remotely pointing said posture at a second meeting attendee,
operating cooperatively with an object sensing device in the meeting space to identify the first condition, identify the second meeting attendee, and identify a personal computing device that is associated with the second meeting attendee, and
enabling a peer-to-peer transfer mode for as long as the first condition continues;
whenever the peer-to-peer transfer mode is enabled and the first meeting attendee performs an information-push touch gesture on a first information object that is displayed on a display screen of the non-handheld computing device,
receiving a copy of the first information object from the non-handheld computing device, and
forwarding the copy of the first information object to said personal computing device;
whenever a second condition occurs comprising the peer-to-peer transfer mode being disabled and a third meeting attendee who is using a touch-enabled handheld computing device forming a device pointing posture and remotely pointing said posture at the second meeting attendee,
operating cooperatively with the object sensing device to identify the second condition, identify the second meeting attendee, and identify the personal computing device that is associated with the second meeting attendee, and
enabling the peer-to-peer transfer mode for as long as the second condition continues; and
whenever the peer-to-peer transfer mode is enabled and the third meeting attendee performs the information-push touch gesture on a second information object that is displayed on a display screen of the handheld computing device,
receiving a copy of the second information object from the handheld computing device, and
forwarding the copy of the second information object to said personal computing device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/278,065 US20130103446A1 (en) | 2011-10-20 | 2011-10-20 | Information sharing democratization for co-located group meetings |
US14/269,070 US9659280B2 (en) | 2011-10-20 | 2014-05-02 | Information sharing democratization for co-located group meetings |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/278,065 US20130103446A1 (en) | 2011-10-20 | 2011-10-20 | Information sharing democratization for co-located group meetings |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/269,070 Division US9659280B2 (en) | 2011-10-20 | 2014-05-02 | Information sharing democratization for co-located group meetings |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130103446A1 (en) | 2013-04-25 |
Family
ID=48136710
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/278,065 Abandoned US20130103446A1 (en) | 2011-10-20 | 2011-10-20 | Information sharing democratization for co-located group meetings |
US14/269,070 Active 2033-02-20 US9659280B2 (en) | 2011-10-20 | 2014-05-02 | Information sharing democratization for co-located group meetings |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/269,070 Active 2033-02-20 US9659280B2 (en) | 2011-10-20 | 2014-05-02 | Information sharing democratization for co-located group meetings |
Country Status (1)
Country | Link |
---|---|
US (2) | US20130103446A1 (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9557878B2 (en) * | 2012-04-25 | 2017-01-31 | International Business Machines Corporation | Permitting participant configurable view selection within a screen sharing session |
US9047472B2 (en) * | 2013-01-14 | 2015-06-02 | International Business Machines Corporation | Managing sensitive content |
US10664162B2 (en) * | 2013-11-18 | 2020-05-26 | Red Hat, Inc. | Multiple display management |
US9671931B2 (en) * | 2015-01-04 | 2017-06-06 | Personify, Inc. | Methods and systems for visually deemphasizing a displayed persona |
US10459676B2 (en) * | 2016-10-16 | 2019-10-29 | Dell Products, L.P. | Dynamic user interface for multiple shared displays in an electronic collaboration setting |
US9883142B1 (en) * | 2017-03-21 | 2018-01-30 | Cisco Technology, Inc. | Automated collaboration system |
DE102022205559A1 (en) | 2022-05-31 | 2023-11-30 | Robert Bosch Gesellschaft mit beschränkter Haftung | Display device for a motor vehicle |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6108028A (en) * | 1998-11-02 | 2000-08-22 | Intel Corporation | Method of activating and deactivating a screen saver in a video conferencing system |
US7036128B1 (en) | 1999-01-05 | 2006-04-25 | Sri International Offices | Using a community of distributed electronic agents to support a highly mobile, ambient computing environment |
US7119819B1 (en) * | 1999-04-06 | 2006-10-10 | Microsoft Corporation | Method and apparatus for supporting two-dimensional windows in a three-dimensional environment |
US7434166B2 (en) | 2003-06-03 | 2008-10-07 | Harman International Industries Incorporated | Wireless presentation system |
EP2304588A4 (en) | 2008-06-11 | 2011-12-21 | Teliris Inc | Surface computing collaboration system, method and apparatus |
US20100216508A1 (en) | 2009-02-23 | 2010-08-26 | Augusta Technology, Inc. | Systems and Methods for Driving an External Display Device Using a Mobile Phone Device |
US8730309B2 (en) | 2010-02-23 | 2014-05-20 | Microsoft Corporation | Projectors and depth cameras for deviceless augmented reality and interaction |
CA2844105A1 (en) * | 2011-08-31 | 2013-03-07 | Smart Technologies Ulc | Detecting pointing gestures in a three-dimensional graphical user interface |
- 2011-10-20: US application US13/278,065 filed; published as US20130103446A1 (status: Abandoned)
- 2014-05-02: US divisional application US14/269,070 filed; published as US9659280B2 (status: Active)
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7995090B2 (en) * | 2003-07-28 | 2011-08-09 | Fuji Xerox Co., Ltd. | Video enabled tele-presence control host |
US20090322676A1 (en) * | 2007-09-07 | 2009-12-31 | Apple Inc. | Gui applications for use with 3d remote controller |
US8555207B2 (en) * | 2008-02-27 | 2013-10-08 | Qualcomm Incorporated | Enhanced input using recognized gestures |
US8413077B2 (en) * | 2008-12-25 | 2013-04-02 | Sony Corporation | Input apparatus, handheld apparatus, and control method |
US8312392B2 (en) * | 2009-10-02 | 2012-11-13 | Qualcomm Incorporated | User interface gestures and methods for providing file sharing functionality |
US20110197147A1 (en) * | 2010-02-11 | 2011-08-11 | Apple Inc. | Projected display shared workspaces |
US8452057B2 (en) * | 2010-05-17 | 2013-05-28 | Hon Hai Precision Industry Co., Ltd. | Projector and projection control method |
US20120017147A1 (en) * | 2010-07-16 | 2012-01-19 | John Liam Mark | Methods and systems for interacting with projected user interface |
US20120192084A1 (en) * | 2010-10-25 | 2012-07-26 | Dedo Interactive, Inc. | Synchronized panel technology |
Non-Patent Citations (1)
Title |
---|
JEON, et al., "Interaction Techniques in Large Display Environments using Hand-held Devices", Proceedings of the 4th Association for Computing Machinery (ACM) Symposium on Virtual Reality Software and Technology (VRST '06), November 1-3, 2006, pages 100-103, ACM, Limassol, Cyprus. * |
Cited By (109)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10149014B2 (en) | 2001-09-19 | 2018-12-04 | Comcast Cable Communications Management, Llc | Guide menu based on a repeatedly-rotating sequence |
US10587930B2 (en) | 2001-09-19 | 2020-03-10 | Comcast Cable Communications Management, Llc | Interactive user interface for television applications |
US10602225B2 (en) | 2001-09-19 | 2020-03-24 | Comcast Cable Communications Management, Llc | System and method for construction, delivery and display of iTV content |
US11388451B2 (en) | 2001-11-27 | 2022-07-12 | Comcast Cable Communications Management, Llc | Method and system for enabling data-rich interactive television using broadcast database |
US11412306B2 (en) | 2002-03-15 | 2022-08-09 | Comcast Cable Communications Management, Llc | System and method for construction, delivery and display of iTV content |
US11070890B2 (en) | 2002-08-06 | 2021-07-20 | Comcast Cable Communications Management, Llc | User customization of user interfaces for interactive television |
US10491942B2 (en) | 2002-09-19 | 2019-11-26 | Comcast Cable Communications Management, Llc | Prioritized placement of content elements for iTV application |
US9967611B2 (en) | 2002-09-19 | 2018-05-08 | Comcast Cable Communications Management, Llc | Prioritized placement of content elements for iTV applications |
US10687114B2 (en) | 2003-03-14 | 2020-06-16 | Comcast Cable Communications Management, Llc | Validating data of an interactive content application |
US11089364B2 (en) | 2003-03-14 | 2021-08-10 | Comcast Cable Communications Management, Llc | Causing display of user-selectable content types |
US10237617B2 (en) | 2003-03-14 | 2019-03-19 | Comcast Cable Communications Management, Llc | System and method for blending linear content, non-linear content or managed content |
US20130198642A1 (en) * | 2003-03-14 | 2013-08-01 | Comcast Cable Communications, Llc | Providing Supplemental Content |
US10664138B2 (en) * | 2003-03-14 | 2020-05-26 | Comcast Cable Communications, Llc | Providing supplemental content for a second screen experience |
US11381875B2 (en) | 2003-03-14 | 2022-07-05 | Comcast Cable Communications Management, Llc | Causing display of user-selectable content types |
US10616644B2 (en) | 2003-03-14 | 2020-04-07 | Comcast Cable Communications Management, Llc | System and method for blending linear content, non-linear content, or managed content |
US10171878B2 (en) | 2003-03-14 | 2019-01-01 | Comcast Cable Communications Management, Llc | Validating data of an interactive content application |
US9992546B2 (en) | 2003-09-16 | 2018-06-05 | Comcast Cable Communications Management, Llc | Contextual navigational control for digital television |
US10848830B2 (en) | 2003-09-16 | 2020-11-24 | Comcast Cable Communications Management, Llc | Contextual navigational control for digital television |
US11785308B2 (en) | 2003-09-16 | 2023-10-10 | Comcast Cable Communications Management, Llc | Contextual navigational control for digital television |
US10110973B2 (en) | 2005-05-03 | 2018-10-23 | Comcast Cable Communications Management, Llc | Validation of content |
US10575070B2 (en) | 2005-05-03 | 2020-02-25 | Comcast Cable Communications Management, Llc | Validation of content |
US11272265B2 (en) | 2005-05-03 | 2022-03-08 | Comcast Cable Communications Management, Llc | Validation of content |
US11765445B2 (en) | 2005-05-03 | 2023-09-19 | Comcast Cable Communications Management, Llc | Validation of content |
US11832024B2 (en) | 2008-11-20 | 2023-11-28 | Comcast Cable Communications, Llc | Method and apparatus for delivering video and video-related content at sub-asset level |
US20120216151A1 (en) * | 2011-02-22 | 2012-08-23 | Cisco Technology, Inc. | Using Gestures to Schedule and Manage Meetings |
US8782566B2 (en) * | 2011-02-22 | 2014-07-15 | Cisco Technology, Inc. | Using gestures to schedule and manage meetings |
US20120326995A1 (en) * | 2011-06-24 | 2012-12-27 | Ricoh Company, Ltd. | Virtual touch panel system and interactive mode auto-switching method |
US20130110768A1 (en) * | 2011-10-31 | 2013-05-02 | Fujitsu Limited | Method for managing data, medium, and apparatus for managing data |
US20130191719A1 (en) * | 2012-01-19 | 2013-07-25 | Microsoft Corporation | Notebook driven accumulation of meeting documentation and notations |
US9449303B2 (en) * | 2012-01-19 | 2016-09-20 | Microsoft Technology Licensing, Llc | Notebook driven accumulation of meeting documentation and notations |
US20130229345A1 (en) * | 2012-03-01 | 2013-09-05 | Laura E. Day | Manual Manipulation of Onscreen Objects |
US20130236101A1 (en) * | 2012-03-08 | 2013-09-12 | Fuji Xerox Co., Ltd. | Information processing apparatus, non-transitory computer readable medium, and information processing method |
US20140026068A1 (en) * | 2012-07-20 | 2014-01-23 | Samsung Electronics Co., Ltd. | Method of controlling display of display device by mobile terminal and mobile terminal for the same |
US10114522B2 (en) * | 2012-07-20 | 2018-10-30 | Samsung Electronics Co., Ltd | Method of controlling display of display device by mobile terminal and mobile terminal for the same |
US20170118436A1 (en) * | 2012-07-25 | 2017-04-27 | Samsung Electronics Co., Ltd. | Method and mobile terminal for displaying information, method and display device for providing information, and method and mobile terminal for generating control signal |
US20140104168A1 (en) * | 2012-10-12 | 2014-04-17 | Microsoft Corporation | Touchless input |
US10019074B2 (en) | 2012-10-12 | 2018-07-10 | Microsoft Technology Licensing, Llc | Touchless input |
US9310895B2 (en) * | 2012-10-12 | 2016-04-12 | Microsoft Technology Licensing, Llc | Touchless input |
US11115722B2 (en) | 2012-11-08 | 2021-09-07 | Comcast Cable Communications, Llc | Crowdsourcing supplemental content |
US9759420B1 (en) | 2013-01-25 | 2017-09-12 | Steelcase Inc. | Curved display and curved display support |
US10977588B1 (en) | 2013-01-25 | 2021-04-13 | Steelcase Inc. | Emissive shapes and control systems |
US10652967B1 (en) | 2013-01-25 | 2020-05-12 | Steelcase Inc. | Curved display and curved display support |
US11246193B1 (en) | 2013-01-25 | 2022-02-08 | Steelcase Inc. | Curved display and curved display support |
US9804731B1 (en) | 2013-01-25 | 2017-10-31 | Steelcase Inc. | Emissive surfaces and workspaces method and apparatus |
US11327626B1 (en) | 2013-01-25 | 2022-05-10 | Steelcase Inc. | Emissive surfaces and workspaces method and apparatus |
US10754491B1 (en) | 2013-01-25 | 2020-08-25 | Steelcase Inc. | Emissive surfaces and workspaces method and apparatus |
US11443254B1 (en) | 2013-01-25 | 2022-09-13 | Steelcase Inc. | Emissive shapes and control systems |
US10983659B1 (en) | 2013-01-25 | 2021-04-20 | Steelcase Inc. | Emissive surfaces and workspaces method and apparatus |
US11775127B1 (en) | 2013-01-25 | 2023-10-03 | Steelcase Inc. | Emissive surfaces and workspaces method and apparatus |
US11102857B1 (en) | 2013-01-25 | 2021-08-24 | Steelcase Inc. | Curved display and curved display support |
US10154562B1 (en) | 2013-01-25 | 2018-12-11 | Steelcase Inc. | Curved display and curved display support |
US20140214471A1 (en) * | 2013-01-31 | 2014-07-31 | Donald Raymond Schreiner, III | System for Tracking Preparation Time and Attendance at a Meeting |
US9294539B2 (en) | 2013-03-14 | 2016-03-22 | Microsoft Technology Licensing, Llc | Cooperative federation of digital devices via proxemics and device micro-mobility |
US11601720B2 (en) | 2013-03-14 | 2023-03-07 | Comcast Cable Communications, Llc | Content event messaging |
US10880609B2 (en) | 2013-03-14 | 2020-12-29 | Comcast Cable Communications, Llc | Content event messaging |
US9774653B2 (en) | 2013-03-14 | 2017-09-26 | Microsoft Technology Licensing, Llc | Cooperative federation of digital devices via proxemics and device micro-mobility |
US20140282273A1 (en) * | 2013-03-15 | 2014-09-18 | Glen J. Anderson | System and method for assigning voice and gesture command areas |
US20140331141A1 (en) * | 2013-05-03 | 2014-11-06 | Adobe Systems Incorporated | Context visual organizer for multi-screen display |
US9940014B2 (en) * | 2013-05-03 | 2018-04-10 | Adobe Systems Incorporated | Context visual organizer for multi-screen display |
US20140351718A1 (en) * | 2013-05-24 | 2014-11-27 | Fuji Xerox Co., Ltd. | Information processing device, information processing method, and computer-readable medium |
US20150007056A1 (en) * | 2013-06-28 | 2015-01-01 | LinkedIn Corporation | Virtual conference manager |
US9661041B2 (en) * | 2013-06-28 | 2017-05-23 | LinkedIn Corporation | Virtual conference manager |
US20150006669A1 (en) * | 2013-07-01 | 2015-01-01 | Google Inc. | Systems and methods for directing information flow |
US20150121231A1 (en) * | 2013-10-28 | 2015-04-30 | Promethean Limited | Systems and Methods for Interactively Presenting a Presentation to Viewers |
US9720506B2 (en) * | 2014-01-14 | 2017-08-01 | Microsoft Technology Licensing, Llc | 3D silhouette sensing system |
US20150199018A1 (en) * | 2014-01-14 | 2015-07-16 | Microsoft Corporation | 3d silhouette sensing system |
US10001845B2 (en) * | 2014-01-14 | 2018-06-19 | Microsoft Technology Licensing, Llc | 3D silhouette sensing system |
US20170285763A1 (en) * | 2014-01-14 | 2017-10-05 | Microsoft Technology Licensing, Llc | 3d silhouette sensing system |
US11006080B1 (en) | 2014-02-13 | 2021-05-11 | Steelcase Inc. | Inferred activity based conference enhancement method and system |
US11706390B1 (en) | 2014-02-13 | 2023-07-18 | Steelcase Inc. | Inferred activity based conference enhancement method and system |
US10904490B1 (en) * | 2014-02-13 | 2021-01-26 | Steelcase Inc. | Inferred activity based conference enhancement method and system |
US10606255B2 (en) * | 2014-03-25 | 2020-03-31 | Mitsubishi Electric Corporation | Plant monitor and control system |
US9760331B2 (en) * | 2014-05-23 | 2017-09-12 | Samsung Electronics Co., Ltd. | Sharing a screen between electronic devices |
US20150339090A1 (en) * | 2014-05-23 | 2015-11-26 | Samsung Electronics Co., Ltd. | Sharing a screen between electronic devices |
US20150363026A1 (en) * | 2014-06-16 | 2015-12-17 | Touchplus Information Corp. | Control device, operation mode altering method thereof, control method thereof and battery power warning method thereof |
US11783382B2 (en) | 2014-10-22 | 2023-10-10 | Comcast Cable Communications, Llc | Systems and methods for curating content metadata |
US9916122B2 (en) | 2014-12-18 | 2018-03-13 | Google Llc | Methods, systems, and media for launching a mobile application using a public display device |
US10108390B2 (en) | 2014-12-18 | 2018-10-23 | Google Llc | Methods, systems, and media for presenting requested content on public display devices |
US9841939B2 (en) | 2014-12-18 | 2017-12-12 | Google Inc. | Methods, systems, and media for presenting requested content on public display devices |
US10528316B2 (en) | 2014-12-18 | 2020-01-07 | Google Llc | Methods, systems, and media for presenting requested content on public display devices |
US9967320B2 (en) | 2014-12-18 | 2018-05-08 | Google Llc | Methods, systems, and media for controlling information used to present content on a public display device |
CN106716324A (en) * | 2014-12-18 | 2017-05-24 | 谷歌公司 | Methods, systems, and media for launching a mobile application using a public display device |
WO2016099708A1 (en) * | 2014-12-18 | 2016-06-23 | Google Inc. | Methods, systems, and media for launching a mobile application using a public display device |
US11144959B2 (en) | 2014-12-18 | 2021-10-12 | Google Llc | Methods, systems, and media for presenting advertisements relevant to nearby users on a public display device |
US10594777B2 (en) | 2014-12-18 | 2020-03-17 | Google Llc | Methods, systems, and media for controlling information used to present content on a public display device |
US11245746B2 (en) | 2014-12-18 | 2022-02-08 | Google Llc | Methods, systems, and media for controlling information used to present content on a public display device |
US20170019473A1 (en) * | 2015-07-16 | 2017-01-19 | Promethean Limited | Multi-network mirroring systems and methods |
US20230046864A1 (en) * | 2015-07-16 | 2023-02-16 | Promethean Limited | Multi-network computing device integration systems and methods |
US10341397B2 (en) * | 2015-08-12 | 2019-07-02 | Fuji Xerox Co., Ltd. | Non-transitory computer readable medium, information processing apparatus, and information processing system for recording minutes information |
US20170242648A1 (en) * | 2016-02-19 | 2017-08-24 | RAPC Systems, Inc. | Combined Function Control And Display And System For Displaying And Controlling Multiple Functions |
US10616633B2 (en) * | 2016-02-29 | 2020-04-07 | T1V, Inc. | System for connecting a mobile device and a common display |
US10931996B2 (en) | 2016-02-29 | 2021-02-23 | TIV, Inc. | System for connecting a mobile device and a common display |
US20170251238A1 (en) * | 2016-02-29 | 2017-08-31 | T1V, Inc. | System for connecting a mobile device and a common display |
FR3050894A1 (en) * | 2016-04-27 | 2017-11-03 | Jean Neimark | INTERACTIVE AND MULTI-USER COMMUNICATION SYSTEM FOR APPLICATIONS IN THE FIELDS OF EVENT OR RECREATION |
US20180107341A1 (en) * | 2016-10-16 | 2018-04-19 | Dell Products, L.P. | Volumetric Tracking for Orthogonal Displays in an Electronic Collaboration Setting |
US10514769B2 (en) * | 2016-10-16 | 2019-12-24 | Dell Products, L.P. | Volumetric tracking for orthogonal displays in an electronic collaboration setting |
US11190731B1 (en) | 2016-12-15 | 2021-11-30 | Steelcase Inc. | Content amplification system and method |
US11652957B1 (en) | 2016-12-15 | 2023-05-16 | Steelcase Inc. | Content amplification system and method |
US10897598B1 (en) | 2016-12-15 | 2021-01-19 | Steelcase Inc. | Content amplification system and method |
US10264213B1 (en) | 2016-12-15 | 2019-04-16 | Steelcase Inc. | Content amplification system and method |
US10638090B1 (en) | 2016-12-15 | 2020-04-28 | Steelcase Inc. | Content amplification system and method |
US11237699B2 (en) | 2017-08-18 | 2022-02-01 | Microsoft Technology Licensing, Llc | Proximal menu generation |
US11301124B2 (en) | 2017-08-18 | 2022-04-12 | Microsoft Technology Licensing, Llc | User interface modification using preview panel |
US20190056857A1 (en) * | 2017-08-18 | 2019-02-21 | Microsoft Technology Licensing, Llc | Resizing an active region of a user interface |
US10417991B2 (en) | 2017-08-18 | 2019-09-17 | Microsoft Technology Licensing, Llc | Multi-display device user interface modification |
US10984570B2 (en) * | 2019-01-04 | 2021-04-20 | Boe Technology Group Co., Ltd. | Picture marking method and apparatus, computer device, and computer readable storage medium |
US11762476B2 (en) * | 2019-09-20 | 2023-09-19 | Interdigital Ce Patent Holdings, Sas | Device and method for hand-based user interaction in VR and AR environments |
US20220342485A1 (en) * | 2019-09-20 | 2022-10-27 | Interdigital Ce Patent Holdings, Sas | Device and method for hand-based user interaction in vr and ar environments |
US20220374188A1 (en) * | 2021-05-19 | 2022-11-24 | Benq Corporation | Electronic billboard and controlling method thereof |
Also Published As
Publication number | Publication date |
---|---|
US9659280B2 (en) | 2017-05-23 |
US20140245190A1 (en) | 2014-08-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9659280B2 (en) | Information sharing democratization for co-located group meetings | |
Bragdon et al. | Code space: touch+ air gesture hybrid interactions for supporting developer meetings | |
US10558341B2 (en) | Unified system for bimanual interactions on flexible representations of content | |
US8949736B2 (en) | System and method for immersive process design collaboration on mobile devices | |
US9229539B2 (en) | Information triage using screen-contacting gestures | |
US11010032B2 (en) | Navigating a hierarchical data set | |
KR102184269B1 (en) | Display apparatus, portable apparatus and method for displaying a screen thereof | |
US10359905B2 (en) | Collaboration with 3D data visualizations | |
US20130198653A1 (en) | Method of displaying input during a collaboration session and interactive board employing same | |
CN106462372A (en) | Transferring content between graphical user interfaces | |
JP2019503004A (en) | How to swap visual elements and put interactive content on individual related displays | |
CN104024983B (en) | Interaction models for indirect interaction equipment | |
US9842311B2 (en) | Multiple users working collaborative on a single, touch-sensitive "table top" display |
US20140282066A1 (en) | Distributed, interactive, collaborative, touchscreen, computing systems, media, and methods | |
WO2016035800A1 (en) | Object management device, brainstorming assistance device, object management method, and computer-readable recording medium | |
US10540070B2 (en) | Method for tracking displays during a collaboration session and interactive board employing same | |
Korzetz et al. | Natural collocated interactions for merging results with mobile devices | |
US7589749B1 (en) | Methods and apparatus for graphical object interaction and negotiation | |
CA2914351A1 (en) | A method of establishing and managing messaging sessions based on user positions in a collaboration space and a collaboration system employing same | |
US10116768B2 (en) | Control system, control method, and communication device | |
KR101060175B1 (en) | Method for controlling touch screen, recording medium for the same, and method for controlling cloud computing | |
US9927892B2 (en) | Multiple touch selection control | |
JP6293903B2 (en) | Electronic device and method for displaying information | |
CN108885556A (en) | Control numeral input | |
Chen et al. | Photo4action: phone camera-based interaction for graph visualizations on large wall displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRAGDON, ANDREW;DELINE, ROBERT;HINCKLEY, KEN;AND OTHERS;SIGNING DATES FROM 20111010 TO 20111020;REEL/FRAME:027103/0785 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0001 Effective date: 20141014 |