US20150058753A1 - Sharing electronic drawings in collaborative environments - Google Patents

Sharing electronic drawings in collaborative environments

Info

Publication number
US20150058753A1
US20150058753A1
Authority
US
United States
Prior art keywords
gesture
command
undo
electronic drawing
user device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/091,944
Inventor
Matthew Anderson
Frederic Mayot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Getgo Inc
Original Assignee
Citrix Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Citrix Systems Inc
Priority to US14/091,944
Assigned to CITRIX SYSTEMS, INC. (assignors: MAYOT, Frederic; ANDERSON, MATTHEW)
Priority to PCT/US2014/048851 (published as WO2015026497A1)
Publication of US20150058753A1
Assigned to GETGO, INC. (assignor: CITRIX SYSTEMS, INC.)
Security interest granted to JPMORGAN CHASE BANK, N.A. (assignor: GETGO, INC.)
Release of security interest recorded at Reel/Frame 041588/0143 to GETGO, INC. and LOGMEIN, INC. (assignor: JPMORGAN CHASE BANK, N.A.)
Status: Abandoned

Classifications

    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06Q 10/10 Office automation; Time management
    • G06Q 10/101 Collaborative creation, e.g. joint development of products or services
    • G06T 11/60 Editing figures and text; Combining figures or text
    • G06T 11/80 Creating or modifying a manually drawn or painted image using a manual input device, e.g. mouse, light pen, direction keys on keyboard
    • H04L 67/1095 Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes

Abstract

An improved technique involves providing for a gesture-based undo command for use within a collaborative drawing environment. Such an undo function that is both gesture-based and capable of being used within a collaborative environment takes full advantage of the capabilities of tablet computers and laptop computers having touch screens. The gesture-based undo command may involve a multi-point linear swipe such as a two-point linear swipe gesture in order to easily distinguish an undo command from a drawing command.

Description

    BACKGROUND
  • Conventional electronic drawing tools allow users to draw freehand pictures on a display of a computer. One conventional drawing tool includes a brush palette for selecting a brush that simulates a type of brush or pen that applies a brush stroke, a color palette for selecting a color for the brush stroke, and various auxiliary tools such as erase, undo, and redo buttons for correcting mistakes. A user may provide a brush stroke on the display by moving an input device, e.g., a mouse or a finger on a touch screen, along a desired stroke path.
  • Some conventional drawing tools provide the ability for a user to apply the auxiliary tools via certain finger motions. Such finger motions are integral to the use of certain tablet and laptop computers and provide for a clean interface that enhances ease of use. For example, one such tool allows a user to perform an undo function by rotating a finger on the display in a counterclockwise motion, and a redo function by rotating a finger on the display in a clockwise motion.
  • Some conventional drawing tools are used within a collaborative environment for sharing with other users. An example of such a collaborative environment is a web conference in which an electronic drawing is presented to the other users. A web conference typically shares visual data among multiple meeting participants. To create a web conference, the users connect their respective computers to a conference server through a network, e.g., the Internet. The conference server typically processes visual data (e.g., a desktop view from a presenting participant containing a drawing) and provides that visual data for display on respective display screens so that all conference participants are able to view the visual data.
  • SUMMARY
  • Unfortunately, there are deficiencies with the above-described conventional electronic drawing tools. For example, the conventional electronic drawing tools that use finger motions for applying undo and redo functions are not configured to be used in a collaborative environment such as an online meeting. On the other hand, conventional electronic drawing tools that are capable of being used in a collaborative environment have only buttons for undo functions and are not configured to use gesture-based undo commands. Neither type of conventional electronic drawing tool takes full advantage of the capabilities of tablet computing technology in sharing electronic drawings with a group of participants in an online meeting.
  • It should be understood that, in addition to not being configured to be used in a collaborative environment, the above-described conventional electronic drawing tools that do have undo functions rely on awkward finger motions, such as circular swipes, that are not easy to perform. In one example, a particular tool requires a user to make a counterclockwise circular arc on a touch screen to undo, while a corresponding redo function requires the user to make a clockwise circular arc.
  • In contrast with the above-described conventional electronic drawing tools which use awkward gestures to enable undo commands, an improved technique involves providing for a gesture-based undo command for use within a collaborative drawing environment. Such an undo function that is both gesture-based and capable of being used within a collaborative environment takes full advantage of the capabilities of tablet computers and laptop computers having touch screens. The gesture-based undo command may involve a multi-point linear swipe such as a two-point linear swipe gesture in order to easily distinguish an undo command from a drawing command.
  • One embodiment of the improved technique is directed to a method of presenting an electronic drawing in a collaborative environment over an electronic network, the electronic drawing including multiple objects. The method includes sharing, by a processor of a particular user device within the collaborative environment, the electronic drawing among multiple user devices within the collaborative environment. The method also includes receiving, by the processor of the particular user device, a gesture-based undo command identifying an object to be removed from the electronic drawing. The method further includes communicating by the processor with the multiple user devices to remove the object from the electronic drawing in response to the gesture-based undo command.
  • Additionally, some embodiments of the improved technique are directed to an electronic apparatus constructed and arranged to present an electronic drawing in a collaborative environment over an electronic network, the electronic drawing including multiple objects. The apparatus includes memory and a set of processors coupled to the memory to form controlling circuitry. The controlling circuitry is constructed and arranged to carry out the method of presenting an electronic drawing in a collaborative environment over an electronic network.
  • Furthermore, some embodiments of the improved technique are directed to a computer program product having a non-transitory computer readable storage medium which stores code including a set of instructions which, when executed by a computer, cause the computer to carry out the method of presenting an electronic drawing in a collaborative environment over an electronic network.
  • In some arrangements, communicating with the multiple user devices to remove the object from the electronic drawing includes sending a delete command to remove the object from the electronic drawing.
  • In some arrangements, receiving the gesture-based undo command includes analyzing a set of touch points resulting from a particular gesture input on the particular user device, the touch points including samples of the particular gesture at equally spaced time intervals taken by the processor of the particular user device, and verifying whether the set of touch points are indicative of a specified undo gesture that generates the gesture-based undo command.
  • In some arrangements, the specified undo gesture is indicative of a multi-point linear swipe on the particular user device, the multi-point linear swipe being swept out in a particular direction along an axis of the particular user device. Verifying whether the set of touch points are indicative of a specified undo gesture that generates the gesture-based undo command includes generating a speed and direction of the multi-point linear swipe from a time series produced by the set of touch points.
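  • By way of illustration only, a minimal sketch of how such a speed and direction might be generated from equally spaced touch-point samples is shown below; the function name, the sampling interval, and the coordinate conventions are assumptions and do not appear in this disclosure.

```python
import math

def swipe_speed_and_direction(points, dt=0.01):
    """points: [(x, y)] samples of one touch point, taken every dt seconds.

    Assumes at least two samples; dt is an illustrative sampling interval.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    duration = dt * (len(points) - 1)
    speed = math.hypot(dx, dy) / duration          # e.g., pixels per second
    direction = math.degrees(math.atan2(dy, dx))   # angle relative to the device axis
    return speed, direction
```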
  • In some arrangements, the method further includes receiving, after receiving the gesture-based undo command, a gesture-based redo command, the gesture-based redo command being configured to replace an object removed by a previous gesture-based undo command and in response to the gesture-based redo command, communicating with the multiple user devices to restore the object to the electronic drawing.
  • In some arrangements, communicating with the multiple user devices to restore the object to the electronic drawing includes sending a draw stroke command to the multiple user devices that produces the object on each of the multiple user devices.
  • In some arrangements, receiving the gesture-based redo command includes analyzing a set of touch points resulting from a particular gesture input on the particular user device, the touch points including samples of the particular gesture at equally spaced time intervals taken by the processor of the particular user device, and verifying whether the set of touch points are indicative of a specified redo gesture that generates the gesture-based redo command.
  • In some arrangements, a specified redo gesture includes a multi-point linear swipe on the particular user device, the multi-point linear swipe being swept out in a direction along the axis of the particular user device substantially opposite to the multi-point linear swipe of the specified undo gesture. Verifying whether the set of touch points are indicative of the specified redo gesture includes generating a speed and direction of the multi-point linear swipe of the specified redo gesture from a time series produced by the set of touch points.
  • In some arrangements, communicating with the multiple user devices includes transmitting the delete command to a central server which in turn transmits the delete command to the multiple user devices.
  • BRIEF DESCRIPTION OF THE DRAWING
  • The foregoing and other objects, features and advantages will be apparent from the following description of particular embodiments of the invention, as illustrated in the accompanying figures in which like reference characters refer to the same parts throughout the different views.
  • FIG. 1 is a block diagram illustrating an example electronic environment in which the improved technique may be carried out.
  • FIG. 2 is an example user device within the electronic environment shown in FIG. 1.
  • FIG. 3 is a block diagram illustrating example undo and redo commands from the online meeting server shown in FIG. 2.
  • FIG. 4 is a block diagram illustrating example undo and redo gestures that are converted to waveforms received by the online meeting server shown in FIG. 2.
  • FIG. 5 is a flow chart illustrating an example method of carrying out the improved technique within the computing device shown in FIG. 1.
  • DETAILED DESCRIPTION
  • An improved technique involves providing for a gesture-based undo command for use within a collaborative drawing environment. Such an undo function that is both gesture-based and capable of being used within a collaborative environment takes full advantage of the capabilities of tablet computers and laptop computers having touch screens. The gesture-based undo command may involve a multi-point linear swipe such as a two-point linear swipe gesture in order to easily distinguish an undo command from a drawing command.
  • FIG. 1 illustrates an example electronic environment 10 in which the improved technique may be carried out. Electronic environment 10 includes client devices 12(1), 12(2), . . . , 12(M), where M is the number of client devices being used in an online meeting 24, communications medium 18, and online meeting server 22.
  • Communications medium 18 is constructed and arranged to connect the various components of electronic environment 10 together to enable these components to exchange electronic signals 30. At least a portion of communications medium 18 is illustrated as a cloud in FIG. 1 to indicate that communications medium 18 is capable of having a variety of different topologies including backbone, hub-and-spoke, loop, irregular, combinations thereof, and so on. Along these lines, communications medium 18 may include copper-based communications devices and cabling, fiber optic devices and cabling, wireless devices, combinations thereof, etc. Furthermore, communications medium 18 is capable of supporting LAN-based communications, cellular communications, standard telephone communications, combinations thereof, etc.
  • Client devices 12 are typically tablet computers having a respective touch-screen display 16, although client devices 12 can be any electronic computing device with a touch-screen display, e.g., laptop computer, smartphone, and the like. Each client device 12 is constructed and arranged to operate an online meeting client on behalf of a respective user 14, in which there is an electronic drawing environment capable of allowing respective user 14 to create and remove drawing objects using gesture-based commands. For example, user 14(1) may create a circle on display 16(1) within the electronic drawing environment by swiping a finger on the display in a circle. Furthermore, client devices 12 are capable of communicating gesture-based actions, among others, to online meeting server 22 via communications medium 18.
  • Online meeting server 22 is constructed and arranged to host online meetings 24 among users 14. Online meeting server 22 is also constructed and arranged to communicate a command to the client devices 12 to perform an undo command in response to receiving the gesture-based undo command.
  • It should be understood that, in some arrangements, online meeting server 22 may be replaced by a communication server that may not supply audio as is typical in an online meeting. Along these lines, the improved technique described herein does not require sound or other multimedia aside from a visual medium for the drawing.
  • During operation, client devices 12(1), 12(2), . . . , 12(M) initiate online meeting 24 via online meeting server 22 through communications medium 18. Online meeting server 22 provides, via an internet browser running on each client device 12, an online meeting interface displayed on a respective display 16(1), 16(2), . . . , 16(M) (displays 16) of client device 12(1), 12(2), . . . , 12(M). The online meeting interface runs, among other applications, an electronic drawing program that displays an electronic drawing 20 on each of displays 16.
  • During online meeting 24, a user, e.g., user 14(1), draws an object 32 within the electronic drawing. The electronic drawing program within the online meeting interface provided by online meeting server 22 is configured to display object 32 on all displays 16 simultaneously as part of online meeting 24. For example, as will be described below, user 14(1) draws object 32 using a finger gesture on display 16(1). Subsequently, the electronic drawing program causes client device 12(1) to send an electronic signal 30 to online meeting server 22 that represents object 32. Online meeting server 22 then sends the received signal 30 to other client devices 12(2), . . . , 12(M), each of which in turn maps the signal to object 32 and displays object 32 in their respective displays 16.
  • After devices 12 display object 32 on respective displays 16, but before another object is created within electronic drawing 20, a user, e.g., user 14(M), initiates, via gesture 26, a gesture-based undo command 28 that is configured to cause object 32 to be removed from electronic drawing 20. For example, as will be described in detail below in connection with FIG. 4, user 14(M) applies a linear, two-point gesture 26 toward the left part of display 16(M). Such a multi-point gesture distinguishes the undo function from a standard drawing command initiated by a single-point gesture that creates objects in electronic drawing 20.
  • In response to the initiation of gesture-based undo command 28, the electronic drawing program causes client device 12(M) to erase object 32 from display 16(M). Further, client device 12(M) sends an electronic signal 30 to online meeting server 22 that contains a delete command 34.
  • Upon receiving delete command 34, online meeting server 22 sends a delete communication to each of client devices 12 via communications medium 18.
  • The delete communication conveys delete command 34 to the electronic drawing program running on each of client devices 12(1), 12(2), . . . , 12(M), which is configured to remove object 32 from electronic drawing 20.
  • FIG. 2 is a block diagram that illustrates further details of online meeting server 22. Online meeting server 22 includes controller 40, which in turn includes processor 44 and memory 46, and network interface 42.
  • Network interface 42 is constructed and arranged to provide connections for online meeting server 22 to communications medium 18. Network interface 42 takes the form of an Ethernet card; in some arrangements, network interface 42 takes other forms including a wireless receiver and a token ring card.
  • Processor 44 takes the form of, but is not limited to, an Intel or AMD-based CPU, and can include a single core or multiple cores, each running one or more threads. Processor 44 is coupled to memory 46 and is configured to execute instructions from code 58.
  • Memory 46 is configured to store code 58 that contains instructions to conduct an online meeting. Memory 46 also includes an undo stack 50 and a redo stack 52.
  • Undo stack 50 and redo stack 52 are configured to track objects which have been removed from and placed back in electronic drawing 20. Further details about undo stack 50 and redo stack 52 will be discussed in connection with FIG. 4.
  • In many arrangements, the electronic drawing program will also make a redo function available that puts back the most recent object removed from electronic drawing 20 by the undo function. Further details of how the undo and redo functions operate within online meeting 24 are described below in connection with FIG. 3.
  • FIG. 3 is a block diagram illustrating an example undo stack 50 (see FIG. 2) and redo stack 52 in memory 46 while undo and redo functions are being performed either by each client device 12 or, as in some arrangements, online meeting server 22.
  • As objects are added to electronic drawing 20, client device 12(M) provides each object with an object identifier, e.g., object identifier 80 or object identifier 82. In the case illustrated in FIG. 3, object identifiers 80 and 82 are simply integers associated with an object. For example, online meeting server 22 may assign an object identifier to an object upon completion of an object creation gesture, e.g., when a user 14 lifts his or her finger from display 16. Upon the creation of an object with object identifier 80 in electronic drawing 20, client device 12(M) stores object identifier 80 in undo stack 50 in a last-in, first-out manner.
  • Assume that object identifier 80 corresponds to the most recently created object in electronic drawing 20. During an example operation, client 12(M) receives a gesture-based undo command 28(a) as described in detail above. In response to receiving undo command 28(a), client 12(M) moves object identifier 80 from undo stack 50 to redo stack 52 in a last-in, first-out manner. In response to receiving a second gesture-based undo command 28(b) before any other commands are received, client 12(M) moves the next object identifier 82 in undo stack 50 (i.e., corresponding to the next-most recent object) to redo stack 52 in the last-in, first-out manner.
  • It should be understood that, as each undo command 28(a) or 28(b) is processed by client 12(M), the objects corresponding to the object identifiers moved to redo stack 52 are removed from electronic drawing 20 and are no longer visible on displays 16.
  • Assume further that, sometime after receiving gesture-based undo command 28(b), client 12(M) receives a gesture-based redo command 28(c). Upon receipt of redo command 28(c), client 12(M) moves the most recent object identifier in redo stack 52—in this case, object identifier 82—back to undo stack 50 in the last-in, first-out manner. As redo command 28(c) is processed by client 12(M), the object corresponding to object identifier 82 will reappear in electronic drawing 20 and will be visible on displays 16 after the object and its identifier are sent to server 22.
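  • The stack bookkeeping described in connection with FIG. 3 can be summarized with a short sketch; the class and method names below are illustrative assumptions, and clearing the redo stack on a new object creation is a common convention rather than a behavior stated in this disclosure.

```python
class DrawingHistory:
    """Sketch of the last-in, first-out bookkeeping for undo and redo."""

    def __init__(self):
        self.undo_stack = []   # identifiers of objects currently in the drawing
        self.redo_stack = []   # identifiers of objects removed by undo

    def object_created(self, object_id):
        self.undo_stack.append(object_id)
        self.redo_stack.clear()              # assumption: a new object invalidates redo

    def undo(self):
        """Move the most recent identifier to the redo stack and return it."""
        if not self.undo_stack:
            return None
        object_id = self.undo_stack.pop()
        self.redo_stack.append(object_id)
        return object_id                     # caller removes this object from the drawing

    def redo(self):
        """Move the most recently undone identifier back and return it."""
        if not self.redo_stack:
            return None
        object_id = self.redo_stack.pop()
        self.undo_stack.append(object_id)
        return object_id                     # caller restores this object

# Mirroring the example above: identifier 80 is most recent, undo commands
# 28(a) and 28(b) remove objects 80 then 82, and redo command 28(c) restores 82.
history = DrawingHistory()
history.object_created(82)
history.object_created(80)
assert history.undo() == 80
assert history.undo() == 82
assert history.redo() == 82
```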
  • In some arrangements, when a client device 12(M) creates a shape, client device 12(M) places the shape in local undo stack 50, sends the shape and a shape identifier for that shape to server 22, and records the time of creation of the shape. Server 22 then stores the shape identifier in a list and broadcasts the shape to all connected clients 12. When client device 12(M) detects an undo gesture, it moves the shape from undo stack 50 to redo stack 52 and sends a delete command to server 22. Server 22 deletes the shape identifier from the list and forwards the delete command to all connected clients. When client 12(M) detects a redo gesture, client 12(M) moves the shape from redo stack 52 to undo stack 50 and sends the shape to server 22 along with the corresponding shape identifier and the recorded time of creation of the shape. Server 22 then places the shape identifier in the list and forwards the shape, its corresponding identifier, and its time of creation to clients 12. When receiving a shape, each client 12 places the shape in a list ordered by time of creation, which allows each client 12 to properly render a shape coming from a redo command.
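  • A rough sketch of that message flow appears below. The message formats, the names, and the in-process stand-in for communications medium 18 are assumptions made for illustration; a real deployment would exchange these messages over the network.

```python
import time

class Server:
    """Stand-in for online meeting server 22: keeps a list of live shape identifiers."""

    def __init__(self):
        self.shape_ids = []
        self.clients = []

    def handle(self, msg):
        if msg["type"] == "add":
            self.shape_ids.append(msg["id"])
        else:                                   # "delete"
            self.shape_ids.remove(msg["id"])
        for client in self.clients:             # forward to all connected clients
            client.on_message(msg)

class Client:
    """Stand-in for a client device 12."""

    def __init__(self, server):
        self.server = server
        server.clients.append(self)
        self.undo_stack, self.redo_stack = [], []
        self.shapes = []                        # render list, ordered by creation time

    def create_shape(self, shape_id, shape):
        record = {"type": "add", "id": shape_id, "shape": shape,
                  "created": time.time()}       # record the time of creation
        self.undo_stack.append(record)
        self.server.handle(record)

    def on_undo_gesture(self):
        record = self.undo_stack.pop()
        self.redo_stack.append(record)
        self.server.handle({"type": "delete", "id": record["id"]})

    def on_redo_gesture(self):
        record = self.redo_stack.pop()
        self.undo_stack.append(record)
        self.server.handle(record)              # resend shape, identifier, and creation time

    def on_message(self, msg):
        if msg["type"] == "add":
            self.shapes.append(msg)
            # ordering by creation time lets a redone shape reappear in its
            # original stacking order on every client
            self.shapes.sort(key=lambda m: m["created"])
        else:
            self.shapes = [m for m in self.shapes if m["id"] != msg["id"]]
```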
  • FIG. 4 is a block diagram illustrating criteria for determining whether a gesture corresponds to an undo or redo command. Consider an intended undo gesture 90 performed on display 16(M) by user 14(M). It is assumed that, as described above, the standard undo gesture corresponding to a waveform stored in waveform library 54 is a two-point linear swipe to the left, parallel to axis 92 of display 16(M).
  • It should be understood that, many times, user 14(M) will not be able to perfectly reproduce such a gesture when performing intended undo gesture 90. Rather, for example, intended undo gesture 90 may involve a two-point swipe to the left at an angle with respect to axis 92. The subsequent waveform generated from gesture 90 is, in some arrangements, configured to behave continuously with respect to swipe angle so that online meeting server 22 can determine whether the swipe angle is too large to be an intended undo function. To this effect, there is a threshold angle 96 beyond which online meeting server 22 will fail to recognize the waveform generated from gesture 90 as corresponding to an undo function.
  • In some arrangements, online meeting server 22 also imposes a minimum swipe length requirement on gesture 90 in order to be recognized as an undo function. To this effect, the subsequent waveform generated from gesture 90 is also configured to behave continuously with respect to swipe length 98. In this way, online meeting server 22 may determine whether swipe length is at least as long as a threshold swipe length 100; if so, online meeting server 22 may recognize gesture 90 as corresponding to the undo function.
  • Intended redo gestures 110 are treated similarly; an example standard redo gesture is a two-point linear swipe 112 parallel to axis 92 to the right.
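  • Putting those criteria together, the sketch below accepts a two-point swipe as an undo or redo only if its angle relative to axis 92 stays within a threshold (cf. threshold angle 96) and its length meets a minimum (cf. threshold swipe length 100). The left-undo/right-redo convention follows the example above, but the numeric thresholds are placeholders, not values from this disclosure.

```python
import math

MAX_ANGLE_DEG = 20.0    # placeholder for threshold angle 96
MIN_LENGTH_PX = 80.0    # placeholder for threshold swipe length 100

def classify_two_point_swipe(start, end):
    """start, end: averaged (x, y) positions of the two touch points."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < MIN_LENGTH_PX:
        return None                              # too short to be an intentional command
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))
    if angle > MAX_ANGLE_DEG:
        return None                              # too far off the display axis
    return "undo" if dx < 0 else "redo"          # leftward swipe = undo, rightward = redo
```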
  • FIG. 5 is a flow diagram illustrating a method 150 of conducting an online meeting, including steps 152, 154, and 156. In step 152, an electronic drawing, e.g., electronic drawing 20, which is shared among multiple user devices, e.g., user devices 12, is provided as part of an online meeting, e.g., online meeting 24, by a processor of an online meeting server, e.g., online meeting server 22, the electronic drawing including multiple objects, e.g., object 32. In step 154, a gesture-based undo command, e.g., gesture-based undo command 28(a), identifying an object to be removed from the electronic drawing is received from a particular user device, e.g., user device 12(M). In step 156, in response to the gesture-based undo command, the processor communicates with the multiple user devices to remove the object from the electronic drawing.
  • As used throughout this document, the words “comprising,” “including,” and “having” are intended to set forth certain items, steps, elements, or aspects of something in an open-ended fashion. The embodiments described above are provided by way of example only, and the invention is not limited to these particular embodiments. In addition, the word “set” as used herein indicates one or more of something, unless a statement is made to the contrary.
  • It should be understood that the improvement described here has a number of applications, including providing a technique for conducting an online meeting.
  • Having described certain embodiments, numerous alternative embodiments or variations can be made. For example, the above discussion dealt mainly with online meeting server 22 issuing undo commands in response to receiving gesture data. In some arrangements, however, online meeting server 22 performs the undo command to remove objects from electronic drawing 20.
  • Also, the improvements or portions thereof may be embodied as a non-transitory computer-readable storage medium, such as a magnetic disk, magnetic tape, compact disk, DVD, optical disk, flash memory, Application Specific Integrated Circuit (ASIC), Field Programmable Gate Array (FPGA), and the like. Multiple computer-readable media may be used. The medium (or media) may be encoded with instructions which, when executed on one or more computers or other processors, perform methods that implement the various processes described herein. Such medium (or media) may be considered an article of manufacture or a machine, and may be transportable from one machine to another.
  • Further, although features are shown and described with reference to particular embodiments hereof, such features may be included in any of the disclosed embodiments and their variants. Thus, it is understood that features disclosed in connection with any embodiment can be included as variants of any other embodiment, whether such inclusion is made explicit herein or not.
  • Those skilled in the art will therefore understand that various changes in form and detail may be made to the embodiments disclosed herein without departing from the scope of the invention.

Claims (20)

What is claimed is:
1. A method of presenting an electronic drawing in a collaborative environment over an electronic network, the electronic drawing including multiple objects, the method comprising:
sharing, by a processor of a particular user device within the collaborative environment, the electronic drawing among multiple user devices within the collaborative environment;
receiving, by the processor of the particular user device, a gesture-based undo command identifying an object to be removed from the electronic drawing; and
in response to the gesture-based undo command, communicating by the processor with the multiple user devices to remove the object from the electronic drawing.
2. A method as in claim 1,
wherein communicating with the multiple user devices to remove the object from the electronic drawing includes:
sending a delete command to remove the object from the electronic drawing.
3. A method as in claim 2,
wherein receiving the gesture-based undo command includes:
analyzing a set of touch points resulting from a particular gesture input on the particular user device, the touch points including samples of the particular gesture at equally spaced time intervals taken by the processor of the particular user device, and
verifying whether the set of touch points are indicative of a specified undo gesture that generates the gesture-based undo command.
4. A method as in claim 3,
wherein the specified undo gesture includes a multipoint linear swipe on the particular user device, the multi-point linear swipe being swept out in a particular direction along an axis of the particular user device; and
wherein verifying whether the set of touch points are indicative of a specified undo gesture that generates the gesture-based undo command includes:
generating a speed and direction of the multipoint linear swipe from a time series produced by the set of touch points.
5. A method as in claim 2, further comprising:
receiving, after receiving the gesture-based undo command, a gesture-based redo command, the gesture-based redo command being configured to replace an object removed by a previous gesture-based undo command; and
in response to the gesture-based redo command, communicating with the multiple user devices to restore the object to the electronic drawing.
6. A method as in claim 5,
wherein communicating with the multiple user devices to restore the object to the electronic drawing includes:
sending a draw stroke command to the multiple user devices that produces the object on each of the multiple user devices.
7. A method as in claim 5,
wherein receiving the gesture-based redo command includes:
analyzing a set of touch points resulting from a particular gesture input on the particular user device, the touch points including samples of the particular gesture at equally spaced time intervals taken by the processor of the particular user device, and
verifying whether the set of touch points are indicative of a specified redo gesture that generates the gesture-based redo command.
8. A method as in claim 5,
wherein a specified redo gesture includes a multipoint linear swipe on the particular user device, the multi-point linear swipe being swept out in a direction along the axis of the particular user device substantially opposite to the multipoint linear swipe of the specified undo gesture; and
wherein verifying whether the set of touch points are indicative of the specified redo gesture includes:
generating a speed and direction of the multipoint linear swipe of the specified redo gesture from a time series produced by the set of touch points.
9. A method as in claim 2,
wherein communicating with the multiple user devices includes:
transmitting the delete command to a central server which in turn transmits the delete command to the multiple user devices.
10. An electronic apparatus constructed and arranged to present an electronic drawing in a collaborative environment over an electronic network, the electronic drawing including multiple objects, the apparatus comprising:
a network interface;
memory; and
a controller including controlling circuitry, the controlling circuitry being constructed and arranged to:
share, by a processor of a particular user device within the collaborative environment, the electronic drawing among multiple user devices within the collaborative environment;
receive, by the processor of the particular user device, a gesture-based undo command identifying an object to be removed from the electronic drawing; and
in response to the gesture-based undo command, communicate by the processor with the multiple user devices to remove the object from the electronic drawing.
11. An apparatus as in claim 10,
wherein the controlling circuitry constructed and arranged to communicate with the multiple user devices to remove the object from the electronic drawing is further constructed and arranged to:
send a delete command to remove the object from the electronic drawing.
12. An apparatus as in claim 11,
wherein the controlling circuitry constructed and arranged to receive the gesture-based undo command is further constructed and arranged to:
analyze a set of touch points resulting from a particular gesture input on the particular user device, the touch points including samples of the particular gesture at equally spaced time intervals taken by the processor of the particular user device, and
verify whether the set of touch points are indicative of a specified undo gesture that generates the gesture-based undo command.
13. An apparatus as in claim 12,
wherein the specified undo gesture includes a multipoint linear swipe on the particular user device, the multi-point linear swipe being swept out in a particular direction along an axis of the particular user device; and
wherein the controlling circuitry constructed and arranged to verify whether the set of touch points are indicative of a specified undo gesture that generates the gesture-based undo command is further constructed and arranged to:
generate a speed and direction of the multipoint linear swipe from a time series produced by the set of touch points.
14. An apparatus as in claim 11, wherein the controlling circuitry is further constructed and arranged to:
receive after receiving the gesture-based undo command, a gesture-based redo command, the gesture-based redo command being configured to replace an object removed by a previous gesture-based undo command; and
in response to the gesture-based redo command, communicate with the multiple user devices to restore the object to the electronic drawing.
15. An apparatus as in claim 14,
wherein the controlling circuitry constructed and arranged to communicate with the multiple user devices to restore the object to the electronic drawing is further constructed and arranged to:
send a draw stroke command to the multiple user devices that produces the object on each of the multiple user devices.
16. An apparatus as in claim 14,
wherein the controlling circuitry constructed and arranged to receive the gesture-based redo command is further constructed and arranged to:
analyze a set of touch points resulting from a particular gesture input on the particular user device, the touch points including samples of the particular gesture at equally spaced time intervals taken by the processor of the particular user device, and
verify whether the set of touch points are indicative of a specified redo gesture that generates the gesture-based redo command.
17. An apparatus as in claim 14,
wherein a specified redo gesture includes a multipoint linear swipe on the particular user device, the multi-point linear swipe being swept out in a direction along the axis of the particular user device substantially opposite to the multipoint linear swipe of the specified undo gesture; and
wherein the controlling circuitry constructed and arranged to verify whether the set of touch points are indicative of the specified redo gesture is further constructed and arranged to:
generate a speed and direction of the multipoint linear swipe of the specified redo gesture from a time series produced by the set of touch points.
18. An apparatus as in claim 11,
wherein the controlling circuitry constructed and arranged to communicate with the multiple user devices to remove the object from the electronic drawing is further constructed and arranged to:
transmit the delete command to a central server which in turn transmits the delete command to the multiple user devices.
19. A computer program product having a non-transitory, computer-readable storage medium which stores instructions that, when executed by a controller, cause the controller to carry out a method of presenting an electronic drawing in a collaborative environment over an electronic network, the electronic drawing including multiple objects, the method comprising:
sharing, by a processor of a particular user device within the collaborative environment, the electronic drawing among multiple user devices within the collaborative environment;
receiving, by the processor of the particular user device, a gesture-based undo command identifying an object to be removed from the electronic drawing; and
in response to the gesture-based undo command, communicating by the processor with the multiple user devices to remove the object from the electronic drawing.
20. A computer program product as in claim 19,
wherein communicating with the multiple user devices to remove the object from the electronic drawing includes:
sending a delete command to remove the object from the electronic drawing.
US14/091,944 2013-08-22 2013-11-27 Sharing electronic drawings in collaborative environments Abandoned US20150058753A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/091,944 US20150058753A1 (en) 2013-08-22 2013-11-27 Sharing electronic drawings in collaborative environments
PCT/US2014/048851 WO2015026497A1 (en) 2013-08-22 2014-07-30 Method and apparatus for collaborative electronic drawing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361869038P 2013-08-22 2013-08-22
US14/091,944 US20150058753A1 (en) 2013-08-22 2013-11-27 Sharing electronic drawings in collaborative environments

Publications (1)

Publication Number Publication Date
US20150058753A1 true US20150058753A1 (en) 2015-02-26

Family

ID=52481543

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/091,944 Abandoned US20150058753A1 (en) 2013-08-22 2013-11-27 Sharing electronic drawings in collaborative environments
US14/097,320 Active 2034-09-08 US9354707B2 (en) 2013-08-22 2013-12-05 Combination color and pen palette for electronic drawings

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/097,320 Active 2034-09-08 US9354707B2 (en) 2013-08-22 2013-12-05 Combination color and pen palette for electronic drawings

Country Status (2)

Country Link
US (2) US20150058753A1 (en)
WO (2) WO2015026497A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150067542A1 (en) * 2013-08-30 2015-03-05 Citrix Systems, Inc. Gui window with portal region for interacting with hidden interface elements
US20160173543A1 (en) * 2014-12-11 2016-06-16 LiveLoop, Inc. Method and system for concurrent collaborative undo operations in computer application software
US20170255378A1 (en) * 2016-03-02 2017-09-07 Airwatch, Llc Systems and methods for performing erasures within a graphical user interface
US20180074775A1 (en) * 2016-06-06 2018-03-15 Quirklogic, Inc. Method and system for restoring an action between multiple devices
US20180356956A1 (en) * 2017-06-12 2018-12-13 Google Inc. Intelligent command batching in an augmented and/or virtual reality environment
DE102017113763A1 (en) * 2017-06-21 2018-12-27 SMR Patents S.à.r.l. Method for operating a display device for a motor vehicle and motor vehicle
US10324618B1 (en) * 2016-01-05 2019-06-18 Quirklogic, Inc. System and method for formatting and manipulating digital ink
US10755029B1 (en) 2016-01-05 2020-08-25 Quirklogic, Inc. Evaluating and formatting handwritten input in a cell of a virtual canvas
US11444982B1 (en) * 2020-12-31 2022-09-13 Benjamin Slotznick Method and apparatus for repositioning meeting participants within a gallery view in an online meeting user interface based on gestures made by the meeting participants
US11546385B1 (en) 2020-12-31 2023-01-03 Benjamin Slotznick Method and apparatus for self-selection by participant to display a mirrored or unmirrored video feed of the participant in a videoconferencing platform
US11550399B2 (en) * 2016-03-29 2023-01-10 Microsoft Technology Licensing, Llc Sharing across environments
US11621979B1 (en) 2020-12-31 2023-04-04 Benjamin Slotznick Method and apparatus for repositioning meeting participants within a virtual space view in an online meeting user interface based on gestures made by the meeting participants

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10528249B2 (en) * 2014-05-23 2020-01-07 Samsung Electronics Co., Ltd. Method and device for reproducing partial handwritten content
JPWO2017072913A1 (en) * 2015-10-29 2018-05-24 Necディスプレイソリューションズ株式会社 Control method, electronic blackboard system, display device, and program
US10871880B2 (en) * 2016-11-04 2020-12-22 Microsoft Technology Licensing, Llc Action-enabled inking tools
US10739988B2 (en) * 2016-11-04 2020-08-11 Microsoft Technology Licensing, Llc Personalized persistent collection of customized inking tools
CN107123152B (en) * 2017-04-06 2023-01-06 腾讯科技(深圳)有限公司 Editing processing method and device

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100245563A1 (en) * 2009-03-31 2010-09-30 Fuji Xerox Co., Ltd. System and method for facilitating the use of whiteboards
US20120013556A1 (en) * 2010-07-16 2012-01-19 Edamak Corporation Gesture detecting method based on proximity-sensing
US20130002648A1 (en) * 2011-06-29 2013-01-03 Google Inc. Managing Satellite and Aerial Image Data in a Composite Document
US20130055140A1 (en) * 2011-08-30 2013-02-28 Luis Daniel Mosquera System and method for navigation in an electronic document
US20130069882A1 (en) * 2011-09-16 2013-03-21 Research In Motion Limited Electronic device and method of character selection
US20130086487A1 (en) * 2011-10-04 2013-04-04 Roland Findlay Meeting system that interconnects group and personal devices across a network
US20130100036A1 (en) * 2011-10-19 2013-04-25 Matthew Nicholas Papakipos Composite Touch Gesture Control with Touch Screen Input Device and Secondary Touch Input Device
US20130241847A1 (en) * 1998-01-26 2013-09-19 Joshua H. Shaffer Gesturing with a multipoint sensing device
US20130285927A1 (en) * 2012-04-30 2013-10-31 Research In Motion Limited Touchscreen keyboard with correction of previously input text
US8584049B1 (en) * 2012-10-16 2013-11-12 Google Inc. Visual feedback deletion
US20140022338A1 (en) * 2010-12-20 2014-01-23 St-Ericsson Sa Method for Producing a Panoramic Image on the Basis of a Video Sequence and Implementation Apparatus
US20140108989A1 (en) * 2012-10-16 2014-04-17 Google Inc. Character deletion during keyboard gesture

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6337698B1 (en) * 1998-11-20 2002-01-08 Microsoft Corporation Pen-based interface for a notepad computer
US7503493B2 (en) * 1999-10-25 2009-03-17 Silverbrook Research Pty Ltd Method and system for digitizing freehand graphics with user-selected properties
US6674459B2 (en) 2001-10-24 2004-01-06 Microsoft Corporation Network conference recording system and method including post-conference processing
US6970712B1 (en) 2001-12-13 2005-11-29 At&T Corp Real time replay service for communications network
JP3861690B2 (en) * 2002-01-07 2006-12-20 ソニー株式会社 Image editing apparatus, image editing method, storage medium, and computer program
US7356563B1 (en) * 2002-06-06 2008-04-08 Microsoft Corporation Methods of annotating a collaborative application display
US7663605B2 (en) * 2003-01-08 2010-02-16 Autodesk, Inc. Biomechanical user interface elements for pen-based computers
JP4756914B2 (en) * 2005-05-30 2011-08-24 キヤノン株式会社 Remote cooperative work support system and control method thereof
WO2008095226A1 (en) * 2007-02-08 2008-08-14 Silverbrook Research Pty Ltd Bar code reading method
US9513716B2 (en) * 2010-03-04 2016-12-06 Autodesk, Inc. Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
US8676552B2 (en) * 2011-02-16 2014-03-18 Adobe Systems Incorporated Methods and apparatus for simulation of fluid motion using procedural shape growth
US9250768B2 (en) * 2012-02-13 2016-02-02 Samsung Electronics Co., Ltd. Tablet having user interface
US20130239051A1 (en) * 2012-03-06 2013-09-12 Apple Inc. Non-destructive editing for a media editing application
US20140040789A1 (en) * 2012-05-08 2014-02-06 Adobe Systems Incorporated Tool configuration history in a user interface
US20140049479A1 (en) 2012-08-17 2014-02-20 Research In Motion Limited Smudge effect for electronic drawing application
US9310998B2 (en) * 2012-12-27 2016-04-12 Kabushiki Kaisha Toshiba Electronic device, display method, and display program
WO2014147716A1 (en) * 2013-03-18 2014-09-25 株式会社 東芝 Electronic device and handwritten document processing method

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130241847A1 (en) * 1998-01-26 2013-09-19 Joshua H. Shaffer Gesturing with a multipoint sensing device
US20100245563A1 (en) * 2009-03-31 2010-09-30 Fuji Xerox Co., Ltd. System and method for facilitating the use of whiteboards
US20120013556A1 (en) * 2010-07-16 2012-01-19 Edamak Corporation Gesture detecting method based on proximity-sensing
US20140022338A1 (en) * 2010-12-20 2014-01-23 St-Ericsson Sa Method for Producing a Panoramic Image on the Basis of a Video Sequence and Implementation Apparatus
US20130002648A1 (en) * 2011-06-29 2013-01-03 Google Inc. Managing Satellite and Aerial Image Data in a Composite Document
US20130055140A1 (en) * 2011-08-30 2013-02-28 Luis Daniel Mosquera System and method for navigation in an electronic document
US20130069882A1 (en) * 2011-09-16 2013-03-21 Research In Motion Limited Electronic device and method of character selection
US20130086487A1 (en) * 2011-10-04 2013-04-04 Roland Findlay Meeting system that interconnects group and personal devices across a network
US20130100036A1 (en) * 2011-10-19 2013-04-25 Matthew Nicholas Papakipos Composite Touch Gesture Control with Touch Screen Input Device and Secondary Touch Input Device
US20130285927A1 (en) * 2012-04-30 2013-10-31 Research In Motion Limited Touchscreen keyboard with correction of previously input text
US8584049B1 (en) * 2012-10-16 2013-11-12 Google Inc. Visual feedback deletion
US20140108989A1 (en) * 2012-10-16 2014-04-17 Google Inc. Character deletion during keyboard gesture

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Intuos, "What's New in Photoshop CS6," <URL="http://www.ghacks.net/2011/12/07/displayopentabscountinfirefox/"> *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150067542A1 (en) * 2013-08-30 2015-03-05 Citrix Systems, Inc. Gui window with portal region for interacting with hidden interface elements
US9377925B2 (en) * 2013-08-30 2016-06-28 Citrix Systems, Inc. GUI window with portal region for interacting with hidden interface elements
US20160173543A1 (en) * 2014-12-11 2016-06-16 LiveLoop, Inc. Method and system for concurrent collaborative undo operations in computer application software
US10063603B2 (en) * 2014-12-11 2018-08-28 Liveloop, Inc Method and system for concurrent collaborative undo operations in computer application software
US10324618B1 (en) * 2016-01-05 2019-06-18 Quirklogic, Inc. System and method for formatting and manipulating digital ink
US10755029B1 (en) 2016-01-05 2020-08-25 Quirklogic, Inc. Evaluating and formatting handwritten input in a cell of a virtual canvas
US20170255378A1 (en) * 2016-03-02 2017-09-07 Airwatch, Llc Systems and methods for performing erasures within a graphical user interface
US10942642B2 (en) * 2016-03-02 2021-03-09 Airwatch Llc Systems and methods for performing erasures within a graphical user interface
US11550399B2 (en) * 2016-03-29 2023-01-10 Microsoft Technology Licensing, Llc Sharing across environments
US20180074775A1 (en) * 2016-06-06 2018-03-15 Quirklogic, Inc. Method and system for restoring an action between multiple devices
US10698561B2 (en) * 2017-06-12 2020-06-30 Google Llc Intelligent command batching in an augmented and/or virtual reality environment
CN110520826A * 2017-06-12 2019-11-29 谷歌有限责任公司 Intelligent command batching in an augmented and/or virtual reality environment
US10976890B2 (en) 2017-06-12 2021-04-13 Google Llc Intelligent command batching in an augmented and/or virtual reality environment
US20180356956A1 (en) * 2017-06-12 2018-12-13 Google Inc. Intelligent command batching in an augmented and/or virtual reality environment
DE102017113763A1 (en) * 2017-06-21 2018-12-27 SMR Patents S.à.r.l. Method for operating a display device for a motor vehicle and motor vehicle
DE102017113763B4 (en) 2017-06-21 2022-03-17 SMR Patents S.à.r.l. Method for operating a display device for a motor vehicle and motor vehicle
US11444982B1 (en) * 2020-12-31 2022-09-13 Benjamin Slotznick Method and apparatus for repositioning meeting participants within a gallery view in an online meeting user interface based on gestures made by the meeting participants
US11546385B1 (en) 2020-12-31 2023-01-03 Benjamin Slotznick Method and apparatus for self-selection by participant to display a mirrored or unmirrored video feed of the participant in a videoconferencing platform
US11595448B1 (en) 2020-12-31 2023-02-28 Benjamin Slotznick Method and apparatus for automatically creating mirrored views of the video feed of meeting participants in breakout rooms or conversation groups during a videoconferencing session
US11621979B1 (en) 2020-12-31 2023-04-04 Benjamin Slotznick Method and apparatus for repositioning meeting participants within a virtual space view in an online meeting user interface based on gestures made by the meeting participants

Also Published As

Publication number Publication date
US9354707B2 (en) 2016-05-31
WO2015026497A1 (en) 2015-02-26
WO2015026498A1 (en) 2015-02-26
US20150058807A1 (en) 2015-02-26

Similar Documents

Publication Publication Date Title
US20150058753A1 (en) Sharing electronic drawings in collaborative environments
US10567481B2 (en) Work environment for information sharing and collaboration
US10795529B2 (en) Permitting participant configurable view selection within a screen sharing session
EP2926235B1 (en) Interactive whiteboard sharing
EP3014408B1 (en) Showing interactions as they occur on a whiteboard
US20220214784A1 (en) Systems and methods for a touchscreen user interface for a collaborative editing tool
Liu et al. Two‐finger gestures for 6DOF manipulation of 3D objects
CN108112270B (en) Providing collaborative communication tools within a document editor
US10359905B2 (en) Collaboration with 3D data visualizations
US9798944B2 (en) Dynamically enabling an interactive element within a non-interactive view of a screen sharing session
US20160034058A1 (en) Mobile Device Input Controller For Secondary Display
CN107735758B (en) Synchronized digital ink stroke presentation
US20130278507A1 (en) Multi-touch multi-user gestures on a multi-touch display
US20140157128A1 (en) Systems and methods for processing simultaneously received user inputs
US10848494B2 (en) Compliance boundaries for multi-tenant cloud environment
WO2017160603A1 (en) File workflow board
US10540070B2 (en) Method for tracking displays during a collaboration session and interactive board employing same
US9927892B2 (en) Multiple touch selection control
US20170329793A1 (en) Dynamic contact suggestions based on contextual relevance
WO2017176535A1 (en) Scenario based pinning in a collaboration environment
CN109413400A (en) A kind of projection process method and device
JP6488425B2 (en) Electronic conference management apparatus and electronic conference management method
JP2020149634A (en) Image processing apparatus, method, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: CITRIX SYSTEMS, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ANDERSON, MATTHEW;MAYOT, FREDERIC;SIGNING DATES FROM 20131121 TO 20131125;REEL/FRAME:032080/0032

AS Assignment

Owner name: GETGO, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CITRIX SYSTEMS, INC.;REEL/FRAME:039970/0670

Effective date: 20160901

AS Assignment

Owner name: JPMORGAN CHASE BANK, N.A., NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:GETGO, INC.;REEL/FRAME:041588/0143

Effective date: 20170201

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: LOGMEIN, INC., MASSACHUSETTS

Free format text: RELEASE OF SECURITY INTEREST RECORDED AT REEL/FRAME 041588/0143;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:053650/0978

Effective date: 20200831

Owner name: GETGO, INC., FLORIDA

Free format text: RELEASE OF SECURITY INTEREST RECORDED AT REEL/FRAME 041588/0143;ASSIGNOR:JPMORGAN CHASE BANK, N.A.;REEL/FRAME:053650/0978

Effective date: 20200831