WO2019175237A1 - Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace - Google Patents

Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace

Info

Publication number
WO2019175237A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote
computing device
remote participant
file
local
Prior art date
Application number
PCT/EP2019/056276
Other languages
English (en)
French (fr)
Inventor
Marco Valerio Masi
Cristiano Fumagalli
Original Assignee
Re Mago Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/923,943 external-priority patent/US10931733B2/en
Application filed by Re Mago Holding Ltd filed Critical Re Mago Holding Ltd
Priority to BR112020018877-8A priority Critical patent/BR112020018877A2/pt
Priority to RU2020133478A priority patent/RU2020133478A/ru
Priority to KR1020207029555A priority patent/KR20200131881A/ko
Priority to CN201980018738.1A priority patent/CN112106044A/zh
Priority to JP2020547345A priority patent/JP2021517302A/ja
Priority to EP19712894.5A priority patent/EP3765973A1/de
Publication of WO2019175237A1 publication Critical patent/WO2019175237A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10File systems; File servers
    • G06F16/18File system types
    • G06F16/188Virtual file systems
    • G06F16/192Implementing virtual folder structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/103Workflow collaboration or project management

Definitions

  • Fig. 1 illustrates an example of the existing architecture of systems which make use of coupled hardware devices for user input.
  • the operating system 100A of Fig. 1 includes executing applications 101A and 102A, each of which has its own API, 101B and 102B, respectively.
  • the operating system 100A also has its own API 100B, as well as specialized drivers 100C, 101C, and 102C, configured to interface with hardware devices 100D, 101D, and 102D.
  • application API 101B is configured to interface with driver 101C which itself interfaces with hardware device 101D.
  • application API 102B is configured to interface with driver 102C which itself interfaces with hardware device 102D.
  • Fig. 7 illustrates a tool interface that can be part of the transparent layer according to an exemplary embodiment.
  • Fig. 14 illustrates another example of a transparent layer command determined based on one or more words identified in input voice data according to an exemplary embodiment.
  • Fig. 16 illustrates an example interface for adding new commands corresponding to user input according to an exemplary embodiment.
  • Fig. 19 illustrates a general settings interface that allows a user to customize various aspects of the interface, toggle input modes, and make other changes according to an exemplary embodiment.
  • Fig. 22 illustrates multiple representations of a collaboration workspace according to an exemplary embodiment.
  • Fig. 23 illustrates a flowchart for generating one or more live folders
  • Fig. 24 illustrates an example of a querying process according to an exemplary embodiment.
  • Fig. 25 illustrates an example of generating one or more local folders
  • Fig. 27 illustrates an example of the drag and drop process and detection according to an exemplary embodiment.
  • the transparent layer 203 is an API configured to interface between a virtual driver and an operating system and/or application(s) executing on the operating system.
  • the transparent layer 203 interfaces between the virtual driver 204 and API 201B of application 201A, API 202B of application 202A, and operating system API 200B of operating system 200A.
  • the virtual driver 204 is configured to emulate drivers 205A and 205B, which interface with hardware devices 206A and 206B, respectively.
  • the virtual driver can receive user input that instructs the virtual driver on which driver to emulate, for example, in the form of a voice command, a selection made on a user interface, and/or a gesture made by the user in front of a coupled web camera.
  • each of the connected hardware devices can operate in a "listening" mode and each of the emulated drivers in the virtual driver 204 can be configured to detect an initialization signal which serves as a signal to the virtual driver to switch to a particular emulation mode.
  • a user stating "start voice commands" can activate the driver corresponding to a microphone to receive a new voice command.
  • the user input can be converted into a transparent layer command and transmitted to the transparent layer 203 for execution.
  • the transparent layer command can include native commands in the identified context. For example, if the identified context is application 201A, then the native commands would be in a format that is compatible with application API 201B of application 201A. Execution of the transparent layer command can then be configured to cause execution of one or more native commands in the identified context.
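The patent does not specify a concrete data format for transparent layer commands, but a minimal TypeScript sketch of the structure described above might look as follows; the interface names, field names, and context identifiers are illustrative assumptions, not the patent's actual format.

```typescript
// Illustrative sketch only: the data format is not specified in the source,
// so the interface and field names below are assumptions.
type ContextId = "operating_system" | "application_201A" | "application_202A";

interface NativeCommand {
  context: ContextId;   // which API the command targets
  command: string;      // command in a format compatible with that API
  args?: string[];
}

interface TransparentLayerCommand {
  nativeCommands: NativeCommand[];  // executed in the identified context(s)
  responseOutput?: string;          // optional response routed back to the virtual driver
}

// Example: user input identified in the context of application 201A
const example: TransparentLayerCommand = {
  nativeCommands: [{ context: "application_201A", command: "select_object" }],
};
```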
  • the architecture shown in Fig. 2 is for the purpose of explanation only, and it is understood that the number of applications executing, the number and type of connected hardware devices, the number of drivers, and the emulated drivers can vary.
  • a user input is determined based at least in part on information captured by one or more hardware devices communicatively coupled to the system.
  • the system can refer to one or more computing devices executing the steps of the method, an apparatus comprising one or more processors and one or more memories executing the steps of the method, or any other computing system.
  • the user input can be determined by a virtual driver executing on the system.
  • the virtual driver can be operating in an emulation mode in which it emulates other hardware drivers and thereby receives the captured information from a hardware device, or it can optionally receive the captured information from one or more other hardware drivers which are configured to interface with a particular hardware device.
  • the one or more hardware devices can include, for example, devices containing accelerometers and/or tilt sensors, a remote, a stylus, or any combination of these devices.
  • an object in the one or more images is recognized.
  • the object can be, for example, a hand, finger, or other body part of a user.
  • the object can also be a special purpose device, such as a stylus or pen, or a special-purpose hardware device, such as a motion tracking stylus/remote which is communicatively coupled to the system and which contains accelerometers and/or tilt sensors.
  • the object recognition can be performed by the virtual driver and can be based upon earlier training, such as through a calibration routine run using the object.
  • Fig. 5A illustrates an example of object recognition according to an exemplary embodiment. As shown in Fig. 5A, image 501 includes a hand of the user that has been recognized as object 502. The recognition algorithm could of course be configured to recognize a different object, such as a finger.
  • one or more orientations and one or more positions of the recognized object are determined. This can be accomplished in a variety of ways. If the object is not a hardware device and is instead a body part, such as a hand or finger, the object can be mapped in a three-dimensional coordinate system using a known location of the camera as a reference point to determine the three dimensional coordinates of the object and the various angles relative to the X, Y, and Z axes. If the object is a hardware device and includes motion tracking hardware such as an accelerometer and/or tilt sensors, then the image information can be used in conjunction with the information indicated by the accelerometer and/or tilt sensors to determine the positions and orientations of the object.
  • the actual transparent layer command that is generated based on this input can be based upon user settings and/or an identified context.
  • the command can be a touch command indicating that an object at the coordinates of point 505 should be selected and/or opened.
  • the command can also be a pointing command indicating that a pointer (such as a mouse pointer) should be moved to the coordinates of point 505.
  • the command can be an edit command which modifies the graphical output at the location (such as to annotate the interface or draw an element).
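A minimal sketch of how input coordinates might be converted into a touch, pointer, or edit command depending on the active mode, as described above; the mode names and command shape are assumptions made for illustration.

```typescript
// Illustrative sketch: mapping input coordinates to a command type based on a
// user-selected mode. Mode names and the command shape are assumptions.
type InputMode = "touch" | "pointer" | "edit";

interface CoordinateCommand {
  type: InputMode;
  x: number;
  y: number;
}

function toCoordinateCommand(mode: InputMode, x: number, y: number): CoordinateCommand {
  // In "touch" mode the object at (x, y) is selected/opened, in "pointer" mode the
  // mouse pointer is moved to (x, y), and in "edit" mode the graphical output at
  // (x, y) is modified (e.g., an annotation is drawn).
  return { type: mode, x, y };
}

const cmd = toCoordinateCommand("touch", 320, 240);
```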
  • Button 701A allows a user to select the type of drawing tool used to graphically modify the user interface when the user input is input coordinates (such as coordinates based upon a user touching the screen with their hand or a stylus/remote).
  • the various drawing tools can include different brushes, colors, pens, highlighters, etc. These tools can result in graphical alterations of varying styles, thicknesses, colors, etc.
  • Button 701E can be used to capture a screenshot of the user interface and to export the screenshot as an image. This can be used in conjunction with the drawing mode of button 701B and the drawing tools of 701A. After a user has marked up a particular user interface, the marked-up version can be exported as an image.
  • Button 701H can be used to launch a whiteboard application that allows a user to create a drawing or write using draw mode on a virtual whiteboard.
  • Button 701J can be used to open or close the tool interface 701. When closed, the tool interface can be minimized or removed entirely from the underlying user interface.
  • Fig. 9 illustrates a flowchart for identifying a context corresponding to the user input according to an exemplary embodiment.
  • operating system data 901, application data 902, and user input data 903 can all be used to determine a context 904.
  • Fig. 11 illustrates a flowchart for converting user input into transparent layer commands.
  • the transparent layer command can be determined based at least in part on the identified context 1102 and the user input 1103.
  • the transparent layer command can include one or more native commands configured to execute in one or more corresponding contexts.
  • the transparent layer command can also include response outputs to be transmitted to the virtual driver and on to hardware device(s).
  • the user input 1103 also determines the transparent layer command since user inputs are specifically mapped to certain native commands within one or more contexts and these native commands are part of the transparent layer command. For example, a voice command "Open email" can be mapped to a specific operating system native command to launch the email application Outlook. When voice input is received that includes the recognized words "Open email," this results in a transparent layer command being determined which includes the native command to launch Outlook.
  • converting the user input into one or more transparent layer commands based at least in part on the identified context can include determining a transparent layer command based at least in part on the identified one or more words and the identified context.
  • the transparent layer command can include at least one native command in the identified context, the at least one native command being configured to perform an action associated with the identified one or more words in the identified context.
  • Fig. 14 illustrates another example of a transparent layer command 1400 determined based on one or more words identified in input voice data according to an exemplary embodiment.
  • the one or more words are "open email."
  • the transparent layer command 1400 includes the native command "outlook.exe," which is an instruction to run a specific executable file that launches the Outlook application.
  • Transparent layer command 1400 also includes a voice response "email opened" which will be output in response to receiving the voice command.
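A minimal sketch of the kind of mapping the "open email" example implies, pairing recognized words with a native command and a voice response; the map structure and function names are assumptions, not the patent's actual implementation.

```typescript
// Illustrative sketch of a voice-command-to-native-command mapping, modeled on the
// "open email" example. Structure and names are assumptions.
interface VoiceMapping {
  nativeCommand: string;   // e.g., an executable to run in the operating system context
  voiceResponse: string;   // response output after the command is executed
}

const voiceCommandMap: Record<string, VoiceMapping> = {
  "open email": { nativeCommand: "outlook.exe", voiceResponse: "email opened" },
};

function handleRecognizedWords(words: string): VoiceMapping | undefined {
  return voiceCommandMap[words.toLowerCase()];
}
```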
  • Fig. 15 illustrates a flowchart for executing the one or more transparent layer commands on the transparent layer according to an exemplary embodiment.
  • At step 1501, at least one native command in the transparent layer command is identified.
  • the native command can be, for example, designated as a native command within the structure of the transparent layer command, allowing for identification.
  • a response can be transmitted to hardware device(s). As discussed earlier, this response can be routed from the transparent layer to the virtual driver and on to the hardware device.
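A minimal sketch of the execution flow just described: the native command(s) are identified within the transparent layer command, executed in their context, and a response is routed back toward the virtual driver. The helper functions executeInContext and sendToVirtualDriver are hypothetical placeholders for whatever mechanism an implementation would actually use.

```typescript
// Illustrative sketch of executing a transparent layer command; all names are assumptions.
interface TLCommand {
  nativeCommands: { context: string; command: string }[];
  response?: string;
}

function executeTransparentLayerCommand(
  cmd: TLCommand,
  executeInContext: (context: string, command: string) => void,
  sendToVirtualDriver: (response: string) => void,
): void {
  for (const native of cmd.nativeCommands) {
    // identify each native command and execute it in its corresponding context
    executeInContext(native.context, native.command);
  }
  if (cmd.response) {
    // route the response back to the virtual driver and on to the hardware device
    sendToVirtualDriver(cmd.response);
  }
}
```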
  • Figs. 16-19 illustrate additional features of the system disclosed herein.
  • Fig. 16 illustrates an example interface for adding new commands corresponding to user input according to an exemplary embodiment.
  • the dashboard in interface 1600 includes icons of applications 1601 which have already been added and can be launched using predetermined user inputs and hardware devices (e.g., voice commands).
  • the dashboard can also show other commands that are application-specific and that are mapped to certain user inputs.
  • Selection of addition button 1602 opens the add command menu 1603.
  • Item type: Fixed Item (added to the bottom bar menu) or Normal Item (added to a drag menu)
  • Icon: select the image icon
  • Background: select the background icon color
  • Color: select the icon color
  • Name: set the new item name
  • Voice command: set the voice activation command used to open the new application
  • Feedback response: set the application voice response feedback
  • Command: select the application type or custom command type to launch (e.g., launch application command, perform action within application command, close application command, etc.)
  • Process Start: if launching a new process or application, the name of the process or application
  • Parameter: any parameters to pass into the new process or application
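A minimal sketch of the add-command fields above as a configuration object; the TypeScript types and example values are assumptions made for illustration.

```typescript
// Illustrative sketch: the add-command menu fields as a configuration object.
// Field names mirror the menu items above; types and example values are assumptions.
interface NewCommandConfig {
  itemType: "fixed" | "normal";   // bottom bar item vs. drag menu item
  icon: string;                   // image icon
  background: string;             // background icon color
  color: string;                  // icon color
  name: string;                   // new item name
  voiceCommand: string;           // voice activation command
  feedbackResponse: string;       // application voice response feedback
  command: "launch_application" | "perform_action" | "close_application";
  processStart?: string;          // process or application to launch
  parameters?: string[];          // parameters passed to the new process or application
}

const openEmail: NewCommandConfig = {
  itemType: "fixed",
  icon: "mail.png",
  background: "#ffffff",
  color: "#0078d4",
  name: "Email",
  voiceCommand: "open email",
  feedbackResponse: "email opened",
  command: "launch_application",
  processStart: "outlook.exe",
};
```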
  • the system disclosed herein can be implemented on multiple networked computing devices and used as an aid in conducting networked collaboration sessions.
  • the whiteboard functionality described earlier can be a shared whiteboard between multiple users on multiple computing devices.
  • a representation of a collaboration workspace hosted on a server and accessible to a plurality of participants over a web socket connection is transmitted on a user interface of a local computing device.
  • the representation of the collaboration workspace can include one or more remote participant objects corresponding to one or more remote computing devices connected to the server.
  • remote computing devices and remote participants refer to computing devices and participants other than the local participant and the local computing device.
  • Remote computing devices are separated from the local device by a network, such as a wide area network (WAN).
  • the remote participant objects indicate a remote participant and can take many forms.
  • a remote participant object can be an embedded video stream of a remote participant that is connected via videoconference or webcam.
  • the remote participant object can also be an icon representing the remote participant, an avatar of the remote participant, or any other visual or audio indicator of a particular remote participant.
  • the remote participant objects can be custom objects that can be dragged, moved and/or resized within the representation of the workspace.
  • one or more live folders corresponding to the one or more remote participant objects are generated on the local computing device, with each live folder being mapped to a network address of a remote computing device corresponding to the remote participant object.
  • Fig. 23 illustrates a flowchart for generating one or more live folders
  • the server transmits the IP addresses (or other types of network addresses) of the other connected computing devices 2402 and 2403 to computing device 2401A (the requesting computing device).
  • the server can also transmit information that allows computing device 2401A to identify which IP address corresponds to which remote participant object, such as a user identification or other information.
  • the local computing device generates one or more local folders corresponding to the one or more remote participant objects.
  • the one or more local folders can be generated and stored on a memory of the local computing device.
  • a temporary cache can be created on the local computing device when the collaboration workspace session is initiated.
  • This temporary cache stores information about the session, such as the meeting identifier and other session details.
  • the one or more local folders can be generated and stored within the temporary cache.
  • a custom data structure can be created that associates each remote participant object with a local folder.
  • the local folder can be incorporated as an invisible element into the user interface 2501 and placed at the same position of the corresponding remote participant object.
  • the remote participant object can have an exposed API that allows for linking to a corresponding local folder.
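A minimal sketch of a data structure that associates each remote participant object with a local folder and, once received from the server, the remote network address; all names are illustrative assumptions.

```typescript
// Illustrative sketch: associating remote participant objects with local folders.
// Field names and paths are assumptions.
interface RemoteParticipantObject {
  participantId: string;               // identifies the remote participant
  position: { x: number; y: number };  // spatial position within the user interface
}

interface LiveFolderMapping {
  participant: RemoteParticipantObject;
  localFolderPath: string;   // folder created in the temporary session cache
  remoteAddress?: string;    // filled in once the server provides the IP address
}

const mappings: LiveFolderMapping[] = [
  {
    participant: { participantId: "user2", position: { x: 120, y: 80 } },
    localFolderPath: "/cache/session/F2",
  },
];
```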
  • one or more live folders are generated by mapping the one or more local folders to the one or more IP addresses.
  • local folders F2 and F3 are mapped to the network addresses of remote computing devices 2604 and 2605, respectively.
  • Computing device 2604 corresponds to remote participant User 2
  • computing device 2605 corresponds to remote participant User 3.
  • the mapping of the local folders to the network addresses of the remote computing devices can be accomplished in a variety of ways.
  • Each local folder can have, as its address, the network address of the corresponding remote computing device.
  • the local folder is an instantiation of a remote folder on the local storage.
  • a custom data structure or script can be configured to transfer the contents of a local folder to a destination network address.
  • the script can interface with the network connection (such as the web socket) in order to effectuate the transfer.
  • Many variations are possible and these examples are not intended to be limiting.
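A minimal sketch of such a script, assuming a Node.js environment with the "ws" web socket package: when a file is dropped into a live folder, its contents are read and sent over the web socket connection toward the mapped network address. The message format, field names, and paths are assumptions.

```typescript
// Illustrative sketch only: message format and helper names are assumptions.
import { readFileSync } from "fs";
import WebSocket from "ws";

function sendLiveFolderFile(filePath: string, destinationAddress: string, socket: WebSocket): void {
  const contents = readFileSync(filePath);  // read the dropped file from the local folder
  const message = JSON.stringify({
    type: "file_transfer",
    destination: destinationAddress,        // network address mapped to the live folder
    fileName: filePath.split("/").pop(),
    data: contents.toString("base64"),      // encode binary contents for transport
  });
  socket.send(message);                     // transfer effectuated over the web socket connection
}
```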
  • a drag-and-drop action can be input using a variety of input devices. For example, a user can drag-and-drop using a mouse. A user can also drag-and-drop using a hand gesture or stylus as discussed earlier. The earlier described techniques involving the virtual driver and/or the transparent layer can be used to detect the drag-and-drop motion.
  • the local computing device can be configured to store one or more spatial positions of one or more remote participant objects within the user interface.
  • the detection of whether a particular icon has been dragged and dropped proximate to a particular remote participant object can be performed by detecting a user input to drag an icon to a destination spatial position that is within a threshold distance from a spatial position of the remote participant object.
  • the threshold distance can be set by a user or can be some default value. For example, the threshold distance can be 10 pixels, or zero pixels (in which case the dragged icon would have to intersect or overlap with the remote participant object in the user interface).
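A minimal sketch of the proximity test described above; the coordinate representation and default threshold value are assumptions made for illustration.

```typescript
// Illustrative sketch: the drop targets a remote participant object when the drop
// position falls within a threshold distance of that object's stored spatial position.
interface Point { x: number; y: number; }

function isDroppedOnParticipant(drop: Point, participantPosition: Point, thresholdPx = 10): boolean {
  const dx = drop.x - participantPosition.x;
  const dy = drop.y - participantPosition.y;
  return Math.hypot(dx, dy) <= thresholdPx; // a threshold of 0 requires direct overlap
}
```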
  • Fig. 28 illustrates another example of the drag and drop process and detection according to an exemplary embodiment.
  • Interface 2801 corresponds to a representation of the collaboration workspace.
  • Interface portion 2802 can be a section of the representation of the collaboration workspace listing various files.
  • the remote participant object 2803 is an embedded video stream, which can be received over the network (e.g., web socket) connection.
  • a local participant user
  • the file can be any type of file, such as an audio file, video file, audiovisual file, a text document, a spreadsheet, a slide presentation, etc.
  • a computing environment can have additional features.
  • the computing environment 3000 includes storage 3040, one or more input devices 3050, one or more output devices 3060, and one or more communication connections 3090.
  • the interconnection mechanism 3070, such as a bus, controller, or network, interconnects the components of the computing environment 3000.
  • operating system software or firmware (not shown) provides an operating environment for other software executing in the computing environment 3000, and coordinates activities of the components of the computing environment 3000.
  • the storage 3040 can be removable or non-removable, and includes magnetic disks, magnetic tapes or cassettes, CD-ROMs, CD-RWs, DVDs, or any other medium which can be used to store information and which can be accessed within the computing environment 3000.
  • the storage 3040 can store instructions for the software 3080.
  • Computing environment 3000, display device 3060, and input device 3050 can be separate devices (e.g., a personal computer connected by wires to a monitor and mouse), can be integrated in a single device (e.g., a mobile device with a touch-display, such as a smartphone or a tablet), or any combination of devices (e.g., a computing device operatively coupled to a touch-screen display device, a plurality of computing devices attached to a single display device and input device, etc.).
  • Computing environment 3000 can be a set-top box, personal computer, or one or more servers, for example a farm of networked servers, a clustered server environment, or a cloud network of computing devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • Human Resources & Organizations (AREA)
  • Software Systems (AREA)
  • Strategic Management (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Position Input By Displaying (AREA)
PCT/EP2019/056276 2018-03-16 2019-03-13 Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace WO2019175237A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
BR112020018877-8A BR112020018877A2 (pt) 2018-03-16 2019-03-13 Método, dispositivo de computação local e meio de armazenamento legível por computador não transitório para transmissão de arquivos através de uma conexão de soquete da web em um espaço de trabalho de colaboração em rede
RU2020133478A RU2020133478A (ru) 2018-03-16 2019-03-13 Способ, устройство и машиночитаемый носитель данных для передачи файлов посредством двунаправленного сетевого соединения в сетевом совместном рабочем пространстве
KR1020207029555A KR20200131881A (ko) 2018-03-16 2019-03-13 네트워킹된 협업 워크스페이스에서 웹 소켓 연결을 통해 파일을 전송하기 위한 방법, 기기, 및 컴퓨터 판독 가능 매체
CN201980018738.1A CN112106044A (zh) 2018-03-16 2019-03-13 用于在网络协作工作空间中通过网络套接字连接传输文件的方法,设备和计算机可读介质
JP2020547345A JP2021517302A (ja) 2018-03-16 2019-03-13 ネットワーク化された共同ワークスペースにおけるウェブ・ソケット接続を介したファイルの送信のための方法、装置、及びコンピュータ可読媒体
EP19712894.5A EP3765973A1 (de) 2018-03-16 2019-03-13 Verfahren, vorrichtung und computerlesbares medium zur übertragung von dateien über eine websocket-verbindung in einem vernetzten zusammenarbeitsraum

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15/923,943 2018-03-16
US15/923,943 US10931733B2 (en) 2017-08-24 2018-03-16 Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace

Publications (1)

Publication Number Publication Date
WO2019175237A1 true WO2019175237A1 (en) 2019-09-19

Family

ID=65955176

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/056276 WO2019175237A1 (en) 2018-03-16 2019-03-13 Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace

Country Status (7)

Country Link
EP (1) EP3765973A1 (de)
JP (1) JP2021517302A (de)
KR (1) KR20200131881A (de)
CN (1) CN112106044A (de)
BR (1) BR112020018877A2 (de)
RU (1) RU2020133478A (de)
WO (1) WO2019175237A1 (de)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113868201B (zh) * 2021-12-02 2022-03-15 天津联想协同科技有限公司 一种多人协作分享文件的方法、装置及存储介质
KR102605522B1 (ko) * 2023-01-06 2023-11-24 한규태 펑션 보드를 이용한 고객 협동 개발 시스템 및 이에 포함되는 서버

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130042259A1 (en) * 2011-08-12 2013-02-14 Otoy Llc Drag and drop of objects between applications
US20150149929A1 (en) * 2013-11-22 2015-05-28 Dell Products, L.P. Managing Information and Content Sharing in a Virtual Collaboration Session

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7167897B2 (en) * 1996-05-08 2007-01-23 Apple Computer, Inc. Accessories providing a telephone conference application one or more capabilities independent of the teleconference application
US7587467B2 (en) * 1999-12-02 2009-09-08 Western Digital Technologies, Inc. Managed peer-to-peer applications, systems and methods for distributed data access and storage
US7634533B2 (en) * 2004-04-30 2009-12-15 Microsoft Corporation Systems and methods for real-time audio-visual communication and data collaboration in a network conference environment
US7821985B2 (en) * 2006-03-13 2010-10-26 Microsoft Corporation Network interface routing using computational context
US7958270B2 (en) * 2006-06-09 2011-06-07 Laurent Frederick Sidon Distribution of files from mobile devices
US9300912B2 (en) * 2008-03-28 2016-03-29 Microsoft Technology Licensing, Llc Software based whiteboard capture solution for conference room meetings
US20160140139A1 (en) * 2014-11-17 2016-05-19 Microsoft Technology Licensing, Llc Local representation of shared files in disparate locations
US10001913B2 (en) * 2015-04-01 2018-06-19 Dropbox, Inc. Shared workspaces with selective content item synchronization

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130042259A1 (en) * 2011-08-12 2013-02-14 Otoy Llc Drag and drop of objects between applications
US20150149929A1 (en) * 2013-11-22 2015-05-28 Dell Products, L.P. Managing Information and Content Sharing in a Virtual Collaboration Session

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OSADZINSKI ET AL: "The Network File System (NFS)", COMPUTER STANDARDS AND INTERFACES, ELSEVIER SEQUOIA. LAUSANNE, CH, vol. 8, no. 1, 1 January 1988 (1988-01-01), pages 45 - 48, XP024241097, ISSN: 0920-5489, [retrieved on 19880101], DOI: 10.1016/0920-5489(88)90076-1 *

Also Published As

Publication number Publication date
JP2021517302A (ja) 2021-07-15
BR112020018877A2 (pt) 2020-12-29
KR20200131881A (ko) 2020-11-24
RU2020133478A (ru) 2022-04-19
EP3765973A1 (de) 2021-01-20
CN112106044A (zh) 2020-12-18

Similar Documents

Publication Publication Date Title
US11483376B2 (en) Method, apparatus, and computer-readable medium for transmission of files over a web socket connection in a networked collaboration workspace
US20220382505A1 (en) Method, apparatus, and computer-readable medium for desktop sharing over a web socket connection in a networked collaboration workspace
US20190065012A1 (en) Method, apparatus, and computer-readable medium for propagating enriched note data objects over a web socket connection in a networked collaboration workspace
JP5442727B2 (ja) ユーザーインターフェイス表示上での教示動画の表示
US9965039B2 (en) Device and method for displaying user interface of virtual input device based on motion recognition
EP3073368B1 (de) Menschliche schnittstellenvorrichtungs-eingabefusion
US10380038B2 (en) Method, apparatus, and computer-readable medium for implementation of a universal hardware-software interface
EP3765973A1 (de) Verfahren, vorrichtung und computerlesbares medium zur übertragung von dateien über eine websocket-verbindung in einem vernetzten zusammenarbeitsraum
US11334220B2 (en) Method, apparatus, and computer-readable medium for propagating cropped images over a web socket connection in a networked collaboration workspace
EP3803558A1 (de) Verfahren, vorrichtung und computerlesbares medium zur gemeinsamen desktop-nutzung über eine websocket-verbindung in einem vernetzten zusammenarbeitsraum
JP2021533456A (ja) ネットワーク化された共同ワークスペースにおいてウェブ・ソケット接続を介して拡充ノート・データ・オブジェクトを伝えるための方法、装置及びコンピュータ可読媒体
WO2019219848A1 (en) Method, apparatus, and computer-readable medium for propagating cropped images over a web socket connection in a networked collaboration workspace

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19712894; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020547345; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20207029555; Country of ref document: KR; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: 2019712894; Country of ref document: EP)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112020018877; Country of ref document: BR)
ENP Entry into the national phase (Ref document number: 112020018877; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20200916)