WO2019008320A1 - Method and system for indicating a response of a participant in a virtual meeting


Publication number: WO2019008320A1
Authority: WIPO (PCT)
Prior art keywords: meeting, emotive, data, users, virtual
Application number: PCT/GB2018/051619
Other languages: English (en)
Inventors: Maria Francisca Jones, Alexander Jones
Original Assignee: Maria Francisca Jones
Application filed by Maria Francisca Jones
Priority to KR1020207003348A (published as KR20200037241A)
Priority to JP2019572423A (published as JP2020525946A)
Priority to SG11202000052WA
Priority to AU2018298474A
Priority to EP18737352.7A (published as EP3649588A1)
Priority to CA3068920A (published as CA3068920A1)
Priority to CN201880055827.9A (published as CN111066042A)
Publication of WO2019008320A1
Priority to ZA2020/00730A (published as ZA202000730B)

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 — Administration; Management
    • G06Q10/10 — Office automation; Time management
    • G06Q10/109 — Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1093 — Calendar-based scheduling for persons or groups
    • G06Q10/1095 — Meeting or appointment
    • G06Q10/06 — Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling

Definitions

  • the present invention relates to a method and system for indicating a response of a participant in a virtual meeting.
  • Virtual meetings can also form the basis of a framework for social interactions between members of a group of users.
  • the interface hosting a virtual meeting can also be used as a means of providing many ancillary functions to accompany the meeting.
  • One aspect of the invention provides a system for indicating emotive responses in a virtual meeting, the system comprising at least one processor; and a memory storing instructions, the instructions being executable by the at least one processor to: create or select avatar data defining one or more avatars to represent one or more corresponding users in response to input from the one or more corresponding users; receive one or more user selections of meeting data defining one or more virtual meetings, a user selection comprising an indication that the user is attending the virtual meeting; generate an output for display of a virtual meeting with one or more avatars representing one or more users attending the meeting using the avatar data and the meeting data corresponding to the virtual meeting; receive emotive input data from one or more users indicative of an emotive response or body language of the one or more users attending the virtual meeting; process the avatar data using the emotive input data; and update the output for display of the virtual meeting to render the one or more avatars for the one or more users to display a respective emotive state dependent upon the respective emotive input data.
  • Another aspect of the invention provides a method of indicating emotive responses in a virtual meeting, the method comprising creating or selecting avatar data defining one or more avatars to represent one or more corresponding users in response to input from the one or more corresponding users; receiving one or more user selections of meeting data defining one or more virtual meetings, a user selection comprising an indication that the user is attending the virtual meeting; generating an output for display of a virtual meeting with one or more avatars representing one or more users attending the meeting using the avatar data and the meeting data corresponding to the virtual meeting; receiving emotive input data from one or more users indicative of an emotive response or body language of the one or more users attending the virtual meeting; processing the avatar data using the emotive input data; and updating the output for display of the virtual meeting to render the one or more avatars for the one or more users to display a respective emotive state dependent upon the respective emotive input data.
  • Another aspect of the invention provides a carrier medium or a storage medium carrying code executable by a processor to carry out the above method.
  • Figure 1 is a schematic diagram illustrating a system according to one embodiment
  • Figure 2 is a flow diagram of a method using the system of figure 1 according to one embodiment
  • Figure 3 is a schematic illustration of a user interface for a virtual conference generated according to one embodiment
  • Figure 4 is a schematic diagram of a meeting using an augmented reality conference display according to one embodiment
  • Figure 5 is a schematic illustration of a user interface for an augmented reality conference display generated in the embodiment of figure 4;
  • Figure 6 is a schematic illustration of a user interface for a social meeting generated according to one embodiment.
  • Figure 7 is a schematic diagram of a basic computing device for use in one embodiment.

DETAILED DESCRIPTION
  • data is described as being stored in at least one database.
  • "database" is intended to encompass any data structure (and/or combinations of multiple data structures) for storing and/or organizing data, including, but not limited to, relational databases (e.g., Oracle databases, MySQL databases, etc.), non-relational databases (e.g., NoSQL databases, etc.), in-memory databases, spreadsheets, comma separated values (CSV) files, extensible markup language (XML) files, text (TXT) files, flat files, spreadsheet files, and/or any other widely used or proprietary format for data storage.
  • Databases are typically stored in one or more data stores.
  • each database referred to herein is to be understood as being stored in one or more data stores.
  • a "file system" may control how data is stored and/or retrieved (for example, a disk file system like FAT, NTFS, optical discs, etc., a flash file system, a tape file system, a database file system, a transactional file system, a network file system, etc.).
  • the disclosure is described herein with respect to databases. However, the systems and techniques disclosed herein may be implemented with file systems or a combination of databases and file systems.
  • the term data store is intended to encompass any computer readable storage medium and/or device (or collection of data storage mediums and/or devices).
  • data stores include, but are not limited to, optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, random-access memory (RAM), etc.), and/or the like.
  • Another example of a data store is a hosted storage environment that includes a collection of physical data storage devices that may be remotely accessible and may be rapidly provisioned as needed (commonly referred to as "cloud" storage).
  • the functions or algorithms described herein are implemented in hardware, software or a combination of software and hardware in one embodiment.
  • the software comprises computer executable instructions stored on computer readable carrier media such as memory or other type of storage devices.
  • described functions may correspond to modules, which may be software, hardware, firmware, or any combination thereof. Multiple functions are performed in one or more modules as desired, and the embodiments described are merely examples.
  • the software is executed on a digital signal processor, ASIC, microprocessor, or other type of processor operating on a system, such as a personal computer, server, a router, or other device capable of processing data including network interconnection devices.
  • Some embodiments implement the functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application- specific integrated circuit.
  • the exemplary process flow is applicable to software, firmware, and hardware implementations.
  • a generalized embodiment provides a method and system for indicating emotive responses in a virtual meeting, in which avatar data defining one or more avatars to represent one or more corresponding users in response to input from the one or more corresponding users is created or selected and one or more user selections of meeting data defining one or more virtual meetings is received.
  • a user selection comprises an indication that the user is attending the virtual meeting.
  • An output is generated for display of a virtual meeting with one or more avatars representing one or more users attending the meeting using the avatar data and the meeting data corresponding to the virtual meeting.
  • Emotive input data is received from one or more users indicative of an emotive response or body language of the one or more users attending the virtual meeting.
  • the avatar data is processed using the emotive input data, and the output for display of the virtual meeting is updated to render the one or more avatars for the one or more users to display a respective emotive state dependent upon the respective emotive input data.
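The generalized method above might be sketched as follows. This is a minimal illustrative sketch, not code from the application; the class, method, and field names are all assumptions.

```python
# Sketch of the generalized embodiment: avatars are created for users,
# users attend a meeting, and emotive input data updates each avatar's
# rendered emotive state. All names here are illustrative assumptions.

class VirtualMeeting:
    def __init__(self, meeting_id):
        self.meeting_id = meeting_id
        self.avatars = {}  # user_id -> avatar data (appearance + emotive state)

    def create_avatar(self, user_id, appearance):
        # Create or select avatar data for a user.
        self.avatars[user_id] = {"appearance": appearance,
                                 "emotive_state": "neutral"}

    def attend(self, user_id):
        # A user selection of meeting data: mark the user as attending,
        # with a default avatar if none has been created.
        self.avatars.setdefault(user_id, {"appearance": "default",
                                          "emotive_state": "neutral"})

    def receive_emotive_input(self, user_id, emotive_state):
        # Process the avatar data using the emotive input data.
        self.avatars[user_id]["emotive_state"] = emotive_state

    def render(self):
        # Generate/update the output for display: a text line per avatar
        # stands in here for real rendering.
        return {uid: f"{a['appearance']} ({a['emotive_state']})"
                for uid, a in self.avatars.items()}

meeting = VirtualMeeting("M1")
meeting.create_avatar("alice", "robot")
meeting.create_avatar("bob", "cat")
meeting.receive_emotive_input("alice", "smile")
print(meeting.render())
```

Each emotive input triggers a re-render, so other attendees see the updated emotive state of the avatar.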
  • the virtual meeting can be any form of meeting in a virtual environment, such as a business meeting, a conference, a social meeting, a chat room, a virtual shop etc.
  • the display of an emotive state in the virtual environment enables interaction with other users via the avatars to indicate emotive states of users.
  • the emotive state of the avatars can be manipulated simply to reflect the emotive state of the user, allowing users to interact with other users through body language and without requiring text or any other form of indication.
  • Body language in avatars is the most natural form of expression of emotions to other users via the virtual environment.
  • the virtual meeting can be a 'pure' virtual meeting where all of the images of the participants are generated as avatars.
  • the virtual meeting may be an augmented reality meeting in which video images of one or more participants in a meeting are displayed, and the augmented reality meeting has one or more avatars representing one or more users overlaid on the video data with the video images of the participants. In this way, those participants who are not part of the 'real' meeting can express themselves and interact using the body language of their avatars.
  • Interaction input can be received from one or more users attending the virtual meeting to cause the avatars to perform required interaction, and the output for display of the virtual meeting is updated to render the one or more avatars for the one or more users from which interaction data is received to display the required interaction.
  • the interaction can include the emotive interaction of a greeting, including shaking hands, 'high fiving', hugging or kissing.
  • the user interface can, in one embodiment, be provided as a conventional web site having a displayed output and a pointer device and keyboard input by a user.
  • the interface can be provided by any form of visual output and any form of input such as keyboard, touch screen, pointer device (such as a mouse, trackball, trackpad, or pen device), audio recognition hardware and/or software to recognize sounds or speech from a user, gesture recognition input hardware and/or software, etc.
  • the method and system can be used with the method and system disclosed in copending US patent application number , filed on the same date as this application and entitled "VIRTUAL OFFICE", the content of which is hereby incorporated by reference in its entirety.
  • the virtual meeting can be part of a virtual office to allow users to control their avatars to interact with images of items of office equipment to cause the items of office equipment to perform office functions.
  • the method and system can be used with the method and apparatus disclosed in copending US patent application number , filed on the same date as this application and entitled "METHOD AND APPARATUS TO TRANSFER DATA FROM A FIRST COMPUTER STATE TO A DIFFERENT COMPUTER STATE", the content of which is hereby incorporated by reference in its entirety.
  • the method and system can be used with the method and apparatus disclosed in copending US patent application number , filed on the same date as this application and entitled "EVENT BASED DEFERRED SEARCH METHOD AND SYSTEM", the content of which is hereby incorporated by reference in its entirety.
  • the method and system can be used with the method and apparatus disclosed in co-pending US patent application number US 15/395,343, filed 30th December 2016 and entitled "USER INTERFACE METHOD AND APPARATUS", the content of which is hereby incorporated in its entirety.
  • the user interface of US 15/395,343 can provide a means by which the user interacts with the system for inputs and selections.
  • the method and system can be used with the electronic transaction method and system disclosed in copending US patent application number US 15/395,487, filed 30th December 2016 and entitled "AN ELECTRONIC TRANSACTION METHOD AND APPARATUS", the content of which is hereby incorporated in its entirety.
  • Figure 1 illustrates a generalized system according to one embodiment.
  • Figure 1 illustrates two client devices 100A and 100B, each for use by a user.
  • the client devices 100A and 100B can comprise any type of computing or processing machine, such as a personal computer, a laptop, a tablet computer, a personal organizer, a mobile device, smart phone, a mobile telephone, a video player, a television, a multimedia device, personal digital assistant, etc.
  • each client device executes a web browser 101A, 101B to enable it to interact with web pages hosted at a server system 1000.
  • the web browsers 101A and 101B can be replaced by an application running on the client devices 100A and 100B.
  • the client devices 100A and 100B are connected to a network, which in this example is the internet 50.
  • the network can comprise any suitable communications network for networking computer devices.
  • the server system 1000 comprises any number of server computers connected to the internet 50.
  • the server system 1000 operates to provide the service according to embodiments of the invention.
  • the server system 1000 comprises a web server 110 to host web pages to be accessed and rendered by the browsers 101A and 101B.
  • An application server 120 is connected to the web server 110 to provide dynamic data for the web server 110.
  • the application server 120 is connected to a data store 195.
  • the data store 195 stores data in a number of different databases, namely a user database 130, an avatar database 140, a virtual world data store 150, a meeting database 160, and an emotional response database 170.
  • the user database 130 stores information on the user, which can include an identifier, name, age, username and password, date of birth, address, etc.
  • the avatar database 140 can store data on avatars available to be created by users to represent themselves and the user generated avatars associated with the user data.
  • the virtual world data store 150 stores data required to create the virtual meeting environments.
  • the meeting database 160 can store data on specific meetings, including a meeting identifier, a meeting name, associated users attending the meeting (hence indirectly the avatars to be rendered in the virtual meeting), an identifier for any video stream to be rendered as part of an augmented reality virtual meeting, meeting date, meeting login information, etc.
  • the emotional response database 170 can store data indicative of a set of emotional responses that can be selected by a user and used to modify the rendered appearance of the avatars.
  • the avatar data and processing for rendering in the virtual environment can be structured to allow each of the emotional responses to be applied.
  • the emotional responses can be such things as: smile, laugh, cry, greet by handshake, hug or kiss, bored, frown, cross/angry, acknowledged, relaxed, interested/a look of intent, etc.
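One way the data store 195 of figure 1 could be laid out relationally is sketched below, using SQLite for illustration. The table and column names are assumptions; the application does not specify a schema, and the real system could equally use non-relational storage as noted earlier.

```python
# Hypothetical relational layout for the data store of figure 1:
# user database 130, avatar database 140, meeting database 160 (with
# an attendance link table), and emotional response database 170.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users    (id INTEGER PRIMARY KEY, name TEXT,
                       username TEXT, password_hash TEXT);
CREATE TABLE avatars  (id INTEGER PRIMARY KEY,
                       user_id INTEGER REFERENCES users(id),
                       appearance TEXT);
CREATE TABLE meetings (id INTEGER PRIMARY KEY, name TEXT,
                       meeting_date TEXT, video_stream_id TEXT);
CREATE TABLE attendance (meeting_id INTEGER REFERENCES meetings(id),
                         user_id INTEGER REFERENCES users(id));
CREATE TABLE emotional_responses (id INTEGER PRIMARY KEY, name TEXT);
""")
# Seed a few of the selectable emotional responses named in the text.
conn.executemany("INSERT INTO emotional_responses (name) VALUES (?)",
                 [("smile",), ("laugh",), ("cry",), ("frown",), ("bored",)])
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM emotional_responses").fetchone()[0])  # 5
```

The attendance table indirectly determines which avatars are rendered in a given virtual meeting, matching the description of the meeting database 160.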
  • Figure 2 is a flow diagram of a process for indicating emotive responses in a virtual meeting using the system of figure 1 according to one embodiment.
  • In step S10, a user creates or selects an avatar to represent them in a virtual meeting.
  • In step S11, a user selection of meeting data defining a virtual meeting is received. A user selection comprises an indication that the user is attending the virtual meeting.
  • In step S12, an output for display of the virtual meeting is generated with an avatar representing the users attending the meeting using the avatar data and the meeting data corresponding to the virtual meeting.
  • In step S13, emotive input data is received from the user indicative of an emotive response or body language of the user attending the virtual meeting.
  • In step S14, the avatar data is processed using the emotive input data, and in step S15 the output for display of the virtual meeting is updated to render the avatar for the user to display an emotive state dependent upon the emotive input data.
  • Figure 3 is a schematic illustration of a user interface for a virtual conference generated according to one embodiment.
  • the display 200 includes a virtual conference area 201 to display the virtual conference and a reaction menu area 202 displaying user selectable menu items to enable a user to select to input an emotional response or body language to be applied to their avatar in the virtual conference for interaction with other attendees.
  • the other attendees will be able to see the user's emotional reaction as applied to their avatar in the virtual conference display area enabling them to react accordingly, for example by changing the emotive response displayed by their own avatar or by taking some other action in the virtual conference.
  • Although the menu is illustrated as a text menu, the menu could comprise icons or images depicting various emotional states that the user can select to modify their avatar's appearance and behaviour to display the emotional response and body language according to the user's selection.
  • a menu could also be displayed to a user to allow the user to select sounds or music that the avatar could output in the virtual conference, e.g. selected wording or ready-made phrases like 'greetings', 'hey', 'what's up?', or birthday or greeting messages that could be ready-made or composable by the user. These could be selectable in different accents, such as American or English, or even an impersonation of a famous person.
  • a translation option can translate and replay a message, such as part of what the user wants to say, e.g. speaking French when being romantic.
  • This can be a pre-saved recording, or the system may translate what the user (avatar) has just said, although it may be slightly delayed.
  • there is a prerecorded and saved message option to use where the user is able to record and play back, via their avatar as, for example, a response to another avatar or guest user that they are meeting.
  • The display 200 includes, outside the virtual conference area 201, a shared message area 203 that can be used to share messages with any other user individually, in groups, or globally with the virtual conference attendees. Also outside the area 201, a shared display area 204 is displayed. In this example, it corresponds to a virtual white board in the virtual conference, so that anything drawn on the shared display area will appear on the virtual white board.
  • In the virtual conference area 201, avatars of attendees of the meeting are displayed. Four are seated. Two attendees 206 are shown greeting each other by shaking hands; to achieve this, the users corresponding to the avatars 206 have selected a reaction menu item to shake hands. One avatar 207 is shown displaying anger. One avatar 208 is shown smiling.
  • the virtual conference can be controlled to operate as a conventional conference, with each user of a client device being able to speak to input audio for transmission to the client devices of the other attendees.
  • documents can be entered into the meeting by placing them on the table in the virtual display. The location of the placement will affect who can see them. To show them to everyone, copies of the document may be placed before everyone.
  • Documents can be dragged into a virtual filing cabinet 214 to file them, or the user can select to find a file in the virtual filing cabinet 214 or search the virtual filing cabinet 214 to cause a filing system to be searched to find documents. Users can make their avatars move in the virtual conference and, when they leave the conference, they can be shown exiting through a door 205.
  • the perspective displayed of the virtual conference for each attendee can vary depending upon their assigned seating position around the table.
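A simple way to realize the per-attendee perspective is to place seats evenly around a circular table and render each attendee's view from their own seat, facing the table centre. The geometry below is an illustrative sketch only; the application does not specify how the perspective is computed.

```python
# Sketch: seats evenly spaced around a circular table, each with a
# position and a facing angle toward the table centre, from which that
# attendee's view of the virtual conference would be rendered.
import math

def seat_positions(n_seats, radius=1.0):
    """Return (x, y, facing_angle_degrees) for each seat."""
    seats = []
    for i in range(n_seats):
        theta = 2 * math.pi * i / n_seats
        x = radius * math.cos(theta)
        y = radius * math.sin(theta)
        # Each attendee faces the centre of the table.
        facing = (math.degrees(theta) + 180) % 360
        seats.append((round(x, 3), round(y, 3), round(facing, 1)))
    return seats

for seat in seat_positions(4):
    print(seat)
```

With four seats, opposite attendees face each other, so the scene each one sees is rotated by their seat angle.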
  • Figure 4 is a schematic diagram of a meeting using an augmented reality conference display according to one embodiment.
  • a physical real world conference is taking place around a table with four participants.
  • a display 300 displaying participants attending virtually using their avatars 301 and 302.
  • the avatar 301 has been controlled by its respective user by an emotive input to reflect a happy or smiley face.
  • the avatar 302 has been controlled by its respective user by an emotive input to reflect an angry or annoyed face.
  • the augmented reality conference can be controlled to operate as a conventional conference, with each user of a client device being able to speak to input audio for transmission to the client devices of the other attendees and to speakers associated with the display 300.
  • documents can be entered into the meeting by placing them on the table in the virtual display 300. The location of the placement can affect who can see them. To show them to everyone, copies of the document need to be placed before everyone.
  • documents can be dragged into a virtual filing cabinet 304 to file them. Users can make their avatars move in the virtual conference and when they leave the conference, they can be shown exiting through a door 303.
  • a video camera or webcam 305 is provided to provide a video feed of the real attendees to the remote or virtual attendees' computers, as shown in figure 5.
  • Figure 5 is a schematic illustration of a user interface for an augmented reality conference display generated for a virtual attendee of the embodiment of figure 4.
  • the display 350 includes an augmented reality conference area 310 to display the augmented reality conference comprising a video stream of the physical attendees and a virtual conference segment conjoined.
  • a reaction menu area 380 displays user selectable menu items to enable the user to select to input an emotional response or body language to be applied to their avatar in the augmented reality conference for interaction with other attendees. The other attendees will be able to see the user's emotional and physical reaction as applied to their avatar in the augmented reality conference display area enabling them to react accordingly, for example by changing the emotive response displayed by their own avatar or by taking some other action in the augmented reality conference.
  • Although the menu is illustrated as a text menu, the menu could comprise icons or images depicting various emotional states that the user can select to modify their avatar's appearance and behaviour to display the emotional response and body language according to the user's selection.
  • a user can select to share music data, which assists in displaying the user's mood or expression of emotion, or which can be used in response to another user's response, e.g. to play, share, save or enjoy a tune or song, such as a happy song to share with another user (avatar).
  • a user's mood can be displayed by playing saved or selected music, e.g. sad music for feeling down, lonely or blue, or happy music when feeling good.
  • a user is able to tune in to a radio station and find a tune that is apt for the user's emotion at the time.
  • a user is also able to select and apply colours, e.g. virtual paint in different colours, as a form of chromotherapy (sometimes called colour therapy).
  • a user may select to paint a virtual bedroom in a magical sparkly colour, or a deep dark colour, to show friends how the user is feeling in the user's virtual space.
  • the augmented reality conference can be controlled to operate as a conventional conference, with each user of a client device attending the virtual conference segment being able to speak to input audio for transmission to the client devices of the other virtual attendees and to the speaker associated with the display 300 for the physical (real) attendees.
  • documents that are physically entered into the real conference can be entered into the virtual conference by placing them on the table in the virtual display segment of the augmented reality conference. The location of the placement will affect who can see them.
  • copies of the document can be placed before every virtual attendee.
  • Documents can be dragged into a virtual filing cabinet 304 to file them. Users can make their avatars move in the virtual segment of the augmented reality conference and, when they leave the conference, they can be shown exiting through a door 303.
  • the display 350 includes a shared message area 360 that can be used to share messages with any other user individually, in groups or globally to the augmented reality conference attendees. Also, a shared display area 370 is displayed.
  • Figure 6 is a schematic illustration of a user interface for a social meeting generated according to one embodiment.
  • a display 400 includes a virtual meeting area 410 in which avatars can be displayed in a virtual environment.
  • avatar 403 has been controlled by its user to smile
  • avatar 402 has been controlled to laugh
  • the two avatars 401 in the foreground have been controlled to greet each other by shaking hands.
  • a reaction menu area 404 displays user selectable menu items to enable a user to select to input an emotional response or body language to be applied to their avatar in the virtual meeting for interaction with other attendees.
  • the other attendees will be able to see the user's emotional reaction as applied to their avatar in the virtual meeting display area 410 enabling them to react accordingly, for example by changing the emotive response displayed by their own avatar or by taking some other action in the virtual meeting.
  • The display 400 includes, outside the virtual meeting area 410, a shared message area 405 that can be used to share messages with any other user individually, in groups, or globally with the virtual meeting attendees. Also outside the area 410, a shared display area 406 is displayed. In this example, it corresponds to a news item shared between the two users represented by the avatars 402 and 403.
  • the message area displays a private message exchange between avatar 403 (David) and avatar 402 (Steve) related to the news item.
  • The avatars' emotional responses have been adjusted by input from the associated users to reflect their interaction regarding the news item.
  • the system can be controlled to allow users to join and move between meetings that take place in different rooms.
  • These rooms could be displayed schematically as, for example, a room map to allow a user to select to move from one room to another to join and leave a meeting.
  • the rooms can represent different types of meetings e.g. a games room meeting, a coffee table meeting etc.
  • users can set up meetings and invite other users to the meetings with the virtual location and time of the meeting being set by the inviting user.
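Setting up a meeting with an invited attendee list, a virtual location, and a time could be modelled as below. This is an illustrative sketch; the class and field names are assumptions, not from the application.

```python
# Sketch: a user sets up a meeting, choosing the virtual location (room)
# and time, and invites other users; accepting an invitation counts as
# the indication that the user is attending.
from dataclasses import dataclass, field

@dataclass
class Meeting:
    organiser: str
    room: str          # virtual location, e.g. "games room"
    time: str          # e.g. an ISO 8601 timestamp
    invitees: list = field(default_factory=list)
    attendees: list = field(default_factory=list)

    def invite(self, user):
        if user not in self.invitees:
            self.invitees.append(user)

    def accept(self, user):
        # Only invited users can accept; accepting marks attendance.
        if user in self.invitees and user not in self.attendees:
            self.attendees.append(user)

m = Meeting("maria", "coffee table room", "2018-06-13T10:00")
m.invite("alex")
m.accept("alex")
print(m.attendees)  # ['alex']
```

The attendee list then drives which avatars are rendered in the chosen room, and moving rooms would amount to leaving one meeting and joining another.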
  • Identifiers of the avatars can be displayed; alternatively or in addition, a list of attendees can be displayed.
  • the virtual meeting using avatars could be in the environment related to any corresponding real world environment, such as in a shop, or in a gym.
  • the user input to set the emotional state of the avatar is based on a simple menu selection.
  • a camera can be provided to take a picture or video of a user's face and possibly body and determine an emotional response of the user.
  • the user could be provided with the ability to input free text by typing or by recognition of speech to describe their emotional response to control their avatar.
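Free-text input could be mapped to one of the stored emotional responses by simple keyword matching, as sketched below. The keyword table and function are illustrative assumptions; a real system might instead use a trained sentiment classifier.

```python
# Sketch: map free text (typed, or produced by speech recognition) to
# one of the selectable emotional responses via keyword matching.
# The keyword table is an assumption, not part of the application.

KEYWORDS = {
    "smile": {"happy", "glad", "pleased", "smiling"},
    "laugh": {"funny", "hilarious", "laughing"},
    "frown": {"annoyed", "unhappy", "frowning"},
    "bored": {"bored", "boring", "tired"},
}

def emotion_from_text(text):
    """Return the first emotion whose keywords appear in the text,
    or 'neutral' if nothing matches."""
    words = set(text.lower().split())
    for emotion, keys in KEYWORDS.items():
        if words & keys:
            return emotion
    return "neutral"

print(emotion_from_text("I am really happy about this"))  # smile
```

The returned label could then be fed straight into the avatar update step, exactly as a menu selection would be.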
  • the picture or video of the user could also be used to capture the user's current clothing and to adapt the avatar to represent the different clothes worn by the user, e.g. outfits, a suit and tie, a dress, fancy dress, etc. This can be used to facilitate the user's ability to dress smartly or casually in a virtual meeting.
  • a user can choose a dress to wear or a suit and tie which can be changed for each meeting, e.g. a different colour tie.
  • the avatar generated can be selected by the user to take any form.
  • the avatar could be an animal with the user's own features included, or any other character mixed in with the user's own (i.e. human) features, which can be adapted.
  • the virtual meeting is in a virtual restaurant or a social gathering involving virtual food and/or drink.
  • FIG. 7 is a block diagram that illustrates a basic computing device 600 in which the example embodiment(s) of the present invention may be embodied.
  • Computing device 600 and its components, including their connections, relationships, and functions, are meant to be exemplary only, and not meant to limit implementations of the example embodiment(s).
  • Other computing devices suitable for implementing the example embodiment(s) may have different components, including components with different connections, relationships, and functions.
  • the computing device 600 can comprise any of the servers or the user device as illustrated in figure 1 for example.
  • Computing device 600 may include a bus 602 or other communication mechanism for addressing main memory 606 and for transferring data between and among the various components of device 600.
  • Computing device 600 may also include one or more hardware processors 604 coupled with bus 602 for processing information.
  • a hardware processor 604 may be a general purpose microprocessor, a system on a chip (SoC), or other processor.
  • Main memory 606 such as a random access memory (RAM) or other dynamic storage device, also may be coupled to bus 602 for storing information and software instructions to be executed by processor(s) 604. Main memory 606 also may be used for storing temporary variables or other intermediate information during execution of software instructions to be executed by processor(s) 604.
  • software instructions, when stored in storage media accessible to processor(s) 604, render computing device 600 into a special-purpose computing device that is customized to perform the operations specified in the software instructions.
  • the terms "software", "software instructions", "computer program", "computer-executable instructions", and "processor-executable instructions" are to be broadly construed to cover any machine-readable information, whether or not human-readable, for instructing a computing device to perform specific operations, and including, but not limited to, application software, desktop applications, scripts, binaries, operating systems, device drivers, boot loaders, shells, utilities, system software, JAVASCRIPT, web pages, web applications, plugins, embedded software, microcode, compilers, debuggers, interpreters, virtual machines, linkers, and text editors.
  • Computing device 600 also may include read only memory (ROM) 608 or other static storage device coupled to bus 602 for storing static information and software instructions for processor(s) 604.
  • One or more mass storage devices 610 may be coupled to bus 602 for persistently storing information and software instructions on fixed or removable media, such as magnetic, optical, solid-state, magnetic-optical, flash memory, or any other available mass storage technology.
  • the mass storage may be shared on a network, or it may be dedicated mass storage.
  • at least one of the mass storage devices 610 (e.g., the main hard disk for the device) stores a body of program and data for directing operation of the computing device, including an operating system, user application programs, driver and other support files, as well as other data files of all sorts.
  • Computing device 600 may be coupled via bus 602 to display 612, such as a liquid crystal display (LCD) or other electronic visual display, for displaying information to a computer user.
  • display 612 may be a touch-sensitive surface incorporating touch detection technology (e.g., resistive, capacitive, etc.) in order to detect a touch gesture made with, e.g., a finger or stylus.
  • An input device 614 may be coupled to bus 602 for communicating information and command selections to processor 604.
  • input device 614 may include one or more physical buttons or switches such as, for example, a power (on/off) button, a "home” button, volume control buttons, or the like.
  • cursor control 616 such as a mouse, a trackball, a cursor, a touch screen, or direction keys for communicating direction information and command selections to processor 604 and for controlling cursor movement on display 612.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • Other input device embodiments include an audio or speech recognition input module to recognize audio input such as speech, a visual input device capable of recognizing gestures by a user, and a keyboard.
  • while in some configurations display 612, input device 614, and cursor control 616 are external components (i.e., peripheral devices) of computing device 600, some or all of display 612, input device 614, and cursor control 616 are integrated as part of the form factor of computing device 600 in other configurations.
  • any other form of user output device can be used, such as an audio output device or a tactile (vibrational) output device.
  • Functions of the disclosed systems, methods, and modules may be performed by computing device 600 in response to processor(s) 604 executing one or more programs of software instructions contained in main memory 606. Such software instructions may be read into main memory 606 from another storage medium, such as storage device(s) 610 or a transmission medium. Execution of the software instructions contained in main memory 606 cause processor(s) 604 to perform the functions of the example embodiment(s).
  • Non-volatile media includes, for example, non-volatile random access memory (NVRAM), flash memory, optical disks, magnetic disks, or solid-state drives, such as storage device 610.
  • Volatile media includes dynamic memory, such as main memory 606.
  • storage media include, for example, a floppy disk, a flexible disk, hard disk, solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, flash memory, or any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 602.
  • Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • a machine readable medium carrying instructions in the form of code can comprise a non-transient storage medium and a transmission medium.
  • Various forms of media may be involved in carrying one or more sequences of one or more software instructions to processor(s) 604 for execution.
  • the software instructions may initially be carried on a magnetic disk or solid-state drive of a remote computer.
  • the remote computer can load the software instructions into its dynamic memory and send the software instructions over a telephone line using a modem.
  • a modem local to computing device 600 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal.
  • An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 602.
  • Bus 602 carries the data to main memory 606, from which processor(s) 604 retrieves and executes the software instructions.
  • the software instructions received by main memory 606 may optionally be stored on storage device(s) 610 either before or after execution by processor(s) 604.
  • Computing device 600 also may include one or more communication interface(s) 618 coupled to bus 602.
  • a communication interface 618 provides a two-way data communication coupling to a wired or wireless network link 620 that is connected to a local network 622 (e.g., Ethernet network, Wireless Local Area Network, cellular phone network, Bluetooth wireless network, or the like).
  • Communication interface 618 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
  • communication interface 618 may be a wired network interface card, a wireless network interface card with an integrated radio antenna, or a modem (e.g., ISDN, DSL, or cable modem).
  • Network link(s) 620 typically provide data communication through one or more networks to other data devices.
  • a network link 620 may provide a connection through a local network 622 to a host computer or to data equipment operated by an Internet Service Provider (ISP).
  • the ISP in turn provides data communication services through the worldwide packet data communication network now commonly referred to as the "Internet".
  • Local network(s) 622 and the Internet use electrical, electromagnetic or optical signals that carry digital data streams.
  • the signals through the various networks and the signals on network link(s) 620 and through communication interface(s) 618, which carry the digital data to and from computing device 600, are example forms of transmission media.
  • Computing device 600 can send messages and receive data, including program code, through the network(s), network link(s) 620 and communication interface(s) 618.
  • a server might transmit a requested code for an application program through Internet, ISP, local network(s) 622 and communication interface(s) 618.
  • the received code may be executed by processor 604 as it is received, and/or stored in storage device 610, or other non-volatile storage for later execution.
  • embodiments can be provided on a carrier medium, such as a non-transient storage medium storing code for execution by a processor of a machine to carry out the method, or a transient medium carrying processor-executable code for execution by a processor of a machine to carry out the method.
  • Embodiments can be implemented in programmable digital logic that implements computer code. The code can be supplied to the programmable logic, such as a processor or microprocessor, on a carrier medium.
  • one form of carrier medium is a transient medium, i.e. a signal such as an electrical, electromagnetic, acoustic, magnetic, or optical signal.
  • Another form of carrier medium is a non-transitory storage medium that stores the code, such as a solid-state memory, magnetic media (hard disk drive), or optical media (Compact disc (CD) or digital versatile disc (DVD)).
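The camera-based and free-text emotive input embodiments listed above can be sketched as follows. This is an illustrative reduction, not the application's own implementation: the state names, mapping tables, and function names are hypothetical, and a real system would place a face-analysis or speech-recognition component in front of the pre-labelled inputs assumed here.

```python
# Hypothetical mapping from a recognised facial-expression label
# (as a face-analysis component might emit) to an avatar emotive state.
EXPRESSION_TO_STATE = {
    "smile": "happy",
    "frown": "displeased",
    "raised_eyebrows": "surprised",
}

# Hypothetical keywords for reducing free text (typed, or produced by
# speech recognition) to an avatar emotive state.
KEYWORD_TO_STATE = {
    "agree": "in_agreement",
    "bored": "bored",
    "confused": "confused",
}


def emotive_state_from_camera(expression_label: str) -> str:
    """Map a recognised expression to an emotive state, defaulting to neutral."""
    return EXPRESSION_TO_STATE.get(expression_label, "neutral")


def emotive_state_from_text(text: str) -> str:
    """Scan free text for emotive keywords, defaulting to neutral."""
    lowered = text.lower()
    for keyword, state in KEYWORD_TO_STATE.items():
        if keyword in lowered:
            return state
    return "neutral"


print(emotive_state_from_camera("smile"))        # happy
print(emotive_state_from_text("I fully agree"))  # in_agreement
print(emotive_state_from_text("no opinion"))     # neutral
```

Either input path produces the same kind of emotive state value, so it can drive the avatar rendering regardless of whether the response came from a menu, a camera, or free text.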

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Theoretical Computer Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Transfer Between Computers (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method of indicating emotive responses in a virtual meeting, the method comprising: creating or selecting avatar data defining one or more avatars to represent one or more corresponding users in response to input from the corresponding user or users; receiving one or more user selections of meeting data defining one or more virtual meetings, a user selection comprising an indication that the user is attending the virtual meeting; generating an output for the display of a virtual meeting with one or more avatars representing one or more users attending the meeting, using the avatar data and the meeting data corresponding to the virtual meeting; receiving emotive input data from one or more users indicating an emotive response or body language of the user or users attending the virtual meeting; processing the avatar data using the emotive input data; and updating the display output of the virtual meeting to render the avatar or avatars for the user or users so as to display a respective emotive state according to the respective emotive input data.
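The sequence of steps in the abstract can be illustrated with a minimal sketch. The class and method names here are hypothetical, and the emotive states are reduced to plain strings; the claimed method itself is broader.

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class VirtualMeeting:
    """Minimal model of the claimed method: avatar data is a mapping
    from each attending user to that avatar's current emotive state."""
    meeting_id: str
    avatars: Dict[str, str] = field(default_factory=dict)

    def attend(self, user_id: str) -> None:
        # User selection of meeting data, including an indication of
        # attendance; avatar data is created with a default state.
        self.avatars[user_id] = "neutral"

    def apply_emotive_input(self, user_id: str, emotive_state: str) -> None:
        # Process the avatar data using the received emotive input data.
        if user_id not in self.avatars:
            raise KeyError(f"{user_id} is not attending {self.meeting_id}")
        self.avatars[user_id] = emotive_state

    def display_output(self) -> Dict[str, str]:
        # Updated display output: each avatar rendered with its
        # respective emotive state.
        return dict(self.avatars)


meeting = VirtualMeeting("board-review")
meeting.attend("alice")
meeting.attend("bob")
meeting.apply_emotive_input("alice", "in_agreement")
print(meeting.display_output())  # {'alice': 'in_agreement', 'bob': 'neutral'}
```

Each emotive input only touches the corresponding user's entry, so the display output always reflects the latest respective state of every attendee.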
PCT/GB2018/051619 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system WO2019008320A1 (fr)

Priority Applications (8)

Application Number Priority Date Filing Date Title
KR1020207003348A KR20200037241A (ko) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
JP2019572423A JP2020525946A (ja) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
SG11202000052WA SG11202000052WA (en) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
AU2018298474A AU2018298474A1 (en) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
EP18737352.7A EP3649588A1 (fr) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
CA3068920A CA3068920A1 (fr) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
CN201880055827.9A CN111066042A (zh) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system
ZA2020/00730A ZA202000730B (en) 2017-07-05 2020-02-04 Virtual meeting participant response indication method and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1710840.8A GB201710840D0 (en) 2017-07-05 2017-07-05 Virtual meeting participant response indication method and system
GB1710840.8 2017-07-05

Publications (1)

Publication Number Publication Date
WO2019008320A1 true WO2019008320A1 (fr) 2019-01-10

Family

ID=59592638

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2018/051619 WO2019008320A1 (fr) 2017-07-05 2018-06-13 Virtual meeting participant response indication method and system

Country Status (10)

Country Link
EP (1) EP3649588A1 (fr)
JP (1) JP2020525946A (fr)
KR (1) KR20200037241A (fr)
CN (1) CN111066042A (fr)
AU (1) AU2018298474A1 (fr)
CA (1) CA3068920A1 (fr)
GB (1) GB201710840D0 (fr)
SG (1) SG11202000052WA (fr)
WO (1) WO2019008320A1 (fr)
ZA (1) ZA202000730B (fr)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10952006B1 (en) 2020-10-20 2021-03-16 Katmai Tech Holdings LLC Adjusting relative left-right sound to provide sense of an avatar's position in a virtual space, and applications thereof
US10979672B1 (en) 2020-10-20 2021-04-13 Katmai Tech Holdings LLC Web-based videoconference virtual environment with navigable avatars, and applications thereof
CN113014852A (zh) * 2019-12-19 2021-06-22 斑马智行网络(香港)有限公司 信息提示方法、装置及设备
US11070768B1 (en) 2020-10-20 2021-07-20 Katmai Tech Holdings LLC Volume areas in a three-dimensional virtual conference space, and applications thereof
US11076128B1 (en) 2020-10-20 2021-07-27 Katmai Tech Holdings LLC Determining video stream quality based on relative position in a virtual space, and applications thereof
US11095857B1 (en) 2020-10-20 2021-08-17 Katmai Tech Holdings LLC Presenter mode in a three-dimensional virtual conference space, and applications thereof
CN113593720A (zh) * 2020-04-30 2021-11-02 京东方科技集团股份有限公司 一种远程会诊控制方法和控制系统、计算机设备和介质
US11184362B1 (en) 2021-05-06 2021-11-23 Katmai Tech Holdings LLC Securing private audio in a virtual conference, and applications thereof
JP2022018733A (ja) * 2020-07-16 2022-01-27 ヤフー株式会社 提供プログラム、提供方法、および提供装置
CN114500429A (zh) * 2022-01-24 2022-05-13 北京百度网讯科技有限公司 语音房内虚拟形象的控制方法、装置及电子设备
US11457178B2 (en) 2020-10-20 2022-09-27 Katmai Tech Inc. Three-dimensional modeling inside a virtual video conferencing environment with a navigable avatar, and applications thereof
US20220353220A1 (en) * 2021-04-30 2022-11-03 Zoom Video Communications, Inc. Shared reactions within a video communication session
GB2607331A (en) * 2021-06-03 2022-12-07 Kal Atm Software Gmbh Virtual interaction system
US11562531B1 (en) 2022-07-28 2023-01-24 Katmai Tech Inc. Cascading shadow maps in areas of a three-dimensional environment
US11593989B1 (en) 2022-07-28 2023-02-28 Katmai Tech Inc. Efficient shadows for alpha-mapped models
US11651108B1 (en) 2022-07-20 2023-05-16 Katmai Tech Inc. Time access control in virtual environment application
US11682164B1 (en) 2022-07-28 2023-06-20 Katmai Tech Inc. Sampling shadow maps at an offset
US11700354B1 (en) 2022-07-21 2023-07-11 Katmai Tech Inc. Resituating avatars in a virtual environment
US11704864B1 (en) 2022-07-28 2023-07-18 Katmai Tech Inc. Static rendering for a combination of background and foreground objects
US11711494B1 (en) 2022-07-28 2023-07-25 Katmai Tech Inc. Automatic instancing for efficient rendering of three-dimensional virtual environment
US11741664B1 (en) 2022-07-21 2023-08-29 Katmai Tech Inc. Resituating virtual cameras and avatars in a virtual environment
US11743430B2 (en) 2021-05-06 2023-08-29 Katmai Tech Inc. Providing awareness of who can hear audio in a virtual conference, and applications thereof
US11748939B1 (en) 2022-09-13 2023-09-05 Katmai Tech Inc. Selecting a point to navigate video avatars in a three-dimensional environment
US11776203B1 (en) 2022-07-28 2023-10-03 Katmai Tech Inc. Volumetric scattering effect in a three-dimensional virtual environment with navigable video avatars
US11876630B1 (en) 2022-07-20 2024-01-16 Katmai Tech Inc. Architecture to control zones
US11928774B2 (en) 2022-07-20 2024-03-12 Katmai Tech Inc. Multi-screen presentation in a virtual videoconferencing environment
US11956571B2 (en) 2022-07-28 2024-04-09 Katmai Tech Inc. Scene freezing and unfreezing
US12009938B2 (en) 2022-07-20 2024-06-11 Katmai Tech Inc. Access control in zones
US12022235B2 (en) 2022-07-20 2024-06-25 Katmai Tech Inc. Using zones in a three-dimensional virtual environment for limiting audio and video

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111583415B (zh) * 2020-05-08 2023-11-24 维沃移动通信有限公司 信息处理方法、装置和电子设备
WO2022145040A1 (fr) * 2020-12-31 2022-07-07 株式会社I’mbesideyou Terminal d'évaluation de réunion vidéo, système d'évaluation de réunion vidéo et programme d'évaluation de réunion vidéo
CN113014471B (zh) * 2021-01-18 2022-08-19 腾讯科技(深圳)有限公司 会话处理方法,装置、终端和存储介质
US11855796B2 (en) 2021-03-30 2023-12-26 Snap Inc. Presenting overview of participant reactions within a virtual conferencing system
US11381411B1 (en) * 2021-03-30 2022-07-05 Snap Inc. Presenting participant reactions within a virtual conferencing system
KR102527398B1 (ko) * 2021-11-23 2023-04-28 엔에이치엔클라우드 주식회사 화상미팅 프로그램 기반의 가상 피팅 방법 및 그 시스템

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100287510A1 (en) * 2009-05-08 2010-11-11 International Business Machines Corporation Assistive group setting management in a virtual world

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6608636B1 (en) * 1992-05-13 2003-08-19 Ncr Corporation Server based virtual conferencing
US5347306A (en) * 1993-12-17 1994-09-13 Mitsubishi Electric Research Laboratories, Inc. Animated electronic meeting place
EP1038396B1 (fr) * 1997-12-09 2003-01-29 BRITISH TELECOMMUNICATIONS public limited company Complement de service de conference
JP3679350B2 (ja) * 2001-05-28 2005-08-03 株式会社ナムコ プログラム、情報記憶媒体及びコンピュータシステム
US8243116B2 (en) * 2007-09-24 2012-08-14 Fuji Xerox Co., Ltd. Method and system for modifying non-verbal behavior for social appropriateness in video conferencing and other computer mediated communications
CN101930284B (zh) * 2009-06-23 2014-04-09 腾讯科技(深圳)有限公司 一种实现视频和虚拟网络场景交互的方法、装置和系统
WO2013152453A1 (fr) * 2012-04-09 2013-10-17 Intel Corporation Communication à l'aide d'avatars interactifs
US9503682B2 (en) * 2014-12-17 2016-11-22 Fuji Xerox Co., Ltd. Systems and methods for conveying physical state of a remote device


Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113014852A (zh) * 2019-12-19 2021-06-22 斑马智行网络(香港)有限公司 信息提示方法、装置及设备
CN113593720A (zh) * 2020-04-30 2021-11-02 京东方科技集团股份有限公司 一种远程会诊控制方法和控制系统、计算机设备和介质
JP2022018733A (ja) * 2020-07-16 2022-01-27 ヤフー株式会社 提供プログラム、提供方法、および提供装置
US11070768B1 (en) 2020-10-20 2021-07-20 Katmai Tech Holdings LLC Volume areas in a three-dimensional virtual conference space, and applications thereof
US11076128B1 (en) 2020-10-20 2021-07-27 Katmai Tech Holdings LLC Determining video stream quality based on relative position in a virtual space, and applications thereof
US11095857B1 (en) 2020-10-20 2021-08-17 Katmai Tech Holdings LLC Presenter mode in a three-dimensional virtual conference space, and applications thereof
US10979672B1 (en) 2020-10-20 2021-04-13 Katmai Tech Holdings LLC Web-based videoconference virtual environment with navigable avatars, and applications thereof
US11290688B1 (en) 2020-10-20 2022-03-29 Katmai Tech Holdings LLC Web-based videoconference virtual environment with navigable avatars, and applications thereof
US10952006B1 (en) 2020-10-20 2021-03-16 Katmai Tech Holdings LLC Adjusting relative left-right sound to provide sense of an avatar's position in a virtual space, and applications thereof
US11457178B2 (en) 2020-10-20 2022-09-27 Katmai Tech Inc. Three-dimensional modeling inside a virtual video conferencing environment with a navigable avatar, and applications thereof
US11843567B2 (en) * 2021-04-30 2023-12-12 Zoom Video Communications, Inc. Shared reactions within a video communication session
US20220353220A1 (en) * 2021-04-30 2022-11-03 Zoom Video Communications, Inc. Shared reactions within a video communication session
US11184362B1 (en) 2021-05-06 2021-11-23 Katmai Tech Holdings LLC Securing private audio in a virtual conference, and applications thereof
US11743430B2 (en) 2021-05-06 2023-08-29 Katmai Tech Inc. Providing awareness of who can hear audio in a virtual conference, and applications thereof
GB2607331A (en) * 2021-06-03 2022-12-07 Kal Atm Software Gmbh Virtual interaction system
CN114500429A (zh) * 2022-01-24 2022-05-13 北京百度网讯科技有限公司 语音房内虚拟形象的控制方法、装置及电子设备
US11651108B1 (en) 2022-07-20 2023-05-16 Katmai Tech Inc. Time access control in virtual environment application
US11928774B2 (en) 2022-07-20 2024-03-12 Katmai Tech Inc. Multi-screen presentation in a virtual videoconferencing environment
US11876630B1 (en) 2022-07-20 2024-01-16 Katmai Tech Inc. Architecture to control zones
US12009938B2 (en) 2022-07-20 2024-06-11 Katmai Tech Inc. Access control in zones
US12022235B2 (en) 2022-07-20 2024-06-25 Katmai Tech Inc. Using zones in a three-dimensional virtual environment for limiting audio and video
US11741664B1 (en) 2022-07-21 2023-08-29 Katmai Tech Inc. Resituating virtual cameras and avatars in a virtual environment
US11700354B1 (en) 2022-07-21 2023-07-11 Katmai Tech Inc. Resituating avatars in a virtual environment
US11711494B1 (en) 2022-07-28 2023-07-25 Katmai Tech Inc. Automatic instancing for efficient rendering of three-dimensional virtual environment
US11776203B1 (en) 2022-07-28 2023-10-03 Katmai Tech Inc. Volumetric scattering effect in a three-dimensional virtual environment with navigable video avatars
US11704864B1 (en) 2022-07-28 2023-07-18 Katmai Tech Inc. Static rendering for a combination of background and foreground objects
US11682164B1 (en) 2022-07-28 2023-06-20 Katmai Tech Inc. Sampling shadow maps at an offset
US11956571B2 (en) 2022-07-28 2024-04-09 Katmai Tech Inc. Scene freezing and unfreezing
US11593989B1 (en) 2022-07-28 2023-02-28 Katmai Tech Inc. Efficient shadows for alpha-mapped models
US11562531B1 (en) 2022-07-28 2023-01-24 Katmai Tech Inc. Cascading shadow maps in areas of a three-dimensional environment
US11748939B1 (en) 2022-09-13 2023-09-05 Katmai Tech Inc. Selecting a point to navigate video avatars in a three-dimensional environment

Also Published As

Publication number Publication date
ZA202000730B (en) 2023-12-20
AU2018298474A1 (en) 2020-02-20
CN111066042A (zh) 2020-04-24
EP3649588A1 (fr) 2020-05-13
CA3068920A1 (fr) 2019-01-10
JP2020525946A (ja) 2020-08-27
KR20200037241A (ko) 2020-04-08
GB201710840D0 (en) 2017-08-16
SG11202000052WA (en) 2020-02-27

Similar Documents

Publication Publication Date Title
US20170302709A1 (en) Virtual meeting participant response indication method and system
WO2019008320A1 (fr) Virtual meeting participant response indication method and system
US11460970B2 (en) Meeting space collaboration in augmented reality computing environments
US11595338B2 (en) System and method of embedding rich media into text messages
US10838574B2 (en) Augmented reality computing environments—workspace save and load
US8244830B2 (en) Linking users into live social networking interactions based on the users' actions relative to similar content
CN110573224A (zh) 三维环境创作和生成
EP3776146A1 (fr) Environnements informatiques de réalité augmentée
WO2018085132A1 (fr) Re-hébergement de contenu web intégré par signalisation inter-iframes
KR101750788B1 (ko) 스토리 보드 구현 방법 및 시스템과 스토리 보드 내의 선택 객체 송수신 방법 및 시스템
CN115113773B (zh) 信息的处理方法、装置、计算机可读存储介质和电子装置
US11972173B2 (en) Providing change in presence sounds within virtual working environment
KR20180135532A (ko) 스토리 보드 구현 방법 및 시스템
US20240118785A1 (en) Interoperability for translating and traversing 3d experiences in an accessibility environment
KR20230113006A (ko) 채팅로그형 컨텐츠 서비스 제공방법 및 그 장치
JP5872723B1 (ja) グループ構成員がそれぞれ情報端末を用いて精神的交流をするためのクラウドサービスにおける愛玩アバターの登場
KR20180134719A (ko) 스토리 키보드 구현 방법 및 시스템, 이를 이용한 선택 객체 송수신 방법 및 시스템
Krudop Collaborative Work Supported by Cloud Computing and Wireless Data Exchange Between Smartphones and Interactive Tabletops

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18737352

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019572423

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 3068920

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018737352

Country of ref document: EP

Effective date: 20200205

ENP Entry into the national phase

Ref document number: 2018298474

Country of ref document: AU

Date of ref document: 20180613

Kind code of ref document: A