EP4329292A1 - Information processing apparatus, information processing method, terminal device, and display method - Google Patents

Information processing apparatus, information processing method, terminal device, and display method

Info

Publication number
EP4329292A1
Authority
EP
European Patent Office
Prior art keywords
group
information
unit
terminal device
groups
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22791366.2A
Other languages
English (en)
French (fr)
Other versions
EP4329292A4 (de)
Inventor
Yoshinori Maeda
Yasuharu Asano
Keiichi Yamada
Akira Inoue
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of EP4329292A1
Publication of EP4329292A4
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • G06F40/35Discourse or dialogue representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1831Tracking arrangements for later retrieval, e.g. recording contents, participants activities or behavior, network status
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/21Monitoring or handling of messages
    • H04L51/216Handling conversation history, e.g. grouping of messages in sessions or threads
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/155Conference systems involving storage of or access to video conference sessions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1822Conducting the conference, e.g. admission, detection, selection or grouping of participants, correlating users to one or more conference sessions, prioritising transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, a terminal device, and a display method.
  • a meeting by a plurality of users (also referred to as a "remote meeting"), such as a web conference, is performed remotely by communicating a voice or an image through the Internet.
  • a technology for dividing users participating in such a remote meeting into a plurality of groups is known (for example, Non Patent Literature 1).
  • Non Patent Literature 1: "Use breakout rooms in Teams meetings", Microsoft Corporation <Internet> https://web.archive.org/web/20210122112205/https://support.microsoft.com/en-us/office/use-breakout-rooms-in-teams-meetings-7delf48a-da07-466c-a5ab-4ebace28e461 (Searched on April 15, 2021)
  • the present disclosure proposes an information processing apparatus, an information processing method, a terminal device, and a display method capable of efficiently confirming information on a plurality of groups.
  • an information processing apparatus includes an acquisition unit that acquires group content information regarding a content of a conversation in at least one or more groups among a plurality of groups in which a plurality of users participating in a remote meeting are divided and can have a conversation in each group; and an output unit that outputs plural group list information that displays the group content information in association with a corresponding group together with the plurality of groups.
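  • As an aid to reading, the following is a minimal sketch, in Python, of how the claimed acquisition unit and output unit could be modeled; all class, field, and function names here are illustrative assumptions and are not defined in the disclosure.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class GroupContentInfo:
        """Content of a conversation in one group (keyword counts and a summary)."""
        keyword_counts: Dict[str, int] = field(default_factory=dict)
        summary: str = ""

    @dataclass
    class PluralGroupListInfo:
        """Group content information associated with each corresponding group."""
        per_group: Dict[str, GroupContentInfo] = field(default_factory=dict)
        members: Dict[str, List[str]] = field(default_factory=dict)

    class AcquisitionUnit:
        """Acquires group content information for at least one or more groups."""
        def __init__(self, store: Dict[str, GroupContentInfo]):
            self._store = store

        def acquire(self, group_ids: List[str]) -> Dict[str, GroupContentInfo]:
            return {gid: self._store[gid] for gid in group_ids if gid in self._store}

    class OutputUnit:
        """Outputs plural group list information associating content with its group."""
        def output(self, content: Dict[str, GroupContentInfo],
                   members: Dict[str, List[str]]) -> PluralGroupListInfo:
            return PluralGroupListInfo(per_group=content, members=members)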
  • FIG. 1 is a diagram illustrating an example of information processing according to an embodiment of the present disclosure.
  • the information processing according to the embodiment of the present disclosure is realized by a remote lecture system 1 including a remote meeting server 100, a terminal device 10, and a plurality of member terminals 20.
  • each of the member terminals 20 may be described as a member terminal 20a, a member terminal 20b, and a member terminal 20c. Note that, in a case where the member terminal 20a, the member terminal 20b, the member terminal 20c, and the like are described without being particularly distinguished, they are described as "member terminal 20".
  • the remote lecture system 1 has a function of dividing users (members) participating in a remote meeting (hereinafter also referred to as "meeting") into a plurality of groups and performing a group work. As described above, in the remote lecture system 1, a plurality of the users is assigned to each group, and it is possible to perform a group work such as discussion (conversation) and production among members belonging to each group.
  • FIG. 1 illustrates a case where the plurality of users participating in a meeting is divided into a plurality of groups such as groups A to D and performs a group work.
  • a host (an administrator) of a meeting such as a lecturer or a facilitator divides a plurality of the users participating in the meeting into a plurality of groups and sets a group work.
  • the lecturer FC will be described as an example of an administrator.
  • the lecturer FC who is a lecturer of the meeting divides the plurality of members A-1 to D-4 and the like into the plurality of groups A to D using the terminal device 10, and sets a group work. Note that a setting example regarding the group work will be described later.
  • step numbers such as S11-1 illustrated in FIG. 1 are signs for describing each processing and do not indicate an order of processing; each processing is executed at any time according to the progress of the remote meeting such as a group work.
  • In FIG. 1, the three members A-1, A-2, and A-3 are illustrated for the sake of explanation, but it is assumed that a large number of users (for example, 100, 1000, or the like) participate in a remote meeting provided by the remote lecture system 1 and are divided into groups.
  • the member terminal 20 used by the member A-1 is described as a member terminal 20a
  • the member terminal 20 used by the member A-2 is described as a member terminal 20b
  • the member terminal 20 used by the member A-3 is described as a member terminal 20c.
  • An application (hereinafter also referred to as a "remote meeting application") for participating in a remote meeting provided by the remote lecture system 1 is installed in each member terminal 20, and each user participates in the remote meeting using the remote meeting application.
  • the plurality of members A-1 to D-4 and the like are divided into the plurality of groups A to D, and work related to a group work such as discussion (conversation) in the group is performed in each group.
  • the members A-1, A-2, A-3, and A-4 have a conversation on the basis of, for example, a topic related to the Olympics.
  • the members B-1, B-2, B-3, and B-4 have a conversation on the basis of, for example, a topic related to the Olympics.
  • the members belonging to the group have a conversation.
  • Each member terminal 20 transmits and receives information regarding the remote meeting to and from the remote meeting server 100 as needed.
  • Each member terminal 20 transmits information such as an utterance and an image of the user who uses the member terminal 20 to the remote meeting server 100, and receives information output for the remote meeting from the remote meeting server 100.
  • Communication (transmission/reception) of information regarding the remote meeting input/output by the member terminal 20a is performed (Step S11-1).
  • the member terminal 20a transmits information such as an utterance and an image of the member A-1 to the remote meeting server 100, and receives information output for the remote meeting from the remote meeting server 100.
  • the member terminal 20a displays an intra-group screen CT11 (see FIG. 2 ) corresponding to the group A to which the member A-1 belongs, and voice-outputs utterances of other members in the group A. In this manner, each member terminal 20 displays or voice-outputs information regarding the group to which the member belongs.
  • Communication (transmission/reception) of information regarding the remote meeting input/output by the member terminal 20b is performed (Step S11-2).
  • the member terminal 20b transmits information such as an utterance and an image of the member A-2 to the remote meeting server 100, and receives information output for the remote meeting from the remote meeting server 100.
  • the member terminal 20b displays an intra-group screen corresponding to the group A to which the member A-2 belongs, and voice-outputs utterances of other members in the group A.
  • Communication (transmission/reception) of information regarding the remote meeting input/output by the member terminal 20c is performed (Step S11-3).
  • the member terminal 20c transmits information such as an utterance and an image of the member A-3 to the remote meeting server 100, and receives information output for the remote meeting from the remote meeting server 100.
  • the member terminal 20c displays an intra-group screen corresponding to the group A to which the member A-3 belongs, and voice-outputs utterances of other members in the group A. Note that similar processing is also executed in the member terminal 20 used by each member of the members A-4, B-1 to B-4, C-1 to C-4, and D-1 to D-4, but is not illustrated in FIG. 1.
  • the member terminal 20 used by the member B-1 displays the intra-group screen CT21 (see FIG. 2 ) corresponding to the group B to which the member B-1 belongs, and voice-outputs the utterances of other members in the group B.
  • the member terminal 20 used by the member D-1 displays the intra-group screen CT41 (see FIG. 2) corresponding to the group D to which the member D-1 belongs, and voice-outputs the utterances of other members in the group D.
  • Step S11 is a step of performing communication (transmission and reception) of information regarding a remote meeting such as a group work between the remote meeting server 100 and each member terminal 20.
  • Steps S11-1 to S11-3 are not limited to one time, and are executed as needed according to the progress of the remote meeting such as a group work.
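  • For illustration only, the per-terminal exchange in Steps S11-1 to S11-3 could carry a payload along the following lines; the JSON field names below are assumptions made for this sketch, not a format defined in the disclosure.

    import json
    import time

    def build_terminal_update(user_id: str, group_id: str,
                              utterance_text: str, image_ref: str) -> str:
        # One update from a member terminal to the remote meeting server.
        payload = {
            "user_id": user_id,            # e.g. "A-1"
            "group_id": group_id,          # e.g. the group A
            "timestamp": time.time(),
            "utterance": utterance_text,   # utterance (already converted to text here)
            "image": image_ref,            # reference to the captured camera frame
        }
        return json.dumps(payload)

    # Example: the member terminal 20a reporting an utterance by the member A-1.
    print(build_terminal_update("A-1", "A", "Infected people may increase.", "frame_0001"))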
  • the remote meeting server 100 executes processing of collecting information received from each member terminal 20 and generating plural group list information that displays group content information regarding a content of a conversation in each group in association with a corresponding group (Step S12).
  • the remote meeting server 100 generates group content information SM1 of the group A on the basis of the utterance histories of the members A-1, A-2, A-3, and A-4.
  • the group content information SM1 may include information such as the number of appearances of each keyword in the conversation in the group A and the summary content of the conversation in the group A. However, the details will be described later.
  • the remote meeting server 100 generates group content information SM2 of the group B on the basis of the utterance histories of the members B-1, B-2, B-3, and B-4.
  • the remote meeting server 100 generates the group content information SM3 of the group C on the basis of the utterance histories of the members C-1, C-2, C-3, and C-4.
  • the remote meeting server 100 generates the group content information SM4 of the group D on the basis of the utterance histories of the members D-1, D-2, D-3, and D-4.
  • the remote meeting server 100 generates plural group list information CT1 in which each of the group content information SM1 to SM4 is associated with the corresponding groups A to D.
  • the remote meeting server 100 associates the member lists ML1 to ML4 of the groups A to D with the group content information SM1 to SM4, and generates the plural group list information CT1 arranged in the areas AR1 to AR4.
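  • The arrangement described above can be pictured with the following short sketch: for each group, the member list is placed in the first (left) region and the group content information in the second (right) region of its area. The data layout is an assumption made for illustration.

    from typing import Dict, List

    def build_plural_group_list(member_lists: Dict[str, List[str]],
                                content_info: Dict[str, str]) -> List[dict]:
        areas = []
        for group_id in sorted(member_lists):
            areas.append({
                "group": group_id,
                "first_region": member_lists[group_id],       # e.g. member list ML1
                "second_region": content_info.get(group_id),  # e.g. group content information SM1
            })
        return areas

    ct1 = build_plural_group_list(
        {"A": ["A-1", "A-2", "A-3", "A-4"], "B": ["B-1", "B-2", "B-3", "B-4"]},
        {"A": "Should we consider postponement?", "B": "Vaccine development is important."},
    )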
  • the remote meeting server 100 transmits the plural group list information CT1 to the terminal device 10 (Step S13).
  • the terminal device 10 that has received the plural group list information CT1 from the remote meeting server 100 displays the plural group list information CT1 (Step S14).
  • the member list ML1 indicating the members of the group A and the group content information SM1 are arranged in association with each other in the area AR1.
  • the member list ML1 is arranged in a left region (also referred to as a "first region") in the area AR1 of the plural group list information CT1
  • the group content information SM1 is arranged in a right region (also referred to as a "second region") in the area AR1 of the plural group list information CT1.
  • the group content information SM1 of the group A is displayed in association with the member list ML1 of the group A.
  • the member list ML2 is arranged in the first region on the left side in the area AR2 of the plural group list information CT1, and the group content information SM2 is arranged in the second region on the right side in the area AR2 of the plural group list information CT1.
  • the group content information SM2 of the group B is displayed in association with the member list ML2 of the group B.
  • the group content information SM3 of the group C is displayed in the area AR3 in association with the member list ML3 of the group C
  • the group content information SM4 of the group D is displayed in the area AR4 in association with the member list ML4 of the group D.
  • the terminal device 10 displays the plural group list information CT1 indicating the group content information in association with the corresponding group.
  • the remote lecture system 1 can enable a lecturer, a facilitator, or the like who sets a group work, such as the lecturer FC, to efficiently confirm information on a plurality of groups.
  • the terminal device 10 requests comment transmission to the member of the group A in response to the comment transmission operation to the member of the group A by the lecturer FC (Step S15).
  • the lecturer FC requests the remote meeting server 100 to transmit a comment to the member of the group A by performing an operation on an operation user interface (UI) arranged in an area AR5 of the plural group list information CT1.
  • the terminal device 10 transmits comment transmission information including designation information indicating that the transmission destination of the comment is the group A and character information indicating the comment to be transmitted to the member of the group A to the remote meeting server 100 according to the operation by the lecturer FC.
  • the terminal device 10 requests the remote meeting server 100 to transmit a comment to the member of the group A.
  • the lecturer FC may request comment transmission with reference to the assist information, which will be described later.
  • the remote meeting server 100 that has received the comment transmission information from the terminal device 10 transmits the comment indicated by the comment transmission information to the member terminal 20 of the member of the group A.
  • the remote meeting server 100 transmits the comment indicated by the comment transmission information to the member terminal 20a of the member A-1 (Step S16-1).
  • the remote meeting server 100 transmits the comment indicated by the comment transmission information to the member terminal 20b of the member A-2 (Step S16-2).
  • the remote meeting server 100 transmits the comment indicated by the comment transmission information to the member terminal 20c of the member A-3 (Step S16-3).
  • the remote meeting server 100 transmits the comment indicated by the comment transmission information to the member terminal 20 of the member A-4.
  • When Steps S16-1 to S16-3 are described without distinction, they are collectively referred to as Step S16. Steps S16-1 to S16-3 are executed as needed at a timing when conditions are satisfied according to the progress of a remote meeting such as a group work. Furthermore, Step S16 may be performed together with Step S11. Furthermore, the comment may be transmitted not only to the members of the group A but also to the members of the groups B, C, and D, or may be collectively transmitted to all the members of the groups A to D.
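  • A hedged sketch of this comment transmission flow: the terminal device 10 packages designation information (the destination group) with the comment text, and the remote meeting server fans the comment out to the member terminals of that group. Names and data shapes are assumptions made for illustration.

    from typing import Dict, List, Tuple

    def make_comment_transmission_info(target_group: str, comment: str) -> dict:
        # Designation information plus character information of the comment (Step S15).
        return {"destination_group": target_group, "comment_text": comment}

    def fan_out_comment(info: dict,
                        group_members: Dict[str, List[str]]) -> List[Tuple[str, str]]:
        # Pairs of (member ID, comment) the server would deliver in Step S16.
        members = group_members.get(info["destination_group"], [])
        return [(member_id, info["comment_text"]) for member_id in members]

    info = make_comment_transmission_info("A", "Please also discuss whether to hold the event.")
    deliveries = fan_out_comment(info, {"A": ["A-1", "A-2", "A-3", "A-4"]})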
  • a lecturer, a facilitator, or the like who sets a group work such as the lecturer FC can make a comment at a desired timing with respect to a group or the like in which a conversation is not active, and thus, it is possible to efficiently operate the group work.
  • FIG. 2 is a diagram illustrating an example of display of the entire remote lecture system. Note that description of points similar to those in FIG. 1 will be omitted as appropriate.
  • the plural group list information CT1 illustrated in FIG. 2 is an overall grasp screen; it is similar to the plural group list information CT1 of FIG. 1 and indicates the group content information in association with the corresponding group.
  • the plural group list information CT1 is displayed on the terminal device 10 used by an administrator such as the lecturer FC (a lecturer or a facilitator).
  • FIG. 2 illustrates a state in which a member A-2 is speaking in a group A, a member B-3 is speaking in a group B, a member C-1 is speaking in a group C, and a member D-42 is speaking in a group D.
  • An area (second area) on the right side of each of the areas AR1 to AR4 is a log information area of a corresponding group, and a keyword or summary information of a discussion content is displayed.
  • the group content information of the corresponding group is displayed in the second area of each of the areas AR1 to AR4.
  • the area AR5 is an operation UI area, and an interface for performing each operation, such as a button used when it is desired to send a comment to all groups, is arranged.
  • the group GPA illustrated in FIG. 2 indicates a group A to which the members A-1 to A-4 belong.
  • the intra-group screen CT11 shows an intra-group screen corresponding to the group A displayed on the member terminal 20 used by the member A-1.
  • Various types of information of the corresponding group such as member information and logs of the corresponding group are displayed on the intra-group screen.
  • an area AR11 of the intra-group screen CT11 is an utterance display area, and the content uttered by the member in the group A is displayed.
  • an area other than the area AR11 of the intra-group screen CT11 is a material area, and a screen shared in the group A is displayed.
  • the intra-group screen CT14 indicates an intra-group screen corresponding to the group A displayed on the member terminal 20 used by the member A-4, and indicates the same content as the intra-group screen CT11.
  • an area AR14 of the intra-group screen CT14 is a remarks display area, and displays information similar to the area AR11 of the intra-group screen CT11.
  • an area other than the area AR14 of the intra-group screen CT14 is a material area, and displays information similar to that of the intra-group screen CT11.
  • the group GPB illustrated in FIG. 2 indicates a group B to which the members B-1 to B-4 belong.
  • the intra-group screen CT21 shows an intra-group screen corresponding to the group B displayed on the member terminal 20 used by the member B-1.
  • an area AR21 of the intra-group screen CT21 is an utterance display area, and the content uttered by the member in the group B is displayed.
  • an area other than the area AR21 of the intra-group screen CT21 is a material area, and a screen shared in the group B is displayed.
  • the intra-group screen CT24 indicates an intra-group screen corresponding to the group B displayed on the member terminal 20 used by the member B-4, and indicates the same content as the intra-group screen CT21.
  • an area AR24 of the intra-group screen CT24 is a remarks display area, and displays information similar to the area AR21 of the intra-group screen CT21.
  • an area other than the area AR24 of the intra-group screen CT24 is a material area, and displays information similar to that of the intra-group screen CT21.
  • the group GPD illustrated in FIG. 2 indicates a group D to which the members D-1 to D-4 belong.
  • the intra-group screen CT41 shows an intra-group screen corresponding to the group D displayed on the member terminal 20 used by the member D-1.
  • an area AR41 of the intra-group screen CT41 is an utterance display area, and the content uttered by the member in the group D is displayed.
  • an area other than the area AR41 of the intra-group screen CT41 is a material area, and a screen shared in the group D is displayed.
  • the intra-group screen CT44 indicates an intra-group screen corresponding to the group D displayed on the member terminal 20 used by the member D-4, and indicates the same content as the intra-group screen CT41.
  • an area AR44 of the intra-group screen CT44 is a remarks display area, and displays information similar to the area AR41 of the intra-group screen CT41.
  • an area other than the area AR44 of the intra-group screen CT44 is a material area, and displays information similar to that of the intra-group screen CT41.
  • the display example of FIG. 2 is merely an example, and any display mode can be adopted as the plural group list information CT1 as long as the entire group work can be grasped.
  • the remote lecture system 1 provides plural group list information CT1 which is an example of an overall grasp screen viewed by a lecturer or a facilitator, intra-group screens CT11 to CT44 which are examples of intra-group screens viewed by members of each group, and the like.
  • the keywords extracted from the utterance information discussed in each group and the information summarizing the contents thereof are displayed on the overall grasp screen.
  • the overall grasp screen also displays information indicating which of the members belonging to the group is expressing an opinion.
  • FIG. 3 is a diagram illustrating an example of display to the administrator.
  • FIG. 3 illustrates, as an example, a display on the terminal device 10 used by the lecturer FC for the group work for discussing the "2021 Olympics". Note that, in FIG. 3 , the same points as those in FIGS. 1 and 2 are denoted by the same reference numerals, and the description thereof will be omitted as appropriate.
  • FIG. 3 illustrates display contents of the group content information SM1 of the group A and the group content information SM2 of the group B among the group content information SM1 to SM4 displayed in the second areas of the areas AR1 to AR4 in FIG. 1 as an example.
  • the group content information SM1 indicates a case where information such as the number of appearances of each keyword in the conversation in the group A and the summary content of the conversation in the group A is displayed in a display mode of keyword/summary display.
  • the group content information SM1 indicates that the number of appearances of the keywords "corona" and "increase" is eight in the conversation in the group A, which is the maximum number of appearances.
  • the group content information SM1 indicates that the number of appearances of the keyword "postponement" is seven and the number of appearances of the keywords "holding" and "infected person" is four in the conversation in the group A.
  • the group content information SM1 indicates that the number of appearances of the keyword "economic effect" is three, and the number of appearances of the keywords "player" and "year 2022" is two in the conversation in the group A.
  • the group content information SM1 indicates that the summary of the conversation in the group A is "It is dangerous to hold the event in the corona crisis. Infected people may increase. Should we consider postponement?".
  • the remote meeting server 100 generates the above-described keyword and information indicating the number of appearances thereof by analyzing the log information of the conversation in the group A.
  • the remote meeting server 100 generates the above-described summary by analyzing the log information of the conversation in the group A.
  • the keyword extraction and the summary generation may be appropriately performed using various conventional techniques such as sentence analysis and natural language processing, and a detailed description thereof will be omitted.
  • the group content information SM2 indicates a case where information such as the number of appearances of each keyword in the conversation in the group B and the summary content of the conversation in the group B is displayed in a display mode of keyword/summary display.
  • the group content information SM2 indicates that the number of appearances of the keyword "countermeasure" is ten, which is the maximum number of appearances in the conversation in the group B.
  • the group content information SM2 indicates that the number of appearances of the keyword "corona" is nine and the number of appearances of the keyword "vaccine" is seven in the conversation in the group B.
  • the group content information SM2 indicates that the number of appearances of the keyword "medicine" is six and the number of appearances of the keywords "holding" and "IOC" is two in the conversation in the group B.
  • the group content information SM2 indicates that the summary of the conversation in the group B is "Vaccine development is important. Countermeasures should be taken to prevent corona from spreading. The problem is that no countermeasures have been taken in border control.".
  • the remote meeting server 100 generates the above-described keyword and information indicating the number of appearances thereof by analyzing the log information of the conversation in the group B.
  • the remote meeting server 100 generates the above-described summary by analyzing the log information of the conversation in the group B.
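  • A minimal sketch of how appearance counts like those in the group content information SM1 and SM2 could be computed from a conversation log; the keyword list and log lines below are invented for illustration, and real keyword extraction would rely on the conventional sentence-analysis techniques mentioned above.

    from collections import Counter

    def count_keyword_appearances(log_lines, keywords):
        counts = Counter()
        for line in log_lines:
            for kw in keywords:
                counts[kw] += line.count(kw)
        return counts

    log = [
        "It is dangerous to hold the event in the corona crisis.",
        "Infected people may increase. Should we consider postponement?",
    ]
    print(count_keyword_appearances(log, ["corona", "postponement", "holding"]))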
  • the discussion contents of the group A and the group B are illustrated as an example.
  • In the group A, it seems that there is a discussion on whether to hold or postpone the Olympics.
  • the group B seems to be talking about vaccine development and issues with infection spread.
  • the remote lecture system 1 can cause the administrator to confirm what is being spoken in each group.
  • the lecturer may send a comment to the group B so that the group B can also discuss whether or not to hold the Olympics.
  • the area AR2 of the group B is selected (clicked or the like) on the plural group list information CT1, which is the overall grasp screen, an intention to send a comment to the group B is indicated, and then the comment is sent only to the group B by speaking to the group B.
  • the remote lecture system 1 may provide the comment to the member terminal 20 in various modes, but this point will be described later.
  • the remote lecture system 1 solves a situation in which it is difficult for a lecturer or a facilitator to grasp a discussion content and a progress degree of each group in a group work performed in a remote environment.
  • With the remote lecture system 1, it is possible to get an overview of who is speaking, what is being said, and whether the discussion is actively performed in each group.
  • the remote lecture system 1 can provide an environment for a lecturer or a facilitator to smoothly and efficiently progress a group work performed in a lecture or a brainstorming in a remote environment. Furthermore, the remote lecture system 1 can provide an environment in which lecture participants and brainstorming participants can have a thorough discussion. Furthermore, the remote lecture system 1 solves a situation in which it is difficult for a lecturer or a facilitator to provide information such as a comment to each group, but this point will be described later.
  • the remote lecture system 1 illustrated in FIG. 4 will be described.
  • the remote lecture system 1 includes a remote meeting server 100, a terminal device 10, and a plurality of member terminals 20.
  • the remote meeting server 100, the terminal device 10, and each of the plurality of member terminals 20 are communicably connected by wire or wirelessly via a predetermined communication network (network N).
  • FIG. 4 is a diagram illustrating a configuration example of a remote lecture system according to an embodiment. Note that, although only three member terminals 20 are illustrated in FIG. 4 , the remote lecture system 1 includes the member terminals 20 whose number is larger than or equal to the number of users participating in the remote meeting. Furthermore, the remote lecture system 1 illustrated in FIG. 4 may include a plurality of the remote meeting servers 100 and a plurality of the terminal devices 10.
  • the remote meeting server 100 is a computer used to provide a remote meeting service to the user.
  • the remote meeting server 100 is an information processing apparatus that provides a meeting service for performing group work or the like.
  • the remote meeting server 100 transmits, to the terminal device 10, plural group list information for displaying group content information regarding a content of a conversation in at least one or more groups among the plurality of groups in association with a corresponding group together with the plurality of groups.
  • the remote meeting server 100 transmits the comment to the member received from the terminal device 10 to the corresponding member terminal 20.
  • the remote meeting server 100 has a function of voice recognition.
  • the remote meeting server 100 has functions of natural language understanding (NLU) and automatic speech recognition (ASR).
  • the remote meeting server 100 may include software modules for voice signal processing, voice recognition, utterance semantic analysis, interaction control, and the like.
  • the remote meeting server 100 may convert the user's utterance into text, and estimate the user's utterance content using the converted utterance (that is, character information of the utterance).
  • the remote meeting server 100 may communicate with a speech recognition server having a function of natural language understanding and automatic speech recognition, and acquire utterance converted into text by the speech recognition server or information indicating estimated utterance content from the speech recognition server.
  • the terminal device 10 is a computer that displays, together with a plurality of groups, plural group list information that displays group content information regarding a content of a conversation in at least one or more groups among the plurality of groups in association with a corresponding group.
  • the terminal device 10 is a device used by a user (administrator) such as a lecturer or a facilitator who manages (facilitates) a remote meeting (meeting). For example, the terminal device 10 invites other users (members) to a meeting or divides other users (members) into groups at the time of group work according to an operation of a user who is an administrator (host).
  • the terminal device 10 outputs information regarding the remote meeting.
  • the terminal device 10 displays the image (video) of the remote meeting and outputs the voice of the remote meeting by voice.
  • the terminal device 10 transmits the utterance and image (video) of the user to the remote meeting server 100, and receives the voice and image (video) of the remote meeting from the remote meeting server 100.
  • the terminal device 10 transmits a comment to the member.
  • the terminal device 10 receives an input by the user.
  • the terminal device 10 receives a voice input by a user's utterance or an input by a user's operation.
  • the terminal device 10 may be any device as long as the processing in the embodiment can be realized.
  • the terminal device 10 may be any device as long as it has a function of performing information display, voice output, and the like of a remote meeting.
  • the terminal device 10 may be a device such as a notebook personal computer (PC), a tablet terminal, a desktop PC, a smartphone, a smart speaker, a television, a mobile phone, or a personal digital assistant (PDA).
  • the terminal device 10 may have a function of voice recognition such as natural language understanding and automatic voice recognition.
  • the terminal device 10 may convert the user's utterance into text, and estimate the user's utterance content using the converted utterance (that is, character information of the utterance).
  • the member terminal 20 is a device used by a user (member) participating in the remote meeting.
  • the member terminal 20 outputs information regarding the remote meeting.
  • the member terminal 20 displays the image (video) of the remote meeting and outputs the voice of the remote meeting by voice.
  • the member terminal 20 transmits the utterance and image (video) of the user to the remote meeting server 100, and receives the voice and image (video) of the remote meeting from the remote meeting server 100.
  • the member terminal 20 displays the intra-group screen.
  • the member terminal 20 receives a comment from an administrator such as a lecturer or a facilitator.
  • the member terminal 20 displays the received comment from the administrator.
  • the member terminal 20 receives an input from the user.
  • the member terminal 20 receives a voice input by user's utterance and an input by user's operation.
  • the member terminal 20 may be any device as long as the processing in the embodiment can be realized.
  • the member terminal 20 may be any device as long as it has a function of performing information display, voice output, and the like of a remote meeting.
  • the member terminal 20 may be a device such as a notebook PC, a tablet terminal, a desktop PC, a smartphone, a smart speaker, a television, a mobile phone, or a PDA.
  • the member terminal 20 may have a function of voice recognition such as natural language understanding and automatic voice recognition.
  • the member terminal 20 may convert the user's utterance into text, and estimate the user's utterance content using the converted utterance (that is, character information of the utterance).
  • the member terminal 20 may be any device as long as it can participate in a meeting and perform processing as a member other than an administrator (host) such as a lecturer.
  • the member terminal 20 is similar to the terminal device 10 in performing input and output related to a remote meeting except for input and output specific to an administrator such as a lecturer.
  • the functional configuration of the member terminal 20 is also similar to that of the terminal device 10 illustrated in FIG. 8 , and thus a detailed description of the functional configuration will be omitted.
  • FIG. 5 is a diagram illustrating a configuration example of a remote meeting server according to an embodiment of the present disclosure.
  • the remote meeting server 100 includes a communication unit 110, a storage unit 120, and a control unit 130.
  • the remote meeting server 100 may include an input unit (for example, a keyboard, a mouse, or the like) that receives various operations from an administrator or the like of the remote meeting server 100, and a display unit (for example, a liquid crystal display or the like) for displaying various information.
  • the communication unit 110 is realized by, for example, a network interface card (NIC) or the like. Then, the communication unit 110 is connected to the network N (see FIG. 4 ) in a wired or wireless manner, and transmits and receives information to and from other information processing apparatuses such as the terminal device 10 and the member terminal 20. Furthermore, the communication unit 110 may transmit and receive information to and from a user terminal (not illustrated) used by the user.
  • the storage unit 120 is realized by, for example, a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or an optical disk. As illustrated in FIG. 5 , the storage unit 120 according to the embodiment includes a user information storage unit 121, a group information storage unit 122, and a history information storage unit 123.
  • the user information storage unit 121 stores various types of information regarding the user. For example, the user information storage unit 121 stores information on the user participating in the remote meeting. The user information storage unit 121 stores attribute information and the like of each user. The user information storage unit 121 stores user information corresponding to information for identifying each user (user ID or the like) in association with each other.
  • the group information storage unit 122 stores various types of information regarding the group.
  • the group information storage unit 122 stores various types of information regarding a plurality of groups corresponding to each of the plurality of groups in the remote meeting.
  • FIG. 6 is a diagram illustrating an example of a group information storage unit according to the embodiment of the present disclosure.
  • the group information storage unit 122 includes items such as "group ID”, "group content information", and "member”.
  • the "group ID” indicates identification information for identifying a group.
  • the “group content information” indicates group content information in the corresponding group.
  • the “group content information” indicates group content information based on the dialogue in the group.
  • FIG. 6 illustrates an example in which conceptual information such as "SM1" is stored in the "group content information", but in practice, specific information such as the number of appearances of each keyword and summary content is stored.
  • the "member” indicates a member of the group.
  • information (user ID or the like) for identifying a member (user) belonging to a group is stored.
  • the group content information of the group (group A) identified by the group ID "GP1" is the group content information SM1. Furthermore, it is indicated that a user (member A-1) identified by the user ID "A-1”, a user (member A-2) identified by the user ID “A-2”, a user (member A-3) identified by the user ID "A-3”, a user (member A-4) identified by the user ID "A-4", and the like belong to the group A as members.
  • The group information storage unit 122 is not limited to the above, and may store various types of information depending on the purpose.
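  • As a concrete (but purely illustrative) picture of the items in FIG. 6, the group information storage unit 122 could be modeled as records keyed by group ID, each holding group content information and the user IDs of its members; the literal values below are placeholders.

    group_information = {
        "GP1": {  # group A
            "group_content_information": "SM1",  # in practice: keyword counts and summary
            "members": ["A-1", "A-2", "A-3", "A-4"],
        },
        "GP2": {  # assumed here to be group B
            "group_content_information": "SM2",
            "members": ["B-1", "B-2", "B-3", "B-4"],
        },
    }

    def members_of(group_id: str):
        return group_information[group_id]["members"]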
  • the history information storage unit 123 stores various types of information regarding the history of the past group works.
  • the history information storage unit 123 stores contents discussed in past lectures, brainstorming, and the like.
  • FIG. 7 is a diagram illustrating an example of a history information storage unit according to the embodiment of the present disclosure.
  • the history information storage unit 123 includes items such as "keyword”, "summary content", and "case example of comments".
  • the history information storage unit 123 is not limited to the above, and may store various types of information depending on the purpose.
  • the history information storage unit 123 may store a log ID for identifying each log.
  • the "keyword” indicates a keyword corresponding to the summary content and the case example of comments.
  • the "keyword” is a keyword designated by an administrator such as a lecturer or a keyword extracted from a dialogue.
  • the "summary content” indicates a summary of content related to a keyword in a group dialogue.
  • the "case example of comments” indicates a case example of comments made by an administrator such as a lecturer for the corresponding keyword and summary content.
  • the example illustrated in FIG. 7 indicates that the summary content of the keywords "holding" and "postponement" is "It is dangerous to hold the event during the corona crisis. Should we consider postponement?". That is, it is indicated that the summary of the content of the dialogue in the group with respect to the keywords "holding" and "postponement" is "It is dangerous to hold the event during the corona crisis. Should we consider postponement?".
  • the summary content for the keyword "medical care" is "Medical sites are under strain.". That is, it indicates that the summary of the content of the dialogue in the group for the keyword "medical care" is "Medical sites are under strain.".
  • The history information storage unit 123 is not limited to the above, and may store various types of information depending on the purpose.
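  • Likewise, the items in FIG. 7 suggest a layout such as the following for the history information storage unit 123; the comment examples shown are invented placeholders, while the summary contents echo the examples above.

    history_information = [
        {
            "keyword": ["holding", "postponement"],
            "summary_content": "It is dangerous to hold the event during the corona "
                               "crisis. Should we consider postponement?",
            "comment_example": "(past comment made by the lecturer for this topic)",
        },
        {
            "keyword": ["medical care"],
            "summary_content": "Medical sites are under strain.",
            "comment_example": "(past comment made by the lecturer for this topic)",
        },
    ]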
  • the storage unit 120 may store various types of information other than the above.
  • the storage unit 120 stores various types of information regarding the remote meeting.
  • the storage unit 120 stores various data for providing output data to the terminal device 10 and the member terminal 20.
  • the storage unit 120 stores various types of information used for generating information to be displayed on the terminal device 10 and the member terminal 20.
  • the storage unit 120 stores information regarding content displayed by an application (a remote meeting application or the like) installed in the terminal device 10 and the member terminal 20.
  • the storage unit 120 stores information regarding the content displayed by the remote meeting application.
  • the storage unit 120 may store various types of information used for providing the remote meeting service to the user.
  • the storage unit 120 stores information of a voice recognition application (program) that realizes a voice recognition function.
  • the terminal device 10 can execute voice recognition by activating a voice recognition application (also simply referred to as "voice recognition").
  • the storage unit 120 stores various types of information used for voice recognition.
  • the storage unit 120 stores information of a dictionary (voice recognition dictionary) used for voice recognition.
  • the storage unit 120 stores information on a plurality of voice recognition dictionaries.
  • the control unit 130 is implemented by, for example, a central processing unit (CPU), a micro processing unit (MPU), or the like executing a program (for example, an information processing program or the like according to the present disclosure) stored inside the remote meeting server 100 using a random access memory (RAM) or the like as a work area.
  • the control unit 130 is realized by, for example, an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • the control unit 130 includes an acquisition unit 131, a processing unit 132, an extraction unit 133, a generation unit 134, and a transmission unit 135, and realizes or executes a function and an action of information processing described below.
  • the internal configuration of the control unit 130 is not limited to the configuration illustrated in FIG. 5 , and may be another configuration as long as information processing to be described later is performed.
  • a connection relationship among the processing units included in the control unit 130 is not limited to the connection relationship illustrated in FIG. 5 , and may be another connection relationship.
  • the acquisition unit 131 acquires various information.
  • the acquisition unit 131 acquires various information from an external information processing apparatus.
  • the acquisition unit 131 acquires various types of information from the terminal device 10 and the member terminal 20.
  • the acquisition unit 131 acquires information detected by the terminal device 10 from the terminal device 10.
  • the acquisition unit 131 acquires information detected by the member terminal 20 from the member terminal 20.
  • the acquisition unit 131 acquires various information from the storage unit 120.
  • the acquisition unit 131 acquires information regarding the remote meeting.
  • the acquisition unit 131 acquires information such as an utterance or an image of the user.
  • the acquisition unit 131 receives information such as an utterance and an image of the user who uses the terminal device 10 from the terminal device 10.
  • the acquisition unit 131 receives information such as an utterance or an image of the user who uses the member terminal 20 from the member terminal 20.
  • the acquisition unit 131 acquires group content information regarding the content of a conversation in at least one or more groups among a plurality of groups in which a plurality of users participating in a remote meeting are divided and can have a conversation in each group.
  • the acquisition unit 131 acquires the group content information from the group information storage unit 122.
  • the processing unit 132 executes various processes.
  • the processing unit 132 executes image processing.
  • the processing unit 132 executes processing related to voice recognition.
  • the processing unit 132 executes voice recognition processing using the information stored in the storage unit 120.
  • the processing unit 132 converts the utterance of the user into a text by converting the utterance of the user into character information.
  • the processing unit 132 can be realized by using an existing utterance semantic analysis technology.
  • the processing unit 132 analyzes the content of the utterance of the user.
  • the processing unit 132 estimates the content of the user's utterance by analyzing the user's utterance using various conventional technologies as appropriate. For example, the processing unit 132 analyzes the content of the user's utterance by the functions of natural language understanding (NLU) and automatic speech recognition (ASR).
  • the processing unit 132 estimates (specifies) the content of the utterance of the user by semantic analysis using character information corresponding to the utterance of the user. For example, the processing unit 132 estimates the content of the utterance of the user corresponding to the character information by analyzing the character information appropriately using various conventional techniques such as syntax analysis.
  • the processing unit 132 executes processing related to data holding.
  • the processing unit 132 accumulates information such as the image/audio and the voice recognition result transmitted from each of the terminal device 10 and the member terminal 20.
  • the processing unit 132 stores, in the storage unit 120, information such as image/audio and the voice recognition result transmitted from each of the terminal device 10 and the member terminal 20.
  • the processing unit 132 executes keyword extraction processing.
  • the processing unit 132 extracts a keyword on the basis of the conversation of the group.
  • the processing unit 132 extracts the designated keyword or the utterance corresponding to the extracted keyword from the utterances of the members in the group.
  • the processing unit 132 searches the utterance history of the member in the group using the designated keyword or the extracted keyword as the extraction keyword, and extracts an utterance including a character string matching the extraction keyword.
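  • A short sketch of this keyword-based extraction, assuming a simple substring match over the utterance history; the data shapes are illustrative.

    from typing import Iterable, List

    def extract_matching_utterances(utterance_history: Iterable[dict],
                                    extraction_keywords: List[str]) -> List[dict]:
        # Keep utterances whose text contains a character string matching a keyword.
        matches = []
        for utterance in utterance_history:
            if any(kw in utterance["text"] for kw in extraction_keywords):
                matches.append(utterance)
        return matches

    history = [
        {"member": "A-1", "text": "It is dangerous to hold the event in the corona crisis."},
        {"member": "A-2", "text": "Let us move on to the next agenda item."},
    ]
    print(extract_matching_utterances(history, ["corona", "postponement"]))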
  • the processing unit 132 executes summary generation processing.
  • the processing unit 132 selects an utterance corresponding to the keyword from among the utterances of the members in the group, and generates a summary on the basis of the selected utterance.
  • the processing unit 132 selects an utterance corresponding to the keyword from among the utterances of the members in the group, and uses a sentence of the selected utterance as a sentence of the summary. For example, the processing unit 132 may select an utterance including a predetermined number or more of keywords from among the utterances of the members in the group, and use a sentence of the selected utterance as a sentence of the summary. Furthermore, the processing unit 132 may select an utterance including a predetermined number or more of keywords from among the utterances of the members in the group, and generate a summary using a sentence of the selected utterance. In this case, the processing unit 132 generates a summary by using the sentence of the selected utterance by a technology related to summary generation.
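  • The selection rule above can be sketched as follows; the threshold of two keywords and the simple joining of selected sentences are assumptions, and an actual system could instead feed the selected utterances into a summary-generation technique.

    from typing import List

    def generate_summary(utterances: List[str], keywords: List[str],
                         min_keywords: int = 2) -> str:
        # Select utterances containing a predetermined number or more of keywords
        # and use their sentences as the summary.
        selected = [u for u in utterances
                    if sum(1 for kw in keywords if kw in u) >= min_keywords]
        return " ".join(selected)

    print(generate_summary(
        ["It is dangerous to hold the event in the corona crisis.",
         "Infected people may increase.",
         "Should we consider postponement?"],
        ["corona", "hold", "increase"],
    ))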
  • the extraction unit 133 searches for various types of information and extracts information.
  • the extraction unit 133 extracts a target comment from a history of comments made in the past.
  • the extraction unit 133 extracts the target comment corresponding to the target group on the basis of the information regarding the target group to be assisted.
  • the extraction unit 133 extracts the target comment by searching the history with the designated keyword.
  • the extraction unit 133 extracts the target comment on the basis of the similarity between each comment in the history and the keyword.
  • the extraction unit 133 extracts the target comment on the basis of a comparison between the similarity between each comment in the history and the keyword and a designated threshold value.
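  • One way to picture this retrieval of assist comments: score each past comment against the designated keyword and keep those whose similarity reaches the designated threshold. The token-overlap similarity below is only one assumed choice of measure.

    from typing import List

    def similarity(comment: str, keyword: str) -> float:
        comment_tokens = set(comment.lower().split())
        keyword_tokens = set(keyword.lower().split())
        if not comment_tokens or not keyword_tokens:
            return 0.0
        return len(comment_tokens & keyword_tokens) / len(keyword_tokens)

    def extract_target_comments(history: List[str], keyword: str,
                                threshold: float = 0.5) -> List[str]:
        return [c for c in history if similarity(c, keyword) >= threshold]

    past_comments = [
        "Please consider postponement as well as holding the event.",
        "Try to summarize the main risks first.",
    ]
    print(extract_target_comments(past_comments, "postponement holding"))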
  • the generation unit 134 generates various types of information.
  • the generation unit 134 generates various types of information on the basis of information from an external information processing apparatus or information stored in the storage unit 120.
  • the generation unit 134 generates various types of information on the basis of information from other information processing apparatuses such as the terminal device 10 and the member terminal 20.
  • the generation unit 134 generates various types of information on the basis of information stored in the user information storage unit 121, the group information storage unit 122, or the history information storage unit 123.
  • the generation unit 134 generates various types of information to be displayed on the terminal device 10 or the member terminal 20 on the basis of the information determined by the processing unit 132.
  • the generation unit 134 executes various processing related to an image to be provided to the terminal device 10 or the member terminal 20.
  • the generation unit 134 arranges the voice and the image of the group in which the user of each member terminal 20 participates as a member into information in a necessary output form.
  • the generation unit 134 generates output data to be provided to the member terminal 20 by using the adjusted parameter.
  • the generation unit 134 generates output data used for information output of the remote meeting in the member terminal 20. For example, the generation unit 134 generates output data including parameters indicating the volume of each group, the arrangement position and size of an image of each group, and the like.
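  • As an illustration of such output data, per-group parameters might look like the following; every field name and value here is an assumption made for this sketch.

    output_data = {
        "groups": {
            "A": {"volume": 1.0, "image_position": (0, 0),   "image_size": (640, 360)},
            "B": {"volume": 0.2, "image_position": (640, 0), "image_size": (320, 180)},
        }
    }

    def volume_for(group_id: str) -> float:
        # Volume at which the receiving terminal plays the group's audio.
        return output_data["groups"][group_id]["volume"]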
  • the generation unit 134 may generate a display screen (content) to be displayed on the terminal device 10 or the member terminal 20 as output data.
  • the generation unit 134 may generate a screen (content) to be provided to the terminal device 10 or the member terminal 20 by appropriately using various technologies such as Java (registered trademark).
  • the generation unit 134 may generate a screen (content) to be provided to the terminal device 10 or the member terminal 20 on the basis of a format such as CSS, JavaScript (registered trademark), or HTML.
  • the generation unit 134 may generate a screen (content) in various formats such as joint photographic experts group (JPEG), graphics interchange format (GIF), and portable network graphics (PNG).
  • the transmission unit 135 functions as an output unit that executes output processing.
  • the transmission unit 135 transmits information to the terminal device 10.
  • the transmission unit 135 transmits the information generated by the generation unit 134 to the terminal device 10.
  • the transmission unit 135 transmits the output data generated by the generation unit 134 to the terminal device 10.
  • the transmission unit 135 transmits information to the member terminal 20.
  • the transmission unit 135 transmits the information generated by the generation unit 134 to the member terminal 20.
  • the transmission unit 135 transmits the output data generated by the generation unit 134 to the member terminal 20.
  • the transmission unit 135 outputs the plural group list information displaying the group content information in association with the corresponding group together with the plurality of groups.
  • the transmission unit 135 transmits the plural group list information to the terminal device 10 used by the administrator (lecturer or the like) who manages the meeting.
  • the transmission unit 135 transmits assist information for making a comment to at least one or more groups among the plurality of groups to the terminal device 10.
  • the transmission unit 135 transmits information regarding comments made in the past to the terminal device 10 as the assist information.
  • FIG. 8 is a diagram illustrating a configuration example of the terminal device according to an embodiment of the present disclosure.
  • the terminal device 10 includes a communication unit 11, a voice input unit 12, a voice output unit 13, a camera 14, a display unit 15, an operation unit 16, a storage unit 17, and a control unit 18.
  • the communication unit 11 is realized by, for example, an NIC, a communication circuit, or the like. Then, the communication unit 11 is connected to a predetermined communication network (network) in a wired or wireless manner, and transmits and receives information to and from an external information processing apparatus. For example, the communication unit 11 is connected to a predetermined communication network in a wired or wireless manner, and transmits and receives information to and from the remote meeting server 100.
  • the voice input unit 12 functions as an input unit that receives an operation by a user's voice (utterance).
  • the voice input unit 12 is, for example, a microphone or the like, and detects the voice.
  • the voice input unit 12 detects user's utterance.
  • the voice input unit 12 receives a voice input indicating an operation related to the group displayed in the plural group list information. Note that the voice input unit 12 may have any configuration as long as it can detect user's utterance information necessary for processing.
  • the voice output unit 13 is realized by a speaker that outputs a voice, and is an output device for outputting various types of information as a voice.
  • the voice output unit 13 voice-outputs the content provided from the remote meeting server 100.
  • the voice output unit 13 outputs a voice corresponding to the information displayed on the display unit 15.
  • the terminal device 10 inputs and outputs a voice using the voice input unit 12 and the voice output unit 13.
  • the camera 14 includes an image sensor that detects an image.
  • the camera 14 captures an image of the user participating in the remote meeting.
  • the camera 14 may be built in the terminal device 10 and disposed on an upper part of the display unit 15.
  • the camera 14 may be an in-camera built in the terminal device 10.
  • the display unit 15 is a display screen of a tablet terminal or the like realized by, for example, a liquid crystal display, an organic electro-luminescence (EL) display, or the like, and is a display device for displaying various types of information.
  • the display unit 15 displays various types of information regarding the remote meeting.
  • the display unit 15 displays content.
  • the display unit 15 displays various types of information received from the remote meeting server 100.
  • the display unit 15 outputs the information on the remote meeting received from the remote meeting server 100.
  • the display unit 15 displays, together with a plurality of groups in which a plurality of users participating in a remote meeting are divided and can have a conversation in each group, plural group list information that displays group content information regarding a content of a conversation in at least one or more groups among the plurality of groups in association with a corresponding group.
  • the display unit 15 changes the display mode according to the user's operation received by the voice input unit 12 or the operation unit 16. In a case where an operation of displaying the assist information to the user is received by the voice input unit 12 or the operation unit 16, the display unit 15 displays the assist information. The display unit 15 displays information regarding the past comment as the assist information.
  • the display unit 15 changes the display mode of the group content information.
  • the display unit 15 displays the plural group list information in which the group content information is displayed in association with the corresponding group together with the plurality of groups.
  • the display unit 15 displays assist information for making a comment to at least one or more groups among the plurality of groups.
  • the display unit 15 displays information regarding comments made in the past as the assist information.
  • the operation unit 16 functions as an input unit that receives various user operations.
  • the operation unit 16 receives an operation related to the group displayed in the plural group list information from the user who uses the terminal device 10.
  • the operation unit 16 is a keyboard, a mouse, or the like.
  • the operation unit 16 may have a touch panel capable of realizing functions equivalent to those of a keyboard and a mouse.
  • the operation unit 16 receives various operations from the user via the display screen by a function of a touch panel realized by various sensors.
  • the operation unit 16 receives various operations from the user via the display unit 15.
  • the operation unit 16 receives an operation such as a designation operation by the user via the display unit 15 of the terminal device 10.
  • a capacitance method is mainly adopted in the tablet terminal, but any method may be adopted as long as the user's operation can be detected and the function of the touch panel can be realized, such as a resistive film method, a surface acoustic wave method, an infrared method, and an electromagnetic induction method, which are other detection methods.
  • the terminal device 10 is not limited to the above, and may have a configuration of receiving (detecting) various information as an input.
  • the terminal device 10 may have a line-of-sight sensor that detects the line of sight of the user.
  • the line-of-sight sensor detects the line-of-sight direction of the user using an eye tracking technology on the basis of detection results of the camera 14, the optical sensor, the motion sensor (all not illustrated), and the like mounted on the terminal device 10, for example.
  • the line-of-sight sensor determines a gaze region at which the user is gazing on the screen on the basis of the detected line-of-sight direction.
  • the line-of-sight sensor may transmit line-of-sight information including the determined gaze region to the remote meeting server 100.
  • the terminal device 10 may include a motion sensor that detects a gesture or the like of the user.
  • the terminal device 10 may receive an operation by a gesture of the user by the motion sensor.
  • the storage unit 17 is realized by, for example, a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the storage unit 17 stores, for example, various types of information received from the remote meeting server 100.
  • the storage unit 17 stores, for example, information regarding an application (for example, a remote meeting application or the like) installed in the terminal device 10, for example, a program or the like.
  • the storage unit 17 stores user information.
  • the storage unit 17 stores an utterance history (a history of a speech recognition result) and an action history of the user.
  • the control unit 18 is a controller, and is implemented by, for example, a CPU, an MPU, or the like executing various programs stored in a storage device such as the storage unit 17 inside the terminal device 10 using a RAM as a work area.
  • the various programs include a program of an application (for example, a remote meeting application) that performs information processing.
  • the control unit 18 is a controller, and is realized by, for example, an integrated circuit such as an ASIC or an FPGA.
  • the control unit 18 includes an acquisition unit 181, a transmission unit 182, a reception unit 183, and a processing unit 184, and implements or executes a function and an action of information processing described below.
  • the internal configuration of the control unit 18 is not limited to the configuration illustrated in FIG. 8 , and may be another configuration as long as information processing to be described later is performed.
  • the connection relationship among the processing units included in the control unit 18 is not limited to the connection relationship illustrated in FIG. 8 , and may be another connection relationship.
  • the acquisition unit 181 acquires various types of information. For example, the acquisition unit 181 acquires various types of information from an external information processing apparatus. For example, the acquisition unit 181 stores the acquired various types of information in the storage unit 17. The acquisition unit 181 acquires user's operation information received by the operation unit 16.
  • the acquisition unit 181 acquires utterance information of the user.
  • the acquisition unit 181 acquires the utterance information of the user detected by the voice input unit 12.
  • the transmission unit 182 transmits information to the remote meeting server 100 via the communication unit 11.
  • the transmission unit 182 transmits information regarding the remote meeting to the remote meeting server 100.
  • the transmission unit 182 transmits information input by user's utterance, operation, or the like.
  • the transmission unit 182 transmits a comment by an administrator such as a lecturer or a facilitator.
  • the reception unit 183 receives information from the remote meeting server 100 via the communication unit 11.
  • the reception unit 183 receives information provided by the remote meeting server 100.
  • the reception unit 183 receives content from the remote meeting server 100.
  • the processing unit 184 executes various processes.
  • the processing unit 184 executes processing according to the user's operation received by the voice input unit 12 or the operation unit 16. In a case where an operation of giving a comment to at least one or more groups among the plurality of groups is received by the voice input unit 12 or the operation unit 16, the processing unit 184 executes processing of transmitting a comment to a group corresponding to the operation. In a case where an operation of giving a comment to all of the plurality of groups is received by the voice input unit 12 or the operation unit 16, the processing unit 184 executes processing of transmitting the comment to all of the plurality of groups.
  • the processing unit 184 displays various types of information via the display unit 15.
  • the processing unit 184 functions as a display control unit that controls display on the display unit 15.
  • the processing unit 184 outputs various types of information by voice via the voice output unit 13.
  • the processing unit 184 functions as a sound output control unit that controls sound output of the voice output unit 13.
  • the processing unit 184 causes the display unit 15 to display, together with a plurality of groups in which a plurality of users participating in a remote meeting are divided and can have a conversation in each group, plural group list information that displays group content information regarding a content of a conversation in at least one or more groups among the plurality of groups in association with a corresponding group.
  • the processing unit 184 changes the display mode according to the user's operation received by the voice input unit 12 or the operation unit 16. In a case where an operation of displaying the assist information to the user is received by the voice input unit 12 or the operation unit 16, the processing unit 184 displays the assist information on the display unit 15. The processing unit 184 causes the display unit 15 to display information regarding the past comment as the assist information.
  • the processing unit 184 causes the display unit 15 to change the display mode of the group content information.
  • the processing unit 184 causes the display unit 15 to display the plural group list information in which the group content information is displayed in association with the corresponding group together with the plurality of groups.
  • the processing unit 184 causes the display unit 15 to display assist information for making a comment to at least one or more groups among the plurality of groups.
  • the processing unit 184 causes the display unit 15 to display information regarding comments made in the past as the assist information.
  • the processing unit 184 outputs the information received by the acquisition unit 181.
  • the processing unit 184 outputs the content provided from the remote meeting server 100.
  • the processing unit 184 outputs the content received by the acquisition unit 181 via the voice output unit 13 or the display unit 15.
  • the processing unit 184 displays content via the display unit 15.
  • the processing unit 184 outputs the content by voice via the voice output unit 13.
  • the processing unit 184 transmits various types of information to an external information processing apparatus via the communication unit 11.
  • the processing unit 184 transmits various types of information to the remote meeting server 100.
  • the processing unit 184 transmits various types of information stored in the storage unit 17 to an external information processing apparatus.
  • the processing unit 184 transmits various types of information acquired by the acquisition unit 181 to the remote meeting server 100.
  • the processing unit 184 transmits the sensor information acquired by the acquisition unit 181 to the remote meeting server 100.
  • the processing unit 184 transmits the operation information of the user received by the operation unit 16 to the remote meeting server 100.
  • the processing unit 184 transmits information such as an utterance and an image of the user who uses the terminal device 10 to the remote meeting server 100.
  • each processing by the control unit 18 described above may be realized by, for example, JavaScript (registered trademark) or the like.
  • each unit of the control unit 18 may be realized by, for example, a predetermined application.
  • processing such as information processing by the control unit 18 may be realized by control information received from an external information processing apparatus.
  • the control unit 18 may include, for example, an application control unit that controls a predetermined application or a dedicated application.
  • FIG. 9 is a flowchart illustrating a processing procedure of the information processing apparatus according to the embodiment of the present disclosure. Specifically, FIG. 9 is a flowchart illustrating a procedure of information processing by the remote meeting server 100 which is an example of the information processing apparatus.
  • the remote meeting server 100 acquires group content information regarding a content of a conversation in at least one or more groups among a plurality of groups in which a plurality of participants who participate in a remote meeting are divided and can have a conversation in each group (Step S101).
  • the remote meeting server 100 transmits the plural group list information that displays the group content information in association with the corresponding group together with the plurality of groups (Step S102).
  • the remote meeting server 100 transmits, to the terminal device 10, the plural group list information for displaying the group content information in association with the corresponding group together with the plurality of groups.
  • the terminal device 10 displays the plural group list information in which the group content information is displayed in association with the corresponding group together with the plurality of groups.
  • FIG. 10 is a flowchart illustrating a processing procedure related to the group work. Note that, in the following, a case where the remote lecture system 1 performs processing will be described as an example, but the processing illustrated in FIG. 10 may be performed by any device such as the remote meeting server 100 and the terminal device 10 according to a device configuration included in the remote lecture system 1.
  • the remote lecture system 1 causes the process to branch according to whether or not the group work is started (Step S201). In a case where the remote lecture system 1 does not start the group work (Step S201: No), the remote lecture system 1 ends the process related to the group work.
  • the remote lecture system 1 sets the group work (Step S202). For example, when starting a group work, the remote lecture system 1 sets the group work as illustrated in FIG. 22 .
  • In a case where the keyword and the threshold value are set (Step S203: Yes), the remote lecture system 1 reflects the set keyword and threshold value in the extraction processing (Step S204), and performs the processing of Step S205 and subsequent steps.
  • In a case where the keyword and the threshold value are not set (Step S203: No), the remote lecture system 1 performs the processing of Step S205 and subsequent steps without performing the processing of Step S204.
  • the remote lecture system 1 branches the processing according to whether or not the lecturer confirms the assist (Step S205).
  • In a case where the lecturer confirms the assist (Step S205: Yes), the remote lecture system 1 feeds back the corresponding content from held data (Step S206), and performs the processing of Step S207 and subsequent steps.
  • the remote meeting server 100 transmits the assist information to the terminal device 10 as feedback of the corresponding content from the held data.
  • In a case where the lecturer does not confirm the assist (Step S205: No), the remote lecture system 1 performs the processing of Step S207 and subsequent steps without performing the processing of Step S206.
  • the remote lecture system 1 branches the processing according to whether or not the lecturer sends the comment (Step S207).
  • In a case where the lecturer sends the comment (Step S207: Yes), the remote lecture system 1 feeds back the content to the participant side (Step S208), and performs the processing of Step S209 and subsequent steps.
  • the remote meeting server 100 transmits the comment of the lecturer to the member terminal 20 as feedback of the content to the participant side.
  • In a case where the lecturer does not send the comment (Step S207: No), the remote lecture system 1 performs the processing of Step S209 and subsequent steps without performing the processing of Step S208.
  • the remote lecture system 1 branches the processing according to whether or not an utterance is made in the group (Step S209).
  • In a case where an utterance is made in the group (Step S209: Yes), the remote lecture system 1 performs analysis processing of the utterance content (Step S210), and registers the processing result as held data (Step S211).
  • the remote lecture system 1 performs the processing of Step S212 and subsequent steps.
  • the remote meeting server 100 performs the analysis processing of the utterance content on the basis of the information of the utterance in the group, and registers the processing result as held data.
  • In a case where no utterance is made in the group (Step S209: No), the remote lecture system 1 performs the processing of Step S212 and subsequent steps without performing the processing of Steps S210 and S211.
  • the remote lecture system 1 branches the processing according to whether or not to end the group work (Step S212). For example, the remote lecture system 1 determines whether or not to end the group work using an arbitrary end condition such as a set time or a condition related to convergence of the group work. In a case of ending the group work (Step S212: Yes), the remote lecture system 1 ends the processing related to the group work.
  • In a case of not ending the group work (Step S212: No), the remote lecture system 1 returns to Step S205 and repeats the processing.
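  • The flow of Steps S201 to S212 described above can be summarized as a simple loop. The following Python skeleton is a sketch of that control flow only; the `checks` and `actions` callables are hypothetical stand-ins for the branch conditions and the server-side processing.

```python
def run_group_work(checks, actions, max_rounds=3):
    """Skeleton of the group-work loop following Steps S201-S212 in FIG. 10."""
    if not checks["started"]():                        # Step S201
        return
    actions["setup"]()                                 # Step S202 (see FIG. 22)
    if checks["keyword_set"]():                        # Step S203
        actions["apply_settings"]()                    # Step S204
    for _ in range(max_rounds):                        # bounded here only so the demo terminates
        if checks["assist"]():                         # Step S205
            actions["feed_back_assist"]()              # Step S206
        if checks["comment"]():                        # Step S207
            actions["send_comment"]()                  # Step S208
        if checks["utterance"]():                      # Step S209
            actions["register"](actions["analyze"]())  # Steps S210 and S211
        if checks["end"]():                            # Step S212
            break

# Demo wiring with trivial lambdas; a real system would plug in the server-side logic.
run_group_work(
    checks={"started": lambda: True, "keyword_set": lambda: True, "assist": lambda: False,
            "comment": lambda: False, "utterance": lambda: True, "end": lambda: True},
    actions={"setup": lambda: print("S202"), "apply_settings": lambda: print("S204"),
             "feed_back_assist": lambda: print("S206"), "send_comment": lambda: print("S208"),
             "analyze": lambda: "summary", "register": lambda r: print("S211:", r)},
)
```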
  • the display mode illustrated in FIG. 3 is merely an example, and the group content information may be displayed in various display modes.
  • the group content information, which is the information of the overall grasp screen, can be changed so that the log information can be visually recognized more easily. This point will be described below. Note that descriptions of the same points as those described above will be omitted as appropriate.
  • FIG. 11 is a diagram illustrating an example of display by a distribution graph.
  • the group content information SM5 illustrates an example of display of the group content information according to the display mode of the distribution graph.
  • Each circle in the group content information SM5 indicates the number of appearances of the keyword displayed in the circle. That is, the larger the size of the circle, the larger the number of appearances of the keyword. Furthermore, the distance between the circles indicates a keyword similarity relationship. For example, a distance between circles of keywords having high similarity is shortened.
  • the terminal device 10 displays the group content information SM5.
  • the terminal device 10 may display the group content information not only in characters but also in a graph form. By changing the display, the terminal device 10 can allow the administrator to easily (intuitively) grasp whether the discussion diverges or converges in the group, whether there is a difference in the degree of progress from another group, and the like.
  • the number of appearances of the extracted keywords is expressed by the size of a circle, and word vectors of the respective keywords are projected, so that words having a potentially close relationship are expressed in a form in which circles are close to each other.
  • the administrator can graphically grasp the contents of the discussion. By checking these pieces of information, the administrator can smoothly perform facilitation.
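  • A minimal sketch of how the distribution graph data could be prepared: circle sizes follow keyword appearance counts, and word vectors are projected to two dimensions so that similar keywords are placed close together. The vectors and counts below are made-up placeholders; a real system would take embeddings from a model such as Word2Vec and counts from the recognized utterances.

```python
import numpy as np

# Hypothetical word vectors and appearance counts.
vectors = {
    "corona":       np.array([0.9, 0.1, 0.0]),
    "vaccine":      np.array([0.8, 0.2, 0.1]),
    "postponement": np.array([0.1, 0.9, 0.2]),
    "holding":      np.array([0.2, 0.8, 0.1]),
}
counts = {"corona": 12, "vaccine": 5, "postponement": 9, "holding": 7}

# Project the vectors to 2D with PCA (via SVD) so that similar keywords land close together.
words = list(vectors)
X = np.stack([vectors[w] for w in words])
X_centered = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(X_centered, full_matrices=False)
coords = X_centered @ vt[:2].T

for word, (x, y) in zip(words, coords):
    radius = np.sqrt(counts[word])  # circle area roughly proportional to the number of appearances
    print(f"{word}: position=({x:+.2f}, {y:+.2f}), circle radius={radius:.2f}")
```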
  • FIG. 12 is a diagram illustrating an example of display by frequency time series.
  • the group content information SM6a indicates the number of appearances (frequency) of each keyword at the time point (10:00) corresponding to the point PT displayed on the time bar.
  • the waveform of the group content information SM6a indicates the frequency of each keyword. That is, the frequency of the keyword arranged on the wave is indicated. The higher the position of the waveform, the larger the number of appearances of the keyword.
  • each keyword is arranged on the basis of a similarity relationship. For example, keywords having high similarity are arranged at close positions.
  • the terminal device 10 displays the group content information SM6a.
  • the terminal device 10 can time-forward and display the group content information SM6a according to an operation of a user such as an administrator.
  • the administrator who uses the terminal device 10 may perform time forwarding (reproduction) by selecting the reproduction button PB or the like, or may perform the time forwarding by moving the point PT.
  • FIG. 12 illustrates a display example after the group content information SM6b is time-forwarded from the group content information SM6a.
  • the group content information SM6b indicates the number of appearances (frequency) of each keyword at the time point (10:10) corresponding to the point PT moved by the administrator who uses the terminal device 10. The other points are similar to those of the group content information SM6a.
  • FIG. 12 illustrates that the frequencies of the keywords "corona" and "postponement" are higher in the group content information SM6b than in the group content information SM6a.
  • the terminal device 10 may display the group content information such that a change with time can be confirmed. As a result, the terminal device 10 can cause the administrator to easily (intuitively) grasp the situation change in the group.
  • the remote lecture system 1 provides a diagram in which the type of the extracted keyword and the appearance frequency of the keyword can be easily grasped. Furthermore, the frequency time series display in FIG. 12 has a function capable of confirming a transition due to a time change, and the administrator such as a lecturer can confirm what kind of discussion flow is being performed by performing time forwarding (reproduction button). By checking these pieces of information, the administrator can smoothly perform facilitation.
  • the administrator can grasp whether the discussion contents of the entire group converge or diverge by referring to each display. For example, in a case where various keywords are widely dispersed, it can be seen that the discussion is widened. Furthermore, in a case where only one or surrounding keywords are intensively frequently used, it can be understood that the discussion is focused on a certain theme.
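  • A minimal sketch of the frequency time series display, assuming the utterance log is a list of time-stamped recognized texts: keyword frequencies are counted up to the time indicated by the point PT, so moving the point (time forwarding) corresponds to recounting with a later limit. The log contents and function name are illustrative.

```python
from collections import Counter
from datetime import datetime

# Hypothetical utterance log: (timestamp, recognized text).
log = [
    ("10:00", "I think corona makes postponement unavoidable"),
    ("10:03", "postponement would hurt the economy"),
    ("10:10", "corona infections are rising, postponement again came up"),
]
keywords = ["corona", "postponement", "economy"]

def frequencies_up_to(log, keywords, until):
    """Count keyword appearances in all utterances up to the time shown on the time bar."""
    limit = datetime.strptime(until, "%H:%M")
    counter = Counter()
    for stamp, text in log:
        if datetime.strptime(stamp, "%H:%M") <= limit:
            for kw in keywords:
                counter[kw] += text.count(kw)
    return counter

print(frequencies_up_to(log, keywords, "10:00"))  # state at the point PT
print(frequencies_up_to(log, keywords, "10:10"))  # state after time-forwarding
```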
  • FIG. 13 is a diagram illustrating an example of display by the radar chart.
  • the group content information SM7 indicates an example of display of the group content information according to the display mode of the radar chart.
  • the group content information SM7 shows a radar chart for five keywords of corona, vaccine, holding, postponement, and infection. The larger the value corresponding to each keyword, the larger the number of appearances of the keyword.
  • the terminal device 10 displays the group content information SM7.
  • by displaying the group content information in the form of a radar chart, the terminal device 10 can allow the administrator to easily (intuitively) grasp whether the discussion diverges or converges in the group, whether there is a difference in the degree of progress from another group, and the like.
  • FIG. 14 is a diagram illustrating an example of display by a line graph.
  • the group content information SM8 illustrates an example of display of the group content information according to the display mode of the line graph.
  • the group content information SM8 shows a line graph for five keywords of corona, vaccine, holding, postponement, and infection. The larger the value corresponding to each keyword, the larger the number of appearances of the keyword.
  • the terminal device 10 displays the group content information SM8.
  • the terminal device 10 can display the group content information such that a change with time can be confirmed by displaying the group content information in the form of a line graph. As a result, the terminal device 10 can cause the administrator to easily (intuitively) grasp the situation change in the group.
  • FIG. 15 is a diagram illustrating an example of display by a pie chart.
  • the group content information SM9 illustrates an example of display of the group content information according to the display mode of the pie chart.
  • the group content information SM9 shows a pie chart for five keywords of corona, vaccine, holding, postponement, and infection. The larger the ratio of the region corresponding to each keyword (the larger the angle), the larger the number of appearances of the keyword.
  • the terminal device 10 displays the group content information SM9.
  • the terminal device 10 can cause the administrator to easily (intuitively) grasp whether the discussion diverges or converges in the group, whether there is a difference in the degree of progress from another group, and the like.
  • FIG. 16 is a diagram illustrating an example of display by a bar graph.
  • the group content information SM10 illustrates an example of display of the group content information according to the display mode of the bar graph.
  • the group content information SM10 shows bar graphs for five keywords of corona, vaccine, holding, postponement, and infection. The larger the ratio of the region corresponding to each keyword, the larger the number of appearances of the keyword.
  • the terminal device 10 displays the group content information SM10.
  • the terminal device 10 can display the group content information such that a change with time can be confirmed by displaying a plurality of bar graphs arranged along time. As a result, the terminal device 10 can cause the administrator to easily (intuitively) grasp the situation change in the group.
  • the remote lecture system 1 performs corresponding processing according to an operation or the like by a lecturer.
  • FIG. 17 is a diagram illustrating an example of the plural group list information and the assist information.
  • an area AR5 which is an operation UI area of the plural group list information CT1 includes a plurality of operation buttons AR51 to AR54.
  • the terminal device 10 receives an operation of the lecturer on the plurality of operation buttons AR51 to AR54 and the like.
  • the operation button AR51 described as "line-of-sight correspondence On/Off" indicates whether or not to accept input by a line of sight. Note that a case of receiving an input by a line of sight will be described later.
  • an operation button AR52 described as "overall comment” is a button for transmitting a comment to all the members of all the groups. In a case of sending the comment to the entire group, the lecturer can collectively send the comment to all the groups by selecting the operation button AR52.
  • the terminal device 10 transmits, to the remote meeting server 100, comment transmission information including the comment of the lecturer, designation information indicating that the transmission destination of the comment is all groups (that is, all members), and character information indicating the comment to be transmitted to the members of all groups.
  • the remote meeting server 100 that has received the comment transmission information from the terminal device 10 transmits the comment indicated by the comment transmission information to the member terminals 20 of the members of all the groups.
  • the remote lecture system 1 can quickly provide information on contents common to all participants and time management to all participants.
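  • A minimal sketch of the overall-comment flow described above, assuming a simple JSON payload between the lecturer terminal and the server; the field names and the fan-out logic are illustrative assumptions, not the actual protocol of the remote lecture system 1.

```python
import json

def build_comment_transmission(comment: str, target="ALL") -> str:
    """Build the comment transmission information sent from the lecturer terminal
    to the server; field names are illustrative, not the actual format."""
    return json.dumps({
        "type": "lecturer_comment",
        "target_groups": target,   # "ALL" or a list such as ["group_A"]
        "text": comment,
    })

def fan_out(message: str, groups: dict) -> None:
    """Server-side sketch: deliver the comment to the member terminals of every targeted group."""
    payload = json.loads(message)
    targets = groups if payload["target_groups"] == "ALL" else payload["target_groups"]
    for group_id in targets:
        for terminal in groups[group_id]:
            print(f"send to {terminal} ({group_id}): {payload['text']}")

groups = {"group_A": ["member_1", "member_2"], "group_B": ["member_3"]}
fan_out(build_comment_transmission("10 minutes left, please wrap up."), groups)
```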
  • an operation button AR53 indicated as "display switching” is a button for switching the display.
  • the lecturer can switch the display content of the plural group list information CT1 by selecting the operation button AR53.
  • the lecturer designates a target group from the groups A to D, and selects the operation button AR53 to switch the display mode of the group content information of the target group.
  • the terminal device 10 receives the selection as the operation of designating the target group.
  • the terminal device 10 switches the display mode of the group content information SM1 of the group A. For example, in a case where the group content information SM1 is displayed in the keyword/summary display, the terminal device 10 switches the display mode of the group content information SM1 to the frequency time series, the distribution graph, or the like according to the selection of the operation button AR53. Note that the terminal device 10 may sequentially switch the display mode in a predetermined order according to the selection of the operation button AR53, or may receive designation as to which display mode to switch from the lecturer.
  • FIG. 18 is a diagram illustrating an example of providing a comment using assistance.
  • FIG. 19 is a diagram illustrating another example of the content of the assist.
  • the remote lecture system 1 can hold the contents discussed in the lecture, the brainstorming, or the like so far in the history information storage unit 123 or the like, for example.
  • the remote lecture system 1 can perform the assist to an administrator such as a lecturer by using the held data.
  • the assist is performed by searching for past content similar to what is currently being discussed and providing it as assistance for the current discussion.
  • the assistance to the member side such as the participant can be controlled by the lecturer side.
  • An operation button AR54 written as "assist search reference" in the area AR5 is a button for the lecturer to receive the assist of the comment.
  • the lecturer can search the comment record for the discussion from the past log information by selecting the operation button AR54, and can receive the assist on the basis of the search result.
  • the remote meeting server 100 generates information (also referred to as "assist information") for assisting the administrator such as the lecturer, and transmits the generated assist information to the terminal device 10. This point will be described below.
  • the remote lecture system 1 performs a search according to the selection of the lecturer. For example, the lecturer designates a group (target group) to be assisted. As a result, the remote lecture system 1 searches on the basis of the information of the target group to be assisted.
  • the terminal device 10 displays information for designating a search target such as keyword designation, similarity designation, and group designation. For example, the terminal device 10 displays information other than the case example of comments which is the search result (assist information) among the information indicated in the search information AS1. After inputting the information to be designated, the lecturer requests the remote meeting server 100 to perform a search by selecting a search button BT1 in the search information AS1.
  • the terminal device 10 requests the remote meeting server 100 to perform a search based on the keyword, the similarity, the group, and the like designated by the lecturer.
  • the terminal device 10 transmits designation information indicating the keyword, the similarity, the group, and the like designated by the lecturer to the remote meeting server 100, thereby requesting the remote meeting server 100 to perform search.
  • the remote meeting server 100 may extract (select) a keyword on the basis of a conversation in all groups and perform a search on the basis of the keyword.
  • the remote meeting server 100 may determine the keyword on the basis of the agenda of the group work and perform the search on the basis of the keyword.
  • the remote meeting server 100 may determine a keyword corresponding to the agenda of the group work using the information of the keyword list associated with the agenda, and perform search on the basis of the keyword.
  • the remote meeting server 100 may extract (select) a keyword on the basis of a conversation in the designated group, and perform search on the basis of the keyword. Note that the above is merely an example, and the remote meeting server 100 may determine the keyword using various information.
  • the remote meeting server 100 performs search using a predetermined setting value (default value) as a threshold value.
  • the remote meeting server 100 performs search using the designated similarity as a threshold value.
  • the remote meeting server 100 performs search using a default value (for example, 0.6 or the like) as a similarity threshold value on the basis of the designated keyword or the conversations of all the groups.
  • the terminal device 10 requests to perform the search on the basis of the conversation of the group A by transmitting the information designating the group A.
  • the remote lecture system 1 calculates the similarity of the keyword information and the summary content, searches for a comment example that can be a reference from the past data, and feeds back the comment example to the lecturer.
  • the remote meeting server 100 searches for past comments in the history information storage unit 123.
  • the remote meeting server 100 performs a search using a designated keyword, a keyword extracted from a conversation of a designated group, or the like as a search query (target keyword).
  • the remote meeting server 100 extracts keywords such as "postponement", "holding”, “medical care", “economic”, and the like on the basis of the conversation of the group A.
  • the remote meeting server 100 may use the designated keywords "postponement", "holding”, “medical care", “economic”, and the like.
  • the remote meeting server 100 calculates the similarity between the keyword associated with each past comment in the history information storage unit 123 and the search query (target keyword), and extracts the past comment of which the calculated similarity is greater than or equal to a threshold value.
  • the remote meeting server 100 vectorizes each keyword, and calculates cosine similarity between the vectors of the keywords as similarity between the keywords.
  • the remote meeting server 100 converts keywords into vectors using an arbitrary model (vector conversion model) such as Word2Vec or a bag of words (BoW).
  • the remote meeting server 100 extracts the past comment associated with the keyword whose similarity with the search query (target keyword) is equal to or greater than a threshold value. Note that the above is merely an example, and the remote meeting server 100 may calculate the similarity by appropriately using various types of information.
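  • A minimal sketch of the similarity-based extraction described above, using a crude bag-of-words representation of the keyword lists and cosine similarity against a threshold. Because this toy version only scores exact keyword overlap, related words such as "economic" and "economic effect" would not match; a real system using distributed word vectors such as Word2Vec would capture that kind of semantic closeness. The history records below are illustrative.

```python
import math
from collections import Counter
from typing import List

def bow_vector(keywords: List[str]) -> Counter:
    """Very small bag-of-words representation of a keyword list."""
    return Counter(keywords)

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def extract_past_comments(history, target_keywords, threshold=0.6):
    """Return past comments whose associated keywords are similar enough to the search query."""
    query = bow_vector(target_keywords)
    return [rec["comment"] for rec in history
            if cosine(bow_vector(rec["keywords"]), query) >= threshold]

history = [
    {"keywords": ["postponement", "holding"], "comment": "List merits and demerits when holding"},
    {"keywords": ["medical care", "infected person"], "comment": "What is needed to support?"},
    {"keywords": ["economic effect"], "comment": "What factors can be considered for economic promotion?"},
]
print(extract_past_comments(history, ["postponement", "holding", "medical care", "economic"], threshold=0.4))
```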
  • the remote meeting server 100 extracts past comments such as "List merits and demerits when holding", "What is needed to support?", and "What factors can be considered for economic promotion?”.
  • the remote meeting server 100 extracts past comments "List merits and demerits when holding” associated with the keywords “postponement” and "holding”.
  • the remote meeting server 100 extracts the past comment "What is needed to support?” associated with the keywords “medical care” and “infected person”.
  • the remote meeting server 100 extracts the past comment "What factors can be considered for economic promotion?” associated with the keyword “economic effect”.
  • the remote meeting server 100 transmits the extracted past comment to the terminal device 10 as assist information.
  • the remote meeting server 100 transmits search information AS1 including the extracted past comment as assist information to the terminal device 10.
  • the terminal device 10 that has received the search information AS1 displays the search information AS1.
  • the lecturer can confirm the past comment provided as the assist information.
  • the lecturer can obtain information from the searched result (assist information), generate a comment for the group, and provide the comment to the group.
  • the lecturer can transmit the comment to the group using the assist information as it is.
  • the lecturer side can control information to be sent to the group with a check box or the like.
  • the comment candidate designated in the check box is actually transmitted to the group as a comment.
  • FIG. 18 illustrates a case where the check boxes CK1 and CK2 are selected among the check boxes CK1 to CK3.
  • the terminal device 10 selects the past comment "List merits and demerits when holding” and the past comment "What is needed to support?” as the comment to be transmitted.
  • After designating the comment to be transmitted, the lecturer requests the remote meeting server 100 to transmit the comment by selecting the transmission button BT2 in the search information AS1.
  • the terminal device 10 requests the remote meeting server 100 to transmit the comment to be transmitted designated by the lecturer.
  • the terminal device 10 transmits designation information indicating the comment to be transmitted, the group, and the like designated by the lecturer to the remote meeting server 100, thereby requesting the remote meeting server 100 to transmit the comment.
  • the terminal device 10 requests the remote meeting server 100 to transmit two comments of the past comment "List merits and demerits when holding" corresponding to the check box CK1 and the past comment "What is needed to support?" corresponding to the check box CK2 to the group A.
  • the remote meeting server 100 transmits the comment to the member terminal 20 used by the member of the group (Step S21).
  • the remote meeting server 100 transmits the two comments of the past comment "List merits and demerits when holding” and the past comment "What is needed to support?" to the member terminal 20 of the group A as comments from the lecturer.
  • the remote meeting server 100 transmits the comment information CM1 indicating two lecturer comments of "List merits and demerits when holding" and "What is needed to support?" to the member terminal 20 of the group A.
  • the member terminal 20 that has received the comment information CM1 displays the comment information CM1.
  • the remote lecture system 1 can facilitate the comment by the administrator such as the lecturer to the member. Therefore, in the remote lecture system 1, even in a case where the lecturer is a new lecturer, it is possible to provide a group work close to that provided by an expert lecturer.
  • the lecturer himself/herself can learn by referring to the comments made by past lecturers using the past data. For example, in the remote lecture system 1, when the lecturer learns, the content of the registration data can be referred to by designating the keyword of the "assist search reference" without performing the group work.
  • the terminal device 10 designates the similarity threshold value as "0.4", and transmits information designating the group A, thereby requesting the remote meeting server 100 to perform search with the threshold value set to "0.4" on the basis of the conversation of the group A.
  • the remote meeting server 100 transmits search information AS2 including the extracted past comment as assist information to the terminal device 10.
  • the terminal device 10 that has received the search information AS2 displays the search information AS2.
  • in a case where the similarity threshold value is designated as "0.4", five past comments are extracted, which is more than the three extracted in a case where the search is performed with the default value indicated in the search information AS1.
  • the lecturer can obtain feedback of more candidates by lowering the threshold value of the similarity.
  • the remote meeting server 100 extracts past comments such as "List merits and demerits when holding", "What is needed to support?", and "What factors can be considered for economic promotion?”.
  • the remote meeting server 100 extracts past comments “List merits and demerits when holding", "What is important if you are in a management position?", and "What was necessary not to postpone?" associated with the keywords “postponement” and "holding”.
  • the remote meeting server 100 extracts the past comment "What is needed to support?” associated with the keywords “medical care” and "infected person”.
  • the remote meeting server 100 extracts the past comment "What factors can be considered for economic promotion?” associated with the keyword “economic effect”.
  • the lecturer selects a comment to be transmitted to the member by selecting a check box corresponding to the past comment to be the comment to be transmitted among the check boxes CK11 to CK15.
  • the remote lecture system 1 provides the member with the comment to be transmitted as the comment of the lecturer according to the selection of the lecturer.
  • FIG. 20 is a diagram illustrating an example of the assist contents.
  • the assist contents as illustrated in FIG. 20 are generated by the remote meeting server 100 on the basis of information such as a lecturer side data holding list.
  • the comment list CL1 in FIG. 20 is generated from past comments in the history information storage unit 123.
  • the remote meeting server 100 extracts the past comment from the history information storage unit 123 and generates a comment list CL1 using the extracted past comment. Since the past comment illustrated in the comment list CL1 is similar to the past comment illustrated in the search information AS2 of FIG. 19 , the description thereof will be omitted.
  • FIG. 21 is a diagram illustrating an example of keywords and summary contents.
  • FIG. 21 illustrates, as an example, a case where four keywords "postponement”, “holding”, “corona”, and “medical care” designated by a lecturer are targeted for the discussion regarding the 2021 Olympics.
  • a dialogue log DA1 of FIG. 21 illustrates a flow of dialogue in a group between 10:00 and 10:10 for the discussion about the 2021 Olympics.
  • the remote meeting server 100 collects information indicated in the dialogue log DA1.
  • the remote meeting server 100 generates list information of the keyword and the summary content associated with the time as illustrated in dialogue summary data DT1 on the basis of the dialogue log DA1 (Step S31).
  • the dialogue summary data DT1 indicates an example of a holding list of the participant side data.
  • the remote meeting server 100 generates the dialogue summary data DT1 by analyzing the dialogue log DA1 for four keywords "postponement", "holding”, “corona”, and "medical care” designated by the lecturer.
  • the remote meeting server 100 stores the dialogue log DA1 and the dialogue summary data DT1 in the storage unit 120.
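  • A rough sketch of turning a dialogue log into time-stamped keyword/summary rows like the dialogue summary data DT1. The summarization here is a naive placeholder that keeps any utterance containing a designated keyword; the real system generates proper summary sentences.

```python
# Hypothetical dialogue log: (time, recognized utterance).
dialogue_log = [
    ("10:02", "If corona keeps spreading, postponement may be unavoidable."),
    ("10:05", "Holding the games safely would need more medical care capacity."),
]
designated_keywords = ["postponement", "holding", "corona", "medical care"]

dialogue_summary_data = []
for time_stamp, utterance in dialogue_log:
    found = [kw for kw in designated_keywords if kw in utterance.lower()]
    if found:
        dialogue_summary_data.append({
            "time": time_stamp,
            "keywords": found,
            "summary": utterance,   # a real system would generate a shorter summary sentence here
        })

for row in dialogue_summary_data:
    print(row)
```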
  • FIG. 22 is a diagram illustrating an example of a flow of setting a group work.
  • FIG. 22 illustrates, as an example, a case where a lecturer starts the group work.
  • a first screen FS1 in FIG. 22 illustrates an initial screen.
  • the lecturer operates the terminal device 10 on which the first screen FS1 is displayed to set the number of groups and keywords.
  • a second screen FS2 in FIG. 22 indicates a screen on which the number of groups (four in the case of FIG. 22) is designated and on which member division is performed.
  • the lecturer operates the terminal device 10 on which the second screen FS2 is displayed to assign a member to each group.
  • a third screen FS3 in FIG. 22 illustrates a screen for setting a keyword.
  • the lecturer operates the terminal device 10 on which the third screen FS3 is displayed to set the keyword.
  • a fourth screen FS4 in FIG. 22 is a screen for setting a threshold value of keyword similarity.
  • the lecturer operates the terminal device 10 on which the fourth screen FS4 is displayed to set the similarity of the keywords.
  • when starting a group work, the lecturer can designate the number of groups, members to be included in each group, and keywords to be extracted (picked up) in the group work.
  • the keyword is not an essential designation item. In a case where no designation is made, the remote lecture system 1 automatically handles captured proper nouns and common nouns as keywords.
  • the remote lecture system 1 selects and extracts a word similar to the designated keyword (a similar word obtained using a degree of similarity using a word vector or the like) with the designated keyword as a center.
  • the remote lecture system 1 can also set a threshold value for determination as a similar word.
  • FIG. 22 is merely an example, and an arbitrary setting mode can be adopted for the setting of the group work.
  • a plurality of members participating in a group work may be automatically divided into a plurality of groups.
  • the member may change the group to which the member belongs.
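  • The group work settings gathered through the screens FS1 to FS4 can be pictured as a small configuration object plus an automatic member division step. The following sketch is illustrative only; the field names, the default threshold, and the round-robin division are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List
import random

@dataclass
class GroupWorkSettings:
    """Settings corresponding to the screens FS1-FS4; field names are illustrative."""
    number_of_groups: int
    members: Dict[str, List[str]] = field(default_factory=dict)  # group id -> member ids
    keywords: List[str] = field(default_factory=list)            # empty means automatic extraction
    similarity_threshold: float = 0.6                            # default value for similar-word judgment

def divide_members_automatically(participants: List[str], number_of_groups: int) -> Dict[str, List[str]]:
    """One possible automatic division: shuffle and deal participants round-robin."""
    shuffled = participants[:]
    random.shuffle(shuffled)
    groups = {f"group_{i + 1}": [] for i in range(number_of_groups)}
    for index, participant in enumerate(shuffled):
        groups[f"group_{index % number_of_groups + 1}"].append(participant)
    return groups

settings = GroupWorkSettings(
    number_of_groups=4,
    members=divide_members_automatically([f"member_{i}" for i in range(12)], 4),
    keywords=["postponement", "holding"],
    similarity_threshold=0.4,
)
print(settings)
```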
  • the operation on the terminal device 10 is not limited to the above-described keyboard, mouse, touch panel, voice input, and the like, and may be received by various input modes.
  • the terminal device 10 may receive an operation by the line of sight of the operator (user).
  • for designation of the operation buttons and the like, designation using a line of sight is also possible. This point will be described with reference to FIGS. 23 and 24.
  • FIGS. 23 and 24 are diagrams illustrating an example of a display mode. Note that description of the same points as those described above will be omitted as appropriate.
  • the line-of-sight information on a line of sight of the lecturer FC is acquired from the camera 14, and a pointer SP1 indicating a gaze point of the line of sight of the lecturer FC is displayed on the screen of the terminal device 10 based on the line-of-sight information.
  • the terminal device 10 displays the pointer SP1 superimposed on the overall grasp screen.
  • in the remote lecture system 1, various processes are executed according to the position of the pointer SP1, which is the line-of-sight pointer of the lecturer FC.
  • when the pointer SP1 is included in the area of any group, the voice of the lecturer FC is transmitted to the member terminals 20 of the group in which the pointer SP1 is located.
  • the comment may be sent from the lecturer FC to the group on the basis of the line-of-sight information of the lecturer FC.
  • the lecturer FC moves a position of a pointer SP2, which is the line-of-sight pointer, into the area AR1 of the group A, and utters the comment in a state where the area AR1 of the group A is selected (feedback such as highlight of the frame of the area AR1 is performed), so that the content is sent to the group A.
  • the terminal device 10 transmits the utterance information and designation information indicating that the designated group is the group A to the remote meeting server 100.
  • the remote meeting server 100 that has received the utterance information and the designation information transmits the utterance information to the member terminal 20 of the group A designated by the designation information. Then, the member terminal 20 that has received the utterance information outputs the utterance information.
  • the comment may be sent on the basis of the information regarding the line of sight of the lecturer FC.
  • the lecturer FC moves the position of the pointer SP2, which is the line-of-sight pointer, into the region 52 written as the "overall comment", and utters the comment in a state where the "overall comment” is selected (feedback such as highlight of the frame of the region 52 is performed), so that the content is sent to the entire group.
  • the voice of the lecturer FC is transmitted to the member terminals 20 of all the groups.
  • the above operation may be an operation as a touch panel.
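  • A minimal sketch of the line-of-sight routing described above, assuming the screen layout is known as a set of rectangular group areas and that the gaze point from the line-of-sight sensor is given in the same pixel coordinates; the area table and function names are hypothetical.

```python
from typing import Dict, Optional, Tuple

# Hypothetical screen layout: group id -> (left, top, right, bottom) of its area in pixels.
group_areas: Dict[str, Tuple[int, int, int, int]] = {
    "group_A": (0, 0, 640, 360),
    "group_B": (640, 0, 1280, 360),
    "ALL":     (0, 720, 1280, 780),   # region corresponding to the "overall comment" button
}

def group_under_gaze(gaze_point: Tuple[int, int]) -> Optional[str]:
    """Return the group whose area contains the line-of-sight pointer, if any."""
    x, y = gaze_point
    for group_id, (left, top, right, bottom) in group_areas.items():
        if left <= x < right and top <= y < bottom:
            return group_id
    return None

def route_utterance(gaze_point: Tuple[int, int], utterance: str) -> None:
    """Send the lecturer's utterance to the group selected by the line of sight."""
    target = group_under_gaze(gaze_point)
    if target is None:
        print("no group selected; utterance is not forwarded")
    elif target == "ALL":
        print(f"send to every group's member terminals: {utterance}")
    else:
        print(f"send to member terminals of {target}: {utterance}")

route_utterance((100, 100), "Please also discuss the economic impact.")
```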
  • FIG. 25 is a diagram illustrating an example of components of the remote lecture system. Note that the function of each element illustrated in FIG. 25 may be realized by any of various devices such as the remote meeting server 100, the terminal device 10, and the member terminal 20 included in the remote lecture system 1.
  • the remote lecture system 1 includes three functional elements of a first element EL1, a second element EL2, and a third element EL3.
  • the first element EL1 is an element corresponding to a lecturer (administrator), and its function is realized by the terminal device 10.
  • the second element EL2 is an element corresponding to the participant (member), and the function thereof is realized by the member terminal 20.
  • the third element EL3 is an element corresponding to an entity that provides various services to the lecturer and the participant, and a function thereof is realized by the remote meeting server 100.
  • the third element EL3 has three functions of an input/output control unit FC1, an information processing unit FC2, and a data holding unit FC3.
  • the information processing unit FC2 includes an image processing unit FC21 and a voice processing unit FC22.
  • the input/output control unit FC1 corresponds to the communication unit 110 of the remote meeting server 100.
  • the information processing unit FC2 corresponds to the control unit 130 of the remote meeting server 100.
  • the data holding unit FC3 corresponds to the storage unit 120 of the remote meeting server 100.
  • the image processing unit FC21 and the voice processing unit FC22 correspond to the processing unit 132 of the control unit 130.
  • the input/output control unit FC1 performs control for sending an image and audio input to the second element EL2 (system) and other information from a touch panel, a button, and the like to the information processing unit FC2, and performs control for presenting the information processed by the information processing unit FC2 to the system user.
  • the information processing unit FC2 uses information from the input/output control unit FC1 to create data to be held and create information to be output.
  • the image processing unit FC21 has a function of analyzing information such as lines of sight, face movements, face directions, and facial expressions of lecturers and participants.
  • the voice processing unit FC22 has a function of removing noise from an input voice and converting waveform information into text information.
  • image information and information from a touch panel, a button, and the like are used for switching a display method for a system user and for controlling input information.
  • the information processing unit FC2 creates a keyword and a summary of contents from the utterance information converted into text.
  • the information processing unit FC2 counts the number of times of the appeared keywords and performs classification based on the similarity of the summary contents, edits the keywords in an easy-to-understand form for the system user, and generates information to be output.
  • the information processing unit FC2 refers to the extracted keyword or summary content information and the information held in the data holding unit FC3, and creates information to be fed back to the system user.
  • the extracted keyword and summary content information are divided on the side of the system user as a lecturer and on the side of a participant, and the respective data are sent to the data holding unit FC3.
  • the information of the keyword or the summary content extracted from the participant side is sent to the data holding unit FC3 together with the time information of the current time.
  • the data holding unit FC3 holds information processed by the information processing unit FC2.
  • Data on the lecturer side and extraction information on the participant side are held as time series data.
  • the lecturer can correct the held data if there is an error in the summary content or the like through dialogue with the participants or a summary interview.
  • the member terminal 20 or the terminal device 10 functions as a device that transmits information such as an utterance or an image of the user to the remote meeting server 100 and outputs information received from the remote meeting server 100, that is, a so-called thin client.
  • the remote lecture system 1 has been described as an example of a configuration of a so-called centralized system such as a client server system that executes main processing on the server side. Note that the above is merely an example, and any mode can be adopted for division of functions in the remote lecture system 1.
  • the system configuration in which the remote meeting server 100 performs image processing and voice recognition has been described as an example, but the image processing and the voice recognition may be performed by each of the terminal device 10 and the member terminal 20.
  • the terminal device 10 may perform search processing of past comments, generation processing of assist information, and the like.
  • the terminal device 10 functions as an information processing apparatus that outputs the plural group list information displaying the group content information in association with the corresponding group together with the plurality of groups.
  • the terminal device 10 holds information stored in the storage unit 120, and has functions of the processing unit 132, the extraction unit 133, and the generation unit 134.
  • the terminal device 10 functions as a rich client that performs processing related to voice recognition, information search (extraction), generation, and output (display). Then, the remote meeting server 100 collects information from the terminal device 10 and each member terminal 20, and provides necessary information to the terminal device 10 and each member terminal 20.
  • the remote lecture system 1 may not include the remote meeting server.
  • the remote lecture system 1 may execute the main processing on the terminal (client) side of the user, for example; in that case, the server may manage only information regarding the remote meeting, or the system may have a configuration that does not include a server, that is, a configuration of a so-called autonomous distributed system.
  • the remote lecture system 1 may have either a centralized configuration or an autonomous distributed configuration.
  • the remote lecture system 1 may have any function division mode and any device configuration as long as it can provide the service related to the remote meeting described above.
  • each component of each device illustrated in the drawings is functionally conceptual, and is not necessarily physically configured as illustrated in the drawings. That is, a specific form of distribution and integration of each device is not limited to the illustrated form, and all or a part thereof can be functionally or physically distributed and integrated in an arbitrary unit according to various loads, usage conditions, and the like.
  • the information processing apparatus (for example, in the embodiment, the remote meeting server 100 or the terminal device 10) according to the present disclosure includes the acquisition unit (the acquisition unit 131 or the acquisition unit 181 in the embodiment) and the output unit (the transmission unit 135 or the display unit 15 in the embodiment).
  • the acquisition unit acquires group content information regarding the content of a conversation in at least one or more groups among a plurality of groups in which a plurality of users participating in a remote meeting are divided and can have a conversation in each group.
  • the output unit outputs the plural group list information displaying the group content information in association with the corresponding group together with the plurality of groups.
  • the information processing apparatus outputs the plural group list information that displays the group content information in association with the corresponding group together with the plurality of groups, so that the information about the plurality of groups can be confirmed efficiently (a minimal sketch of such plural group list information appears at the end of these notes).
  • the output unit outputs assist information for making a comment to at least one or more groups among the plurality of groups.
  • the information processing apparatus can appropriately assist the dialogue by the group by outputting the assist information for making a comment to the group.
  • the output unit outputs information regarding comments made in the past as the assist information.
  • the information processing apparatus can appropriately assist the dialogue by the group by outputting the information regarding the comment made in the past.
  • the information processing apparatus includes an extraction unit (the extraction unit 133 in the embodiment).
  • the extraction unit extracts a target comment from a history of comments made in the past.
  • the information processing apparatus can enable appropriate assistance based on the past comment by extracting the target comment from the history of the comments made in the past.
  • the extraction unit extracts the target comment corresponding to the target group on the basis of the information regarding the target group to be assisted.
  • the information processing apparatus can enable the assistance based on the appropriate past comment corresponding to the group by extracting the target comment corresponding to the target group to be assisted.
  • the extraction unit extracts the target comment by searching the history with the designated keyword.
  • the information processing apparatus can perform the assistance based on the appropriate past comment corresponding to the keyword by searching the history with the designated keyword and extracting the target comment.
  • the extraction unit extracts the target comment on the basis of the similarity between each comment in the history and the keyword (a similarity-search sketch appears at the end of these notes).
  • the information processing apparatus can extract the target comment on the basis of the similarity between each comment in the history and the keyword, thereby enabling the assistance based on the appropriate past comment corresponding to the keyword.
  • the output unit transmits the plural group list information to the terminal device.
  • by transmitting the plural group list information to the terminal device, the information processing apparatus enables the information on the plurality of groups to be checked efficiently.
  • the output unit transmits the plural group list information to the terminal device used by the administrator who manages the meeting.
  • by transmitting the plural group list information to the terminal device used by the administrator who manages the meeting, the information processing apparatus enables the administrator to check the information on the plurality of groups efficiently.
  • the terminal device (the terminal device 10 in the embodiment) according to the present disclosure includes the display unit (the display unit 15 in the embodiment).
  • the display unit displays, together with a plurality of groups in which a plurality of users participating in a remote meeting are divided and can have a conversation in each group, plural group list information that displays group content information regarding a content of a conversation in at least one or more groups among the plurality of groups in association with a corresponding group.
  • the terminal device displays the plural group list information that displays the group content information in association with the corresponding group together with the plurality of groups, so that the information about the plurality of groups can efficiently be confirmed.
  • the terminal device includes an input unit (the voice input unit 12 or the operation unit 16 in the embodiment).
  • the input unit receives an operation related to the group displayed in the plural group list information from the user who uses the terminal device.
  • by receiving an operation related to a group displayed in the plural group list information, the terminal device allows the information about the plurality of groups to be confirmed efficiently and improves the convenience of group-related operations for the user.
  • the display unit changes the display mode according to the operation of the user received by the input unit.
  • the terminal device can display appropriate information according to the operation by changing the display mode according to the operation of the user.
  • the display unit displays the assist information in a case where the input unit receives an operation of displaying the assist information to the user.
  • the terminal device can assist at an appropriate timing by displaying the assist information according to the operation of the user.
  • the display unit displays information regarding the past comment as the assist information.
  • the terminal device can appropriately assist the dialogue by the group by displaying the information regarding the comment made in the past.
  • the display unit changes the display mode of the group content information.
  • the terminal device can display appropriate information according to the operation by changing the display mode of the group content information according to the operation of the user.
  • the terminal device includes a processing unit (the processing unit 184 in the embodiment).
  • the processing unit executes processing according to the user's operation received by the input unit. As described above, by executing the processing according to the operation of the user, the terminal device can efficiently confirm the information on the plurality of groups, and can appropriately execute the processing according to the operation of the user.
  • the processing unit executes processing of transmitting the comment to the group corresponding to the operation.
  • the terminal device can transmit the comment to the group at an appropriate timing by executing the process of transmitting the comment to the group corresponding to the operation.
  • the processing unit executes processing of transmitting the comment to all of the plurality of groups.
  • by executing this processing, the terminal device can transmit the comment to all of the plurality of groups at once (a sketch of targeted and broadcast comment transmission appears at the end of these notes).
  • the information processing apparatus such as the remote meeting server 100 and the terminal device 10 according to each embodiment described above is realized by the computer 1000 having a configuration as illustrated in FIG. 26 , for example.
  • FIG. 26 is a hardware configuration diagram illustrating an example of the computer 1000 that implements the functions of the information processing apparatus.
  • the remote meeting server 100 according to the embodiment will be described as an example.
  • the computer 1000 includes a CPU 1100, a RAM 1200, a read only memory (ROM) 1300, a hard disk drive (HDD) 1400, a communication interface 1500, and an input/output interface 1600. Each unit of the computer 1000 is connected by a bus 1050.
  • the CPU 1100 operates on the basis of a program stored in the ROM 1300 or the HDD 1400, and controls each unit. For example, the CPU 1100 develops a program stored in the ROM 1300 or the HDD 1400 in the RAM 1200, and executes processing corresponding to various programs.
  • the ROM 1300 stores a boot program such as a basic input output system (BIOS) executed by the CPU 1100 when the computer 1000 is activated, a program depending on hardware of the computer 1000, and the like.
  • the HDD 1400 is a computer-readable recording medium that non-transiently records a program executed by the CPU 1100, data used by the program, and the like.
  • the HDD 1400 is a recording medium that records an information processing program according to the present disclosure, which is an example of program data 1450.
  • the communication interface 1500 is an interface for the computer 1000 to connect to an external network 1550 (for example, the Internet).
  • the CPU 1100 receives data from other devices or transmits data generated by the CPU 1100 to other devices via the communication interface 1500.
  • the input/output interface 1600 is an interface for connecting an input/output device 1650 and the computer 1000.
  • the CPU 1100 receives data from an input device such as a keyboard and a mouse via the input/output interface 1600.
  • the CPU 1100 transmits data to an output device such as a display, a speaker, or a printer via the input/output interface 1600.
  • the input/output interface 1600 may function as a media interface that reads a program or the like recorded in a predetermined recording medium (media).
  • the medium is, for example, an optical recording medium such as a digital versatile disc (DVD) or a phase change rewritable disk (PD), a magneto-optical recording medium such as a magneto-optical disk (MO), a tape medium such as a magnetic tape, a magnetic recording medium, a semiconductor memory, or the like.
  • the CPU 1100 of the computer 1000 implements the functions of the control unit 130 and the like by executing the information processing program loaded on the RAM 1200.
  • the HDD 1400 stores the information processing program according to the present disclosure and the data in the storage unit 120. Note that the CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, but as another example, these programs may be acquired from another device via the external network 1550.
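
The keyword extraction, appearance counting, similarity-based classification, and time-series holding described above for the information processing unit FC2 and the data holding unit FC3 could look roughly like the following Python sketch. It is a minimal illustration only: the function and class names, the stop-word list, and the use of a simple string-similarity ratio for clustering are assumptions for this example and are not specified in the present disclosure.

```python
# Minimal sketch, assuming utterances have already been converted to text.
# All names here (extract_keywords, DataHolder, etc.) are illustrative.
from collections import Counter
from dataclasses import dataclass, field
from datetime import datetime
from difflib import SequenceMatcher

STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "is", "are", "in"}

def extract_keywords(utterance_text: str) -> list[str]:
    """Very rough keyword extraction: lowercase tokens minus stop words."""
    tokens = [t.strip(".,!?").lower() for t in utterance_text.split()]
    return [t for t in tokens if t and t not in STOP_WORDS]

@dataclass
class HeldRecord:
    """One time-stamped record held for the lecturer or participant side."""
    timestamp: datetime
    side: str                      # "lecturer" or "participant"
    keywords: list[str]
    summary: str

@dataclass
class DataHolder:
    """Stand-in for the data holding unit: keeps records as time-series data."""
    records: list[HeldRecord] = field(default_factory=list)

    def add(self, side: str, keywords: list[str], summary: str) -> None:
        self.records.append(HeldRecord(datetime.now(), side, keywords, summary))

def count_keywords(records: list[HeldRecord]) -> Counter:
    """Counts how many times each keyword appeared across records."""
    counter = Counter()
    for record in records:
        counter.update(record.keywords)
    return counter

def group_by_summary_similarity(records, threshold=0.5):
    """Greedy clustering of records whose summaries are textually similar."""
    clusters: list[list[HeldRecord]] = []
    for record in records:
        for cluster in clusters:
            if SequenceMatcher(None, cluster[0].summary, record.summary).ratio() >= threshold:
                cluster.append(record)
                break
        else:
            clusters.append([record])
    return clusters

if __name__ == "__main__":
    holder = DataHolder()
    holder.add("participant", extract_keywords("What is the deadline for the report?"),
               "question about the report deadline")
    holder.add("participant", extract_keywords("Is the report due next week?"),
               "question about the report deadline")
    holder.add("lecturer", extract_keywords("The report is due on Friday."),
               "answer about the report deadline")
    print(count_keywords(holder.records).most_common(3))
    print(len(group_by_summary_similarity(holder.records)), "summary clusters")
```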
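
The plural group list information that associates each group with its group content information could be assembled and rendered along the lines of the following sketch, assuming the per-group keywords and summaries have already been collected. The data layout and the plain-text rendering are illustrative choices, not a display format defined by the embodiment.

```python
# Minimal sketch of assembling and rendering plural group list information.
# GroupContent, build_group_list, and render_group_list are illustrative names.
from dataclasses import dataclass

@dataclass
class GroupContent:
    group_id: str
    members: list[str]
    keywords: list[str]          # group content information: extracted keywords
    latest_summary: str          # group content information: short summary

def build_group_list(groups: list[GroupContent]) -> list[dict]:
    """Assembles every group together with the content information to be
    displayed in association with it."""
    return [
        {
            "group": g.group_id,
            "members": ", ".join(g.members),
            "keywords": ", ".join(g.keywords) or "(no comments yet)",
            "summary": g.latest_summary or "(no summary yet)",
        }
        for g in groups
    ]

def render_group_list(rows: list[dict]) -> str:
    """Renders the list as plain text, one line per group, for a simple display."""
    return "\n".join(
        f"[{r['group']}] {r['members']} | {r['keywords']} | {r['summary']}"
        for r in rows
    )

if __name__ == "__main__":
    groups = [
        GroupContent("G1", ["U11", "U12"], ["deadline", "report"],
                     "discussing the report deadline"),
        GroupContent("G2", ["U21", "U22"], [], ""),
    ]
    print(render_group_list(build_group_list(groups)))
```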
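
The extraction of a target comment from the history of past comments based on the similarity between each comment and a designated keyword could be realized, for example, with a simple token-based cosine similarity as sketched below. The metric and the top-k selection are assumptions made for illustration; the embodiment does not prescribe a particular similarity measure.

```python
# Minimal sketch of similarity-based search over a history of past comments.
# The history is assumed to be a plain list of comment strings.
import math
from collections import Counter

def tokenize(text: str) -> Counter:
    words = (t.strip(".,!?").lower() for t in text.split())
    return Counter(w for w in words if w)

def cosine_similarity(a: Counter, b: Counter) -> float:
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def extract_target_comments(history: list[str], keyword: str, top_k: int = 3) -> list[str]:
    """Returns the past comments most similar to the designated keyword."""
    query = tokenize(keyword)
    scored = sorted(history, key=lambda c: cosine_similarity(tokenize(c), query), reverse=True)
    return scored[:top_k]

if __name__ == "__main__":
    history = [
        "The deadline for the report is Friday.",
        "Please share the slides after the lecture.",
        "I think the report format should be PDF.",
    ]
    print(extract_target_comments(history, "report deadline", top_k=2))
```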
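
The processing of transmitting a comment either to the group corresponding to an operation or to all of the plurality of groups at once could be organized as in the following sketch, assuming a simple in-memory mapping from groups to member inboxes. The class and method names are illustrative and do not correspond to the processing unit's actual interface.

```python
# Minimal sketch of targeted versus broadcast comment transmission.
# CommentRouter, send_to_group, and send_to_all_groups are illustrative names.
from collections import defaultdict

class CommentRouter:
    def __init__(self, groups: dict[str, list[str]]):
        # groups: group_id -> list of member IDs
        self.groups = groups
        self.inboxes: dict[str, list[str]] = defaultdict(list)

    def send_to_group(self, group_id: str, comment: str) -> None:
        """Transmits the comment only to the members of the selected group."""
        for member in self.groups.get(group_id, []):
            self.inboxes[member].append(comment)

    def send_to_all_groups(self, comment: str) -> None:
        """Transmits the comment to every group at once (broadcast)."""
        for group_id in self.groups:
            self.send_to_group(group_id, comment)

if __name__ == "__main__":
    router = CommentRouter({"G1": ["U11", "U12"], "G2": ["U21"]})
    router.send_to_group("G1", "Please wrap up in five minutes.")
    router.send_to_all_groups("The session ends at 15:00.")
    print(dict(router.inboxes))
```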

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computational Linguistics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Information Transfer Between Computers (AREA)
EP22791366.2A 2021-04-22 2022-02-24 Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, endgerät und anzeigeverfahren Pending EP4329292A4 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021072658 2021-04-22
PCT/JP2022/007572 WO2022224584A1 (ja) 2021-04-22 2022-02-24 情報処理装置、情報処理方法、端末装置及び表示方法

Publications (2)

Publication Number Publication Date
EP4329292A1 true EP4329292A1 (de) 2024-02-28
EP4329292A4 EP4329292A4 (de) 2024-10-02

Family

ID=83722782

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22791366.2A Pending EP4329292A4 (de) 2021-04-22 2022-02-24 Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, endgerät und anzeigeverfahren

Country Status (2)

Country Link
EP (1) EP4329292A4 (de)
WO (1) WO2022224584A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024185096A1 (ja) * 2023-03-08 2024-09-12 日本電信電話株式会社 遠隔対話システム、方法およびプログラム

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6047903B2 (ja) * 2012-03-27 2016-12-21 富士通株式会社 グループ作業支援方法、グループ作業支援プログラム、グループ作業支援サーバ及びグループ作業支援システム
JP6028582B2 (ja) * 2013-01-16 2016-11-16 大日本印刷株式会社 サーバ装置、プログラム及び通信システム
JP2014191355A (ja) * 2013-03-26 2014-10-06 Oki Data Corp 文字入力装置及び文字入力方法
US20210076002A1 (en) * 2017-09-11 2021-03-11 Michael H Peters Enhanced video conference management
JP6783029B2 (ja) * 2018-03-22 2020-11-11 Kddi株式会社 研修におけるユーザ同士の議論内容を分析する装置、プログラム及び方法
US20200302817A1 (en) * 2019-03-21 2020-09-24 Foundry College, Inc. Online classroom system and method for conducting breakout groups
EP4322090A4 (de) * 2021-04-06 2024-09-18 Sony Group Corp Informationsverarbeitungsvorrichtung und informationsverarbeitungsverfahren

Also Published As

Publication number Publication date
EP4329292A4 (de) 2024-10-02
WO2022224584A1 (ja) 2022-10-27

Similar Documents

Publication Publication Date Title
US20220382989A1 (en) Multimodal Entity and Coreference Resolution for Assistant Systems
JP6939037B2 (ja) 会議コンテンツを表現する方法、プログラム、及び装置
US10891436B2 (en) Device and method for voice-driven ideation session management
US10257241B2 (en) Multimodal stream processing-based cognitive collaboration system
US8553065B2 (en) System and method for providing augmented data in a network environment
US11114095B2 (en) Information processing device
US20030187632A1 (en) Multimedia conferencing system
US11457061B2 (en) Creating a cinematic storytelling experience using network-addressable devices
WO2020139121A1 (en) Systems and methods for recognizing a speech of a speaker
US20210034663A1 (en) Systems and methods for managing voice queries using pronunciation information
US20200403816A1 (en) Utilizing volume-based speaker attribution to associate meeting attendees with digital meeting content
WO2020186701A1 (zh) 用户位置的查找方法、装置、设备及介质
US11863711B2 (en) Speaker segment analysis for conferences
US20190341059A1 (en) Automatically identifying speakers in real-time through media processing with dialog understanding supported by ai techniques
US20240171418A1 (en) Information processing device and information processing method
US20210034662A1 (en) Systems and methods for managing voice queries using pronunciation information
EP4329292A1 (de) Informationsverarbeitungsvorrichtung, informationsverarbeitungsverfahren, endgerät und anzeigeverfahren
KR102135077B1 (ko) 인공지능 스피커를 이용한 실시간 이야깃거리 제공 시스템
EP4331188A1 (de) Automatisierte aufnahmehervorhebungen für konferenzen
CN113111658B (zh) 校验信息的方法、装置、设备和存储介质
JP7196393B2 (ja) 情報提示装置、情報提示システム、情報提示方法およびプログラム
JP6709709B2 (ja) 情報処理装置、情報処理システム、情報処理方法及びプログラム
US11430429B2 (en) Information processing apparatus and information processing method
WO2021092415A1 (en) Systems and methods for automating voice commands
US20230069287A1 (en) Server device, conference assistance system, conference assistance method, and non-transitory computer readable storage medium

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231122

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20240903

RIC1 Information provided on ipc code assigned before grant

Ipc: G06Q 10/10 20230101ALI20240828BHEP

Ipc: H04N 21/488 20110101ALI20240828BHEP

Ipc: G10L 15/00 20130101ALI20240828BHEP

Ipc: G06F 16/33 20190101ALI20240828BHEP

Ipc: H04N 7/15 20060101AFI20240828BHEP