WO2023182230A1 - Information processing device and program - Google Patents

Information processing device and program

Info

Publication number
WO2023182230A1
Authority
WO
WIPO (PCT)
Prior art keywords
image signal
user terminals
information processing
processing device
unit
Prior art date
Application number
PCT/JP2023/010696
Other languages
French (fr)
Japanese (ja)
Inventor
Masayuki Numao (沼尾 正行)
Original Assignee
Osaka University (国立大学法人大阪大学)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Osaka University (国立大学法人大阪大学)
Priority to JP2024510138A (patent JP7549936B2)
Publication of WO2023182230A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 3/00 Automatic or semi-automatic exchanges
    • H04M 3/42 Systems providing special services or facilities to subscribers
    • H04M 3/56 Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The objective of the present invention is to provide an information processing device and a program capable of easily establishing connections between a plurality of terminals. An information processing device (1) is connected to a plurality of user terminals (201, 202, 203, 204) used by users and causes an image signal to be shared among the plurality of user terminals (201, 202, 203, 204). The information processing device (1) comprises: an image signal acquisition unit (11) that acquires the image signals output from each of the plurality of user terminals (201, 202, 203, 204); an identification unit (12) that identifies image signals that have a correspondence relationship with the acquired content; a combining unit (13) that establishes a connection between the user terminals (201, 202, 203, 204) outputting the identified image signals and combines the image signals output from each user terminal (201, 202, 203, 204); and an output unit (14) that outputs the combined image signal to each user terminal (201, 202, 203, 204).

Description

Information processing device and program
The present invention relates to an information processing device and a program.
Conventionally, web conference systems are known in which a predetermined server connects a plurality of terminals whose access it has accepted. The server mediates the exchange of image signals (video) and audio signals between the terminals whose access it has accepted, thereby enabling those terminals to communicate with one another. As such a web conference system, a system has been proposed in which participants in a meeting can talk to one another directly (direct talk) (see, for example, Patent Document 1).
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2021-184189 (JP 2021-184189 A)
In a web conference system, as described above, the server mediates between the terminals that access it. Various web conference systems exist today, but because each system uses a different platform, the server to connect to differs for each web conference system a participant joins. Terminals connected to different web conference systems therefore cannot hold a web conference with one another as they are. It would be desirable to be able to establish connections easily between a plurality of terminals.
The present invention has been made in view of these points, and an object of the present invention is to provide an information processing device and a program that can easily establish connections between a plurality of terminals.
The present invention relates to an information processing device that is connected to a plurality of terminals and causes an image signal to be shared among the plurality of terminals, the information processing device comprising: an image signal acquisition unit that acquires image signals output from each of the plurality of terminals; an identification unit that identifies image signals that have a correspondence relationship with the acquired content; a combining unit that establishes a connection between the terminals outputting the identified image signals and combines the image signals output from each of those terminals; and an output unit that outputs the combined image signal to each of the terminals.
Preferably, the identification unit identifies, as the image signals having the correspondence relationship, acquired image signals that include an image showing a predetermined relationship.
Preferably, the identification unit identifies, as the image signals having the correspondence relationship, acquired image signals that include a condition indicating a predetermined relationship.
Preferably, the combining unit performs the combining by superimposing the image signal acquired from the other terminal on the image signal acquired from the one terminal.
The present invention also relates to a program that causes a computer to function as an information processing device that is connected to a plurality of terminals and causes an image signal to be shared among the plurality of terminals, the program causing the computer to function as: an image signal acquisition unit that acquires image signals output from each of the plurality of terminals; an identification unit that identifies image signals that have a correspondence relationship with the acquired content; a combining unit that establishes a connection between the terminals outputting the identified image signals and combines the image signals output from each of those terminals; and an output unit that outputs the combined image signal to each of the terminals.
The present invention can provide an information processing device and a program that can easily establish connections between a plurality of terminals.
FIG. 1 is a configuration diagram showing an overview of an information processing system including an information processing device according to an embodiment of the present invention. FIG. 2 is a block diagram showing the configuration of the information processing device according to the embodiment. FIG. 3 is a screen diagram output by the information processing device of the embodiment. FIG. 4 is a flowchart showing the flow of operations of the information processing device according to the embodiment.
Hereinafter, an information processing device 1 and a program according to an embodiment of the present invention will be described with reference to FIGS. 1 to 4. First, an overview of the information processing device 1 will be explained.
The information processing device 1 is, for example, an electronic computer such as a personal computer or a tablet. The information processing device 1 is, for example, a device that combines and outputs the video output (hereinafter also referred to as an image signal) and the audio output (hereinafter also referred to as an audio signal) exchanged between the user terminals 201, 202, 203, and 204 of a plurality of users participating in a web conference. The information processing device 1 is, for example, a device that combines the video output and audio output of user terminals 201, 202, 203, and 204 participating in different web conferences.
Specifically, among the image signals in a web conference, the information processing device 1 combines the image signals of the user terminals 201, 202, 203, and 204 that satisfy a predetermined condition. The information processing device 1 likewise combines the audio signals of the user terminals 201, 202, 203, and 204 that satisfy the predetermined condition. The information processing device 1 then delivers the combined image signal and audio signal to the user terminals 201, 202, 203, and 204 from which the signals were obtained. In this way, the information processing device 1 aims to make it possible to connect user terminals participating in different web conference systems.
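To make the later description easier to follow, here is a minimal sketch, in Python, of the kind of per-terminal data the information processing device 1 works with; the class and field names are illustrative assumptions, not terminology from the patent.

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class TerminalStream:
    """One user terminal's output as seen by the information processing device 1."""
    terminal_id: str              # e.g. "201"
    server_id: str                # conference server the terminal is connected to, e.g. "301"
    frame: np.ndarray             # latest video frame (H x W x 3), i.e. the image signal
    audio_chunk: np.ndarray       # latest block of audio samples, i.e. the audio signal
    room_label: Optional[str] = None   # room name inferred from the frame, if any
```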
Next, an information processing system including the information processing device 1 according to the embodiment will be described with reference to FIG. 1. As shown in FIG. 1, the information processing system includes, for example, the user terminals 201, 202, 203, and 204, the conference servers 301 and 302, and the information processing device 1.
The user terminals 201, 202, 203, and 204 are, for example, electronic computers such as personal computers or tablets. A user terminal 201, 202, 203, or 204 is provided for each user who connects to a web conference. The user terminals 201, 202, 203, and 204 are devices that can output image signals and audio signals. The user terminals 201, 202, 203, and 204 output, for example, image signals including the user's avatar or face image. The user terminals 201, 202, 203, and 204 also output, for example, audio signals including the user's voice captured by the user terminals 201, 202, 203, and 204.
The conference servers 301 and 302 are, for example, servers. The conference servers 301 and 302 provide platforms for conducting web conferences between the user terminals 201, 202, 203, and 204. The conference servers 301 and 302 establish connections with, for example, the user terminals 201, 202, 203, and 204 that have requested connections via the network N. The conference servers 301 and 302 then provide the image signals and audio signals output from each connected user terminal 201, 202, 203, 204 to the other connected user terminals 201, 202, 203, and 204. In this way, the conference servers 301 and 302 realize a web conference between the user terminals 201, 202, 203, and 204 with which connections have been established.
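The conference servers themselves are outside the claimed device, but their relay behaviour can be illustrated with a short sketch; the callback-based interface below is an assumption made only for illustration.

```python
from typing import Callable, Dict

import numpy as np

Frame = np.ndarray
FrameHandler = Callable[[str, Frame], None]   # (sender terminal id, frame) -> None


class ToyConferenceServer:
    """Fans each connected terminal's image signal out to every other connected terminal."""

    def __init__(self, server_id: str) -> None:
        self.server_id = server_id
        self.handlers: Dict[str, FrameHandler] = {}

    def connect(self, terminal_id: str, on_frame: FrameHandler) -> None:
        # Establish a connection for a terminal that requested one over the network N.
        self.handlers[terminal_id] = on_frame

    def publish(self, sender_id: str, frame: Frame) -> None:
        # Provide the sender's image signal to all other connected terminals.
        for terminal_id, on_frame in self.handlers.items():
            if terminal_id != sender_id:
                on_frame(sender_id, frame)
```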
Next, the information processing device 1 according to the embodiment will be described with reference to FIGS. 2 to 4. The information processing device 1 is connected to a plurality of user terminals 201, 202, 203, and 204 used by users and causes an image signal to be shared among the plurality of user terminals 201, 202, 203, and 204. The information processing device 1 makes it possible, for example, to establish communication between user terminals 201, 202, 203, and 204 connected to different web conference systems. For example, as shown in FIG. 2, the information processing device 1 establishes communication between the user terminals 201 and 202 connected to one conference server 301 and the user terminals 203 and 204 connected to another conference server 302. The information processing device 1 is connected, for example, to each of the conference servers 301 and 302. The information processing device 1 is connected to the conference servers 301 and 302, for example, as a host that hosts a web conference. The information processing device 1 includes an image signal acquisition unit 11, an identification unit 12, a combining unit 13, and an output unit 14.
In this embodiment, the information processing device 1, as the host of the web conferences, creates a meeting room for holding a web conference on each of the conference servers 301 and 302. The information processing device 1 transmits the URLs for joining the created web conferences to the user terminals 201, 202, 203, and 204. In this way, when a user terminal 201, 202, 203, or 204 has trouble connecting to one of the conference servers 301 and 302, the information processing device 1 can, for example, enable it to connect to the other conference server. Together with the URLs, the information processing device 1 may also distribute to the user terminals 201, 202, 203, and 204 the room name of a breakout room and a code (such as a QR code (registered trademark)) for entering that breakout room.
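As an illustration of the kind of invitation the host could distribute, the sketch below pairs a join URL with a breakout-room name and a QR code image; it assumes the third-party `qrcode` package, and the URL format and `#room=` fragment are invented for the example.

```python
import qrcode  # third-party package: pip install "qrcode[pil]"


def build_invitation(join_url: str, breakout_room: str, qr_path: str) -> dict:
    """Bundle a web conference join URL with a breakout-room name and its QR code."""
    payload = f"{join_url}#room={breakout_room}"  # hypothetical way to carry the room name
    qrcode.make(payload).save(qr_path)            # write the QR code as an image file
    return {"url": join_url, "room": breakout_room, "qr_image": qr_path}


# The same breakout-room label can be sent to terminals on both conference servers,
# e.g. 201/202 via server 301 and 203/204 via server 302.
invitation = build_invitation("https://conf-301.example/join/abc", "room-A", "room_a.png")
```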
The image signal acquisition unit 11 is realized, for example, by the operation of a CPU. The image signal acquisition unit 11 acquires the image signals output from each of the plurality of user terminals 201, 202, 203, and 204. The image signal acquisition unit 11 acquires, for example, the images displayed on the user terminals 201, 202, 203, and 204 of the users participating in the web conferences as image signals. The image signal acquisition unit 11 also acquires, for example, the audio input to the user terminals 201, 202, 203, and 204 of the users participating in the web conferences as audio signals. The image signal acquisition unit 11 acquires the image signals and audio signals from the user terminals 201, 202, 203, and 204 of the participating users, for example, in its role as the host of the web conference run on each of the conference servers 301 and 302. The image signal acquisition unit 11 acquires image signals from a plurality of user terminals 201, 202, 203, 204 connected to given conference servers 301, 302, at least one of which terminals is connected to a different conference server 301, 302 from the others. In this embodiment, the image signal acquisition unit 11 acquires image signals from the user terminals 201 and 202 connected to the conference server 301 and from the user terminals 203 and 204 connected to the conference server 302.
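A minimal sketch of how the image signal acquisition unit 11 could be organised, assuming hypothetical conference-client objects (one per conference server, joined as the host) that expose each participant's current frame and audio; the patent does not prescribe any particular client API.

```python
from typing import Dict, Iterable, Tuple

import numpy as np

Signals = Dict[str, Tuple[np.ndarray, np.ndarray]]  # terminal id -> (frame, audio chunk)


class ImageSignalAcquisitionUnit:
    """Collects the latest image and audio signals from terminals on several servers."""

    def __init__(self, host_clients: Iterable) -> None:
        # host_clients: assumed client objects, one per conference server (301, 302, ...),
        # each participating in its meeting as the host.
        self.host_clients = list(host_clients)

    def acquire(self) -> Signals:
        signals: Signals = {}
        for client in self.host_clients:
            # participants() is an assumed method yielding (terminal_id, frame, audio).
            for terminal_id, frame, audio in client.participants():
                signals[terminal_id] = (frame, audio)
        return signals
```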
The identification unit 12 is realized, for example, by the operation of a CPU. The identification unit 12 identifies, among the acquired image signals, image signals that have a correspondence relationship. The identification unit 12 identifies, for example, acquired image signals that include an image showing a predetermined relationship as image signals having the correspondence relationship. The identification unit 12 also identifies acquired image signals that include a condition indicating a predetermined relationship as image signals having the correspondence relationship.
For example, in an image signal acquired from one user terminal 201, 202, 203, 204, the identification unit 12 identifies, as the predetermined relationship, the movement of the avatar representing the user to a predetermined position on the screen at which the user can interact with a specific user. In the image signals acquired from the other user terminals 201, 202, 203, and 204, the identification unit 12 identifies, for example, entry into a predetermined room (breakout room) as the predetermined condition. That is, in the image signals acquired from the other user terminals 201, 202, 203, and 204, the identification unit 12 identifies a user action that indicates a desire to interact with a limited set of other users and that has been associated in advance. Specifically, in the image signal of one user terminal 201, 202, 203, 204, the identification unit 12 identifies movement of the avatar to a position on the image indicating a particular room name. In the image signals of the other user terminals 201, 202, 203, and 204, the identification unit 12 identifies a change in the image indicating entry into a room with the same room name. The identification unit 12 identifies these movements and entries (image transitions) as image signals having the predetermined correspondence relationship.
In this embodiment, for the image signals acquired from the user terminals 201 and 202, the identification unit 12 identifies, for example, the movement of the user's avatar to the position at which a breakout conference is started. For the image signals acquired from the user terminals 203 and 204, the identification unit 12 identifies, for example, an image transition indicating entry into a breakout room. Among the image signals acquired from the user terminals 201, 202, 203, and 204, the identification unit 12 identifies those image signals whose correspondence relationship can be recognized as entry into the same room.
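One way the identification could be organised is sketched below: an image-analysis routine (assumed here, not specified by the patent) maps each acquired frame to a room name when it detects the avatar on a room label or a breakout-room view, and terminals that map to the same room are treated as having the correspondence relationship.

```python
from collections import defaultdict
from typing import Callable, Dict, List, Optional

import numpy as np

RoomDetector = Callable[[np.ndarray], Optional[str]]  # frame -> room name, or None


class IdentificationUnit:
    """Groups terminals whose image signals indicate entry into the same room."""

    def __init__(self, infer_room: RoomDetector) -> None:
        # infer_room is an assumed helper: it returns the room name when the frame shows
        # the avatar at a room-name position (or a breakout-room screen), otherwise None.
        self.infer_room = infer_room

    def identify(self, frames: Dict[str, np.ndarray]) -> List[List[str]]:
        rooms: Dict[str, List[str]] = defaultdict(list)
        for terminal_id, frame in frames.items():
            room = self.infer_room(frame)
            if room is not None:
                rooms[room].append(terminal_id)
        # Only groups of two or more terminals form a correspondence relationship.
        return [ids for ids in rooms.values() if len(ids) >= 2]
```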
The combining unit 13 is realized, for example, by the operation of a CPU. The combining unit 13 establishes a connection between the user terminals 201, 202, 203, and 204 that are outputting the identified image signals. The combining unit 13 establishes this connection, for example, by outputting the audio signals acquired from the user terminals 201, 202, 203, and 204 that output the identified image signals to one another among those user terminals 201, 202, 203, and 204.
The combining unit 13 also combines the image signals output from the respective user terminals 201, 202, 203, and 204. The combining unit 13 combines the image signals by creating a single composite image that includes the image signal acquired from one user terminal 201, 202, 203, 204 and the image signals acquired from the other user terminals 201, 202, 203, 204. For example, the combining unit 13 places the acquired image signals in one window and combines them into a screen to be superimposed on the screen (the web conference screen) displayed on each of the user terminals 201, 202, 203, and 204. For example, as shown in FIG. 3, the combining unit 13 combines the image signals for display using a PIP (picture-in-picture) function. Specifically, the combining unit 13 generates an image signal so as to create a composite image 22 superimposed on the display image 21 shown on the user terminals 201, 202, 203, and 204. The combining unit 13 may also combine the images for display using, for example, a PBP (picture-by-picture) function.
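The picture-in-picture idea can be illustrated with a small NumPy sketch that pastes a scaled-down inset (the composite image 22) onto the base frame (the display image 21); this is only an illustration of the superimposition, not the patent's actual rendering code.

```python
import numpy as np


def composite_pip(base: np.ndarray, inset: np.ndarray, scale: float = 0.3) -> np.ndarray:
    """Superimpose a scaled-down copy of `inset` on the bottom-right corner of `base`."""
    out = base.copy()
    h, w = base.shape[:2]
    ih, iw = max(1, int(h * scale)), max(1, int(w * scale))
    # Nearest-neighbour resize by index sampling (keeps the sketch dependency-free).
    rows = np.arange(ih) * inset.shape[0] // ih
    cols = np.arange(iw) * inset.shape[1] // iw
    out[h - ih:h, w - iw:w] = inset[rows][:, cols]
    return out


# Example: overlay terminal 203's frame on terminal 201's web conference screen.
screen_201 = np.zeros((720, 1280, 3), dtype=np.uint8)
frame_203 = np.full((360, 640, 3), 255, dtype=np.uint8)
combined = composite_pip(screen_201, frame_203)
```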
The output unit 14 is realized, for example, by the operation of a CPU. The output unit 14 outputs the combined image signal to each of the user terminals 201, 202, 203, and 204. The output unit 14 outputs the combined image signal, for example, to the user terminals 201, 202, 203, and 204 that output the image signals identified as having the correspondence relationship. The output unit 14 outputs, for example, a composite image that can be displayed on the user terminals 201, 202, 203, and 204 using the PIP function or the PBP function.
Next, the operation of the information processing device 1 will be described with reference to the flowchart of FIG. 4. First, the image signal acquisition unit 11 acquires the image signals and audio signals from the user terminals 201, 202, 203, and 204 of the users participating in the web conferences (step S1). Next, the identification unit 12 identifies image signals having the predetermined correspondence relationship (steps S2 and S3).
If image signals having the predetermined correspondence relationship exist (step S3: YES), the process proceeds to step S4. If they do not (step S3: NO), the process proceeds to step S6.
In step S4, the combining unit 13 combines the corresponding image signals to create a composite image. The output unit 14 then outputs the created composite image to the user terminals 201, 202, 203, and 204 that output the corresponding image signals (step S5). The process then proceeds to step S6.
In step S6, it is determined whether the connection to the web conference has ended. If the connection to the web conference has ended (step S6: YES), the processing of this flow ends. If the connection to the web conference continues (step S6: NO), the process returns to step S1.
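Tying the units together, the flow of FIG. 4 (steps S1 to S6) could look roughly like the loop below; the callables stand in for the units described above and are passed in so the sketch stays self-contained.

```python
from typing import Callable, Dict, List, Tuple

import numpy as np

Signals = Dict[str, Tuple[np.ndarray, np.ndarray]]  # terminal id -> (frame, audio chunk)


def conference_loop(
    acquire: Callable[[], Signals],                       # step S1
    identify: Callable[[Signals], List[List[str]]],       # steps S2 and S3
    composite: Callable[[List[np.ndarray]], np.ndarray],  # step S4
    deliver: Callable[[str, np.ndarray], None],           # step S5
    conference_ended: Callable[[], bool],                 # step S6
) -> None:
    """One possible realisation of the flow of FIG. 4."""
    while not conference_ended():                # S6: stop when the web conference ends
        signals = acquire()                      # S1: acquire image and audio signals
        for group in identify(signals):          # S2/S3: corresponding image signals found?
            frames = [signals[t][0] for t in group]
            combined = composite(frames)         # S4: create the composite image
            for terminal_id in group:
                deliver(terminal_id, combined)   # S5: output to the source terminals
```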
Next, the program of this embodiment will be described. Each component included in the information processing device 1 can be realized by hardware, by software, or by a combination thereof. Here, being realized by software means being realized by a computer reading and executing a program. The program can be stored and supplied to a computer using various types of non-transitory computer-readable media. Non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic recording media (for example, flexible disks, magnetic tape, and hard disk drives), magneto-optical recording media (for example, magneto-optical disks), CD-ROM (read-only memory), CD-R, CD-R/W, and semiconductor memory (for example, mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, and RAM (random access memory)). The program may also be supplied to the computer via various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. A transitory computer-readable medium can supply the program to the computer via a wired communication path, such as an electric wire or an optical fiber, or via a wireless communication path.
According to the information processing device 1 and the program of the first embodiment described above, the following effects are obtained.
(1) The information processing device 1 is connected to a plurality of user terminals 201, 202, 203, and 204 used by users and causes an image signal to be shared among the plurality of user terminals 201, 202, 203, and 204, and includes: the image signal acquisition unit 11 that acquires the image signals output from each of the plurality of user terminals 201, 202, 203, and 204; the identification unit 12 that identifies image signals that have a correspondence relationship with the acquired content; the combining unit 13 that establishes a connection between the user terminals 201, 202, 203, and 204 outputting the identified image signals and combines the image signals output from each of the user terminals 201, 202, 203, and 204; and the output unit 14 that outputs the combined image signal to each of the user terminals 201, 202, 203, and 204. This makes it possible to easily establish connections between a plurality of terminals.
(2) The identification unit 12 identifies acquired image signals that include an image showing a predetermined relationship as image signals having the correspondence relationship. This makes it possible to easily identify the user terminals 201, 202, 203, and 204 that satisfy the predetermined relationship.
(3) The identification unit 12 identifies acquired image signals that include a condition indicating a predetermined relationship as image signals having the correspondence relationship. This makes it possible to easily identify the user terminals 201, 202, 203, and 204 that satisfy the condition indicating the predetermined relationship.
(4) The combining unit 13 performs the combining by superimposing the image signal acquired from the other user terminal 201, 202, 203, 204 on the image signal acquired from the one user terminal 201, 202, 203, 204. This makes it easy to create a composite image in which the image signals output from the identified user terminals 201, 202, 203, and 204 are brought together into one.
Although preferred embodiments of the information processing device 1 and the program of the present invention have been described above, the present invention is not limited to the embodiments described above and can be modified as appropriate. For example, although the above embodiment has been described using a web conference as an example, the present invention is not limited to this. For example, the system may mediate between different platforms on which conversations take place in real time (for example, web conferences, in-game chat, and video streaming). The information processing device 1 may also be used, for example, for a gathering of customers across a plurality of online shop sites, an online social gathering, or the like.
In the above embodiment, the number of user terminals 201, 202, 203, 204 and the number of conference servers 301, 302 are not limited. The number of user terminals 201, 202, 203, 204 and the number of conference servers 301, 302 may be determined arbitrarily.
In the above embodiment, the combining unit 13 combines all of the plurality of acquired image signals, but the present invention is not limited to this. The combining unit 13 may create a composite image that excludes the image signal acquired from the destination user terminal 201, 202, 203, 204. This prevents the on-screen size of each image signal included in the composite image displayed in the window from becoming too small.
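A sketch of this per-destination variant: for every destination terminal, the composite is built only from the other terminals' frames, so the destination's own image is left out.

```python
from typing import Dict, List

import numpy as np


def frames_excluding_self(frames: Dict[str, np.ndarray]) -> Dict[str, List[np.ndarray]]:
    """For each destination terminal, collect every other terminal's frame for compositing."""
    return {
        dest: [frame for src, frame in frames.items() if src != dest]
        for dest in frames
    }
```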
In the above embodiment, the image signal acquisition unit 11 may acquire the image signals of the user terminals 201, 202, 203, and 204 via the conference servers 301 and 302. For example, the information processing device 1 may set up a room as the host of each web conference. The image signal acquisition unit 11 may then acquire the image signals provided by the conference servers 301 and 302 as the image signals output from each of the user terminals 201, 202, 203, and 204.
 1 Information processing device
 11 Image signal acquisition unit
 12 Identification unit
 13 Combining unit
 14 Output unit
 201, 202, 203, 204 User terminal
 301, 302 Conference server

Claims (5)

  1.  An information processing device that is connected to a plurality of user terminals used by users and that causes an image signal to be shared among the plurality of user terminals, the information processing device comprising:
     an image signal acquisition unit that acquires image signals output from each of the plurality of user terminals;
     an identification unit that identifies the image signals that have a correspondence relationship with the acquired content;
     a combining unit that establishes a connection between the user terminals outputting the identified image signals and combines the image signals output from each of the user terminals; and
     an output unit that outputs the combined image signal to each of the user terminals.
  2.  The information processing device according to claim 1, wherein the identification unit identifies acquired image signals that include an image showing a predetermined relationship as the image signals having the correspondence relationship.
  3.  The information processing device according to claim 1 or 2, wherein the identification unit identifies acquired image signals that include a condition indicating a predetermined relationship as the image signals having the correspondence relationship.
  4.  The information processing device according to any one of claims 1 to 3, wherein the combining unit performs the combining by superimposing the image signal acquired from the other user terminal on the image signal acquired from the one user terminal.
  5.  A program that causes a computer to function as an information processing device that is connected to a plurality of user terminals used by users and that causes an image signal to be shared among the plurality of user terminals, the program causing the computer to function as:
     an image signal acquisition unit that acquires image signals output from each of the plurality of user terminals;
     an identification unit that identifies the image signals that have a correspondence relationship with the acquired content;
     a combining unit that establishes a connection between the user terminals outputting the identified image signals and combines the image signals output from each of the user terminals; and
     an output unit that outputs the combined image signal to each of the user terminals.

PCT/JP2023/010696 2022-03-23 2023-03-17 Information processing device and program WO2023182230A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2024510138A JP7549936B2 (en) 2022-03-23 2023-03-17 Information processing device and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022047275 2022-03-23
JP2022-047275 2022-03-23

Publications (1)

Publication Number Publication Date
WO2023182230A1 true WO2023182230A1 (en) 2023-09-28

Family

ID=88101034

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/010696 WO2023182230A1 (en) 2022-03-23 2023-03-17 Information processing device and program

Country Status (2)

Country Link
JP (1) JP7549936B2 (en)
WO (1) WO2023182230A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006201709A (en) * 2005-01-24 2006-08-03 Toshiba Corp Video display device, composite video delivery system, program, system, and method
JP2009033255A (en) * 2007-07-24 2009-02-12 Ntt Docomo Inc Control device, mobile communication system and communication terminal
JP2020135556A (en) * 2019-02-21 2020-08-31 沖電気工業株式会社 Processing device, program, and processing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014236472A (en) 2013-06-05 2014-12-15 日本電信電話株式会社 Gateway device, communication system, communication method, and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006201709A (en) * 2005-01-24 2006-08-03 Toshiba Corp Video display device, composite video delivery system, program, system, and method
JP2009033255A (en) * 2007-07-24 2009-02-12 Ntt Docomo Inc Control device, mobile communication system and communication terminal
JP2020135556A (en) * 2019-02-21 2020-08-31 沖電気工業株式会社 Processing device, program, and processing method

Also Published As

Publication number Publication date
JPWO2023182230A1 (en) 2023-09-28
JP7549936B2 (en) 2024-09-12

Similar Documents

Publication Publication Date Title
US9473741B2 (en) Teleconference system and teleconference terminal
JPH04351188A (en) Remote conference system
KR101915786B1 (en) Service System and Method for Connect to Inserting Broadcasting Program Using an Avata
US20170048284A1 (en) Non-transitory computer readable medium, information processing apparatus, and information processing system
US7949116B2 (en) Primary data stream communication
JP7453576B2 (en) Information processing system, its control method and program.
CN111163280B (en) Asymmetric video conference system and method thereof
JP2010157906A (en) Video display device
JP3610423B2 (en) Video display system and method for improving its presence
WO2023182230A1 (en) Information processing device and program
US9131109B2 (en) Information processing device, display control system, and computer program product
KR101887380B1 (en) Apparatus and method for transmitting and processing image filmed using a plurality of camera
US20230138733A1 (en) Representation of natural eye contact within a video conferencing session
JP2019117997A (en) Web conference system, control method of web conference system, and program
JP2022032812A (en) Information processing device and program
JP2011066745A (en) Terminal apparatus, communication method and communication system
JP3031320B2 (en) Video conferencing equipment
US20080043962A1 (en) Methods, systems, and computer program products for implementing enhanced conferencing services
KR20220077781A (en) Method And System for Transmitting and Receiving Multi-Channel Media
JP5234850B2 (en) Projector system, projector, and data receiving method
JPH11289524A (en) Virtual space conference method and record medium recording the method
KR20180105594A (en) Multi-point connection control apparatus and method for video conference service
JP7198952B1 (en) Insurance consultation system, solicitor terminal, and insurance consultation program
JP2012182524A (en) Communication apparatus
JP7124483B2 (en) Communication terminal, data transmission method, program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23774825

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024510138

Country of ref document: JP