WO2024116763A1 - Management device - Google Patents

Management device

Info

Publication number
WO2024116763A1
Authority
WO
WIPO (PCT)
Prior art keywords
conference
image
management device
virtual space
control unit
Prior art date
Application number
PCT/JP2023/040286
Other languages
French (fr)
Japanese (ja)
Inventor
祐貴 田中
圭一 村上
Original Assignee
NTT DOCOMO, INC.
Priority date
Filing date
Publication date
Application filed by NTT DOCOMO, INC.
Publication of WO2024116763A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/131 Protocols for games, networked simulations or virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems

Definitions

  • the present invention relates to a management device.
  • Patent Document 1 discloses a system in which multiple avatars, which correspond one-to-one with multiple users, can participate in a web conference in a virtual space.
  • the state of the web conference is distributed to the user terminals used by each of the multiple users.
  • the present disclosure therefore aims to provide a management device that allows users of a web conference in real space and users of a web conference in virtual space to both participate in the web conference.
  • a management device is a management device that communicates with an external device managing a first web conference held in a real space, and that manages a second web conference held in a virtual space. The management device includes: a first generation unit that generates first image information showing a two-dimensional image obtained by capturing an image of a conference room in the virtual space with a first virtual camera; a second generation unit that generates second image information showing a three-dimensional image obtained by capturing an image of the conference room in the virtual space with a second virtual camera; and a communication unit that transmits the first image information to the external device and transmits the second image information to one or more terminal devices participating in the second web conference. A virtual object showing a two-dimensional image related to the first web conference, output from the external device, is placed in the conference room in the virtual space.
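The claimed arrangement can be illustrated as a minimal data-flow model. This is a sketch only; the class, method, and payload names below (`ManagementDevice`, `first_generate`, and so on) are illustrative assumptions, not terms from the application.

```python
from dataclasses import dataclass, field

@dataclass
class ManagementDevice:
    """Toy model of the claimed device (names are illustrative)."""
    sent: list = field(default_factory=list)  # records (destination, payload)

    def first_generate(self, conference_room):
        # First generation unit: a 2-D capture of the virtual conference room.
        return {"type": "2d", "scene": conference_room}

    def second_generate(self, conference_room):
        # Second generation unit: a 3-D (stereoscopic) capture of the same room.
        return {"type": "3d", "scene": conference_room}

    def communicate(self, first_info, second_info, terminals):
        # Communication unit: the first image information goes to the external
        # device managing the real-space conference; the second goes to the
        # terminal devices participating in the virtual-space conference.
        self.sent.append(("external_device", first_info))
        for t in terminals:
            self.sent.append((t, second_info))

room = {"objects": ["table", "chairs", "panel"]}
md = ManagementDevice()
md.communicate(md.first_generate(room), md.second_generate(room),
               ["terminal 30-3", "terminal 30-4"])
```

The model makes the asymmetry of the claim explicit: one scene, two renderings, two distinct destinations.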
  • users of a web conference in real space and users of a web conference in virtual space can both participate in the web conference.
  • FIG. 1 is a block diagram showing the overall configuration of a conference system 1.
  • FIG. 2A is a block diagram showing a configuration example of a terminal device 30-3.
  • FIG. 2B is a block diagram showing a configuration example of a terminal device 30-1.
  • FIG. 3 is a diagram showing an example of a virtual space VS displayed on displays 38A and 38B.
  • FIG. 4 is a diagram showing an example of a display image DP.
  • FIG. 5 is a diagram showing a method of generating an image showing a virtual space VS displayed on a terminal device 30-1 and an image showing the virtual space VS displayed on a terminal device 30-3.
  • FIG. 6 is a block diagram showing an example of the configuration of a second conference management device 20.
  • FIG. 7 is a functional block diagram of an avatar control unit 212.
  • FIG. 8 is a block diagram showing an example of the configuration of a first conference management device 10.
  • FIG. 9 is a sequence diagram showing the operation of the conference system 1.
  • FIG. 1 is a block diagram showing the overall configuration of a conference system 1.
  • the conference system 1 includes a first conference management device 10, a second conference management device 20, and terminal devices 30-1 to 30-4.
  • the first conference management device 10 manages a first web conference held in real space.
  • the first conference management device 10 provides a service related to the first web conference to a user who subscribes to the service.
  • a user U1 and a user U2 subscribe to a service related to the first web conference.
  • the first conference management device 10 is a server that provides the first web conference to the terminal device 30-1 and the terminal device 30-2.
  • the first web conference is a conference in which users in remote locations in real space can participate on the web.
  • the second conference management device 20 manages the second web conference held in a virtual space.
  • the second web conference is held in a virtual space such as a metaverse space.
  • the conference room used in the second web conference is a virtual conference room in the virtual space.
  • the second conference management device 20 provides a service related to the second web conference to a user who subscribes to the service. In the example shown in FIG. 1, user U3 and user U4 subscribe to the service related to the second web conference.
  • the second conference management device 20 is a server that provides a web conference in a virtual space to terminal device 30-3 and terminal device 30-4.
  • the second conference management device 20 is an example of a "management device".
  • the first conference management device 10 and the second conference management device 20 are communicatively connected to each other via the communication network NET.
  • the first conference management device 10 and the terminal device 30-1 and the terminal device 30-2 are communicatively connected to each other.
  • the second conference management device 20 and the terminal device 30-3 and the terminal device 30-4 are communicatively connected to each other.
  • the terminal device 30-1 is a device through which the user U1 participates in the first web conference in the real space provided by the first conference management device 10.
  • the terminal device 30-2 is a device through which the user U2 participates in the first web conference.
  • the users U1 and U2 participate in the first web conference using, for example, an application such as ZOOM or Teams. Note that ZOOM and Teams are registered trademarks.
  • the terminal devices 30-1 and 30-2 may be a PC (Personal Computer), a smartphone, or a tablet.
  • the terminal devices 30-1 and 30-2 display two-dimensional images.
  • the terminal device 30-3 is a device for user U3 to participate in the second web conference held in the virtual space.
  • the terminal device 30-4 is a device for user U4 to participate in the second web conference.
  • the terminal devices 30-3 and 30-4 are, for example, goggle-type head-mounted displays (HMDs).
  • the terminal devices 30-3 and 30-4 display three-dimensional images.
  • two terminal devices 30-1 and 30-2 are connected to the first conference management device 10.
  • the number of terminal devices connected to the first conference management device 10 is not limited to two and can be any number.
  • two terminal devices 30-3 and 30-4 are connected to the second conference management device 20.
  • the number of terminal devices connected to the second conference management device 20 is not limited to two and can be any number.
  • Fig. 2A is a block diagram showing a configuration example of the terminal device 30-3.
  • the terminal device 30-3 includes a processing device 31A, a storage device 32A, an input device 33A, a communication device 34, a sound pickup device 35, a speaker 36, a display 38A, and a display 38B.
  • Each element of the terminal device 30-3 is connected to the others by one or more buses for communicating information.
  • the processing device 31A is a processor that controls the entire terminal device 30-3.
  • the processing device 31A includes, for example, one or more chips.
  • the processing device 31A also includes, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, and registers.
  • Some or all of the functions of the processing device 31A may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 31A executes various processes in parallel or sequentially.
  • the storage device 32A is a recording medium that can be read and written by the processing device 31A.
  • the storage device 32A also stores a number of programs including the control program PR3A executed by the processing device 31A.
  • the storage device 32A functions as a work area for the processing device 31A.
  • the input device 33A accepts operations from the user U3.
  • the input device 33A includes a touch panel.
  • the input device 33A may also include an imaging device.
  • the imaging device is, for example, a camera.
  • the input device 33A detects a gesture of the user U3 based on an image captured by the imaging device.
  • the input device 33A outputs an operation signal indicating the detected gesture as input information to the processing device 31A.
  • the communication device 34 is hardware that functions as a transmitting/receiving device for communicating with other devices.
  • the communication device 34 is also called, for example, a network device, a network controller, a network card, or a communication module.
  • the communication device 34 may include a connector and an interface circuit for a wired connection, and may further include a wireless communication interface. Examples of connectors and interface circuits for wired connections include products that comply with wired LAN, IEEE 1394, and USB. Examples of wireless communication interfaces include products that comply with wireless LAN and Bluetooth (registered trademark).
  • the sound pickup device 35 includes, for example, a microphone.
  • the sound pickup device 35 picks up the voice of the user U3.
  • the sound pickup device 35 supplies audio data indicating the picked-up voice to the processing device 31A.
  • the speaker 36 is a device that emits audio.
  • the speaker 36 emits various types of audio under the control of the processing device 31A.
  • the processing device 31A converts audio data, which is a digital signal, into an audio signal, which is an analog signal.
  • the amplitude of the audio signal is amplified by an amplifier (not shown).
  • the speaker 36 emits audio represented by the audio signal having the amplified amplitude.
  • the speaker 36 emits the audio of the web conference.
  • Displays 38A and 38B are devices that display images. Displays 38A and 38B display various images under the control of processing device 31A. Display 38A displays an image for the left eye. Display 38B displays an image for the right eye. The image for the left eye and the image for the right eye are different images that take into account the parallax between the left eye and the right eye. These images are generated by processing device 31A based on a three-dimensional image received from second conference management device 20. In this embodiment, displays 38A and 38B display an image of a virtual conference room where the second web conference is held, as described below.
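The left-eye/right-eye image pair described above can be illustrated with a toy pinhole projection: two virtual eye cameras separated by an interpupillary offset view the same 3-D point, and the horizontal disparity between the two projections is what conveys depth. The offset, focal length, and function names here are assumptions for illustration, not values from the application.

```python
# Two virtual eye cameras separated along x by an interpupillary distance.
IPD = 0.064  # metres; illustrative value

def project(point, eye_x, focal=1.0):
    """Pinhole-project a 3-D point (x, y, z) onto an eye camera at x = eye_x."""
    x, y, z = point
    return (focal * (x - eye_x) / z, focal * y / z)

def stereo_pair(point):
    left = project(point, -IPD / 2)   # image for display 38A (left eye)
    right = project(point, +IPD / 2)  # image for display 38B (right eye)
    return left, right

left, right = stereo_pair((0.0, 0.0, 2.0))
# Horizontal disparity between the two views; it shrinks as z grows,
# which is how the pair encodes the depth of the conference room.
disparity = left[0] - right[0]
```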
  • the processing device 31A reads out the control program PR3A from the storage device 32A.
  • the processing device 31A executes the read control program PR3A, thereby functioning as a communication control unit 311A and a display control unit 312.
  • the communication control unit 311A causes the communication device 34 to transmit, to the second conference management device 20, the input information input by user U3 operating the input device 33A and the audio information acquired from the sound pickup device 35.
  • the communication control unit 311A also causes the communication device 34 to receive conference information for holding the second web conference from the second conference management device 20.
  • the conference information includes image information showing an image of the virtual conference room, image information showing images of the avatars of users U3 and U4, video information showing live-action video of users U1 and U2, and audio information of users U1 to U4.
  • the display control unit 312 uses the conference information received by the communication device 34 by the communication control unit 311A to display a three-dimensional image of a virtual space showing a virtual conference room on the displays 38A and 38B.
  • FIG. 3 is a diagram showing an example of a virtual space VS displayed on displays 38A and 38B.
  • a long table T is installed in the virtual space VS, which serves as a virtual conference room.
  • Two chairs C1 and C2 are arranged on either side of the long table T.
  • Avatar A3 corresponding to user U3 is displayed sitting in chair C1.
  • avatar A4 corresponding to user U4 is displayed sitting in chair C2.
  • a panel P is installed on the wall W in front of avatars A3 and A4, on which a live-action image R1 of user U1 and a live-action image R2 of user U2 are displayed.
  • Live-action images R1 and R2 are two-dimensional images serving as virtual objects.
  • user U3 can participate in the second web conference using avatar A3 corresponding to the user.
  • User U4 can participate in the second web conference using avatar A4 corresponding to the user.
  • each of users U1 and U2 can participate in the second web conference even if they are not subscribed to the service related to the second web conference.
  • terminal device 30-3 has been described with reference to Figures 2A and 3, but terminal device 30-4 has a similar configuration to terminal device 30-3.
  • FIG. 2B is a block diagram showing an example configuration of terminal device 30-1.
  • the same reference numerals are used for components common to terminal device 30-1 and terminal device 30-3, and their explanation is omitted.
  • the following mainly describes the differences between terminal device 30-1 and terminal device 30-3.
  • the terminal device 30-1 has an input device 33B instead of the input device 33A of the terminal device 30-3.
  • the terminal device 30-1 also has an imaging device 37, unlike the terminal device 30-3.
  • the imaging device 37 is, for example, a camera.
  • the terminal device 30-1 also has a single display 38 instead of the displays 38A and 38B of the terminal device 30-3.
  • the terminal device 30-1 also has a storage device 32B instead of the storage device 32A of the terminal device 30-3.
  • the storage device 32B stores the control program PR3B instead of the control program PR3A stored in the storage device 32A.
  • the terminal device 30-1 also has a processing device 31B instead of the processing device 31A.
  • the processing device 31B has a communication control device 311B instead of the communication control device 311A of the processing device 31A.
  • the input device 33B accepts operations from the user U1.
  • the input device 33B includes a keyboard, a touchpad, a touch panel, or a pointing device such as a mouse.
  • when the input device 33B includes a touch panel, it may also serve as the display 38.
  • the imaging device 37 outputs imaging information indicating an image obtained by imaging the outside world.
  • the imaging device 37 includes, for example, a lens, an imaging element, an amplifier, and an AD converter. Light collected through the lens is converted by the imaging element into an imaging signal, which is an analog signal.
  • the amplifier generates an amplified imaging signal by amplifying the imaging signal.
  • the AD converter converts the amplified imaging signal, which is an analog signal, into imaging data, which is a digital signal.
  • the imaging information including the imaging data is supplied to the processing device 31B.
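The amplify-then-AD-convert pipeline of the imaging device 37 can be sketched numerically: the imaging signal is scaled by the amplifier, then clamped to the converter's range and quantized into integer codes. The gain, bit depth, and sample values below are illustrative assumptions, not figures from the application.

```python
def amplify(signal, gain=4.0):
    """Amplifier: scale the analog imaging signal (values are stand-ins)."""
    return [s * gain for s in signal]

def ad_convert(signal, bits=8, full_scale=1.0):
    """AD converter: clamp to the input range and quantize to integer codes."""
    levels = (1 << bits) - 1
    return [round(max(0.0, min(full_scale, s)) / full_scale * levels)
            for s in signal]

analog = [0.01, 0.1, 0.2, 0.5]          # signal from the imaging element
digital = ad_convert(amplify(analog))   # imaging data supplied to 31B
```

Note the last sample saturates: once amplified past full scale it clamps to the maximum code, which is the usual behavior of a real converter.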
  • the display 38 is a device that displays images.
  • the display 38 displays various images under the control of the processing device 31B.
  • the display image DP, which will be described later, is displayed on the display 38.
  • the display image DP is a two-dimensional image.
  • the communication control unit 311B causes the communication device 34 to receive participation data for participating in the second web conference held in the virtual space from the first conference management device 10.
  • the participation data is originally data received by the first conference management device 10 from the second conference management device 20.
  • the participation data is, for example, text data with a link for participating in the second web conference.
  • User U1 participates in the second web conference by operating the input device 33B and clicking on the link.
  • the terminal device 30-1 can participate in the first web conference under the control of the first conference management device 10, and can also participate in the second web conference under the control of the second conference management device 20.
  • FIG. 4 is a diagram showing an example of a display image DP that the display control unit 312 of the terminal device 30-1 displays on the display 38.
  • the display image DP includes an image showing the virtual space VS.
  • the image showing the virtual space VS included in the display image DP is an image of the virtual space VS viewed from the direction opposite to the one from which the image shown in FIG. 3 is obtained.
  • accordingly, the fronts of the avatar A3 and the avatar A4 are displayed.
  • the user U1 of the terminal device 30-1 can visually recognize the facial expressions and movements of the avatar A3 and the avatar A4.
  • a live-action image R1 of the user U1 and a live-action image R2 of the user U2 are displayed in the lower right corner of the display image DP.
  • the display image DP is a two-dimensional image.
  • FIG. 5 is a diagram showing a method of generating an image showing the virtual space VS displayed on terminal device 30-1, and an image showing the virtual space VS displayed on terminal device 30-3.
  • a virtual first camera M1 is placed in front of avatar A3 and avatar A4.
  • a virtual second camera M2 is placed behind avatar A3 and avatar A4.
  • the first camera M1 is an example of a "first virtual camera."
  • the second camera M2 is an example of a "second virtual camera."
  • an image generated by the first camera M1 is displayed on the terminal device 30-1 as an image showing the virtual space VS.
  • the image generated by the first camera M1 is a two-dimensional image obtained by the first camera M1 capturing an image of the conference room in the virtual space VS.
  • an image generated by the second camera M2 is displayed on the terminal device 30-3 as an image showing the virtual space VS.
  • the image generated by the second camera M2 is a three-dimensional image obtained by the second camera M2 capturing an image of the conference room in the virtual space VS.
  • the real-life image R1 of user U1 and the real-life image R2 of user U2 included in the three-dimensional image shown in FIG. 3 are two-dimensional images.
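The effect of the two camera placements in FIG. 5 can be illustrated with a toy one-dimensional visibility model: each camera sees only the objects lying in front of it along the room's depth axis, so the panel P (and hence R1 and R2) appears only in the second camera's image, which matches the description of the first image information later in the text. All coordinates and the layout below are illustrative assumptions.

```python
# Objects in the virtual conference room, keyed by depth along one axis.
scene = {
    "panel P":   0.0,  # wall-mounted panel showing live-action images R1, R2
    "avatar A3": 2.0,
    "avatar A4": 2.0,
    "chairs":    2.1,
    "long table T": 3.0,
}

def capture(camera_pos, facing):
    """Return the objects in front of a camera; facing=+1 looks toward
    larger depth values, facing=-1 toward smaller ones."""
    return sorted(obj for obj, d in scene.items()
                  if (d - camera_pos) * facing > 0)

# First camera M1: placed in front of the avatars, looking back at them,
# so the panel behind it is outside its field of view.
image_m1 = capture(camera_pos=1.0, facing=+1)
# Second camera M2: placed behind the avatars, looking toward the panel,
# so R1 and R2 appear in the image sent to the VR terminals.
image_m2 = capture(camera_pos=4.0, facing=-1)
```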
  • terminal device 30-1 has been described with reference to Figures 2B, 4, and 5, but terminal device 30-2 has the same configuration as terminal device 30-1.
  • the second conference management device 20 includes a processing device 21, a storage device 22, an input device 23, a communication device 24, and a display 25.
  • the elements of the second conference management device 20 are connected to each other by one or more buses for communicating information.
  • the processing device 21 is a processor that controls the entire second conference management device 20.
  • the processing device 21 includes, for example, one or more chips.
  • the processing device 21 also includes, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, and registers. Some or all of the functions of the processing device 21 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 21 executes various processes in parallel or sequentially.
  • the storage device 22 is a recording medium that can be read from and written to by the processing device 21.
  • the storage device 22 also stores a plurality of programs including a control program PR2 that the processing device 21 executes.
  • the storage device 22 also stores an image information database PD.
  • the image information database PD stores image information for generating an image showing the virtual space VS.
  • the image information includes image information showing an image of a virtual conference room in the virtual space VS, image information showing an image of the avatar A3, and image information showing an image of the avatar A4.
  • the image of the virtual conference room includes a three-dimensional image of the long table T, a three-dimensional image of the chairs C1 and C2, a three-dimensional image of the wall W, and a three-dimensional image of the panel P.
  • the image information database PD also stores placement location information showing the placement locations of the three-dimensional image of the long table T, the three-dimensional image of the chairs C1 and C2, the three-dimensional image of the wall W, and the three-dimensional image of the panel P in the virtual conference room in the virtual space VS shown in FIG. 3.
  • the storage device 22 functions as a work area for the processing device 21.
  • the input device 23 accepts operations from the administrator of the second conference management device 20.
  • the input device 23 includes a keyboard, a touchpad, a touch panel, or a pointing device such as a mouse.
  • the communication device 24 is hardware that functions as a transmitting/receiving device for communicating with other devices.
  • the communication device 24 is also called, for example, a network device, a network controller, a network card, or a communication module.
  • the communication device 24 may also be equipped with a wireless communication interface. Examples of connectors and interface circuits for wired connections include products that comply with wired LAN, IEEE 1394, and USB. Examples of wireless communication interfaces include products that comply with wireless LAN and Bluetooth (registered trademark), etc.
  • the display 25 is a device that displays images.
  • the display 25 displays various images under the control of the processing device 21.
  • the processing device 21 reads out the control program PR2 from the storage device 22.
  • the processing device 21 functions as a space generation unit 211, an avatar control unit 212, a movement control unit 213, a first image generation unit 214, a first audio acquisition unit 215, a second image generation unit 216, a second audio acquisition unit 217, and a communication control unit 218.
  • the space generation unit 211 generates a three-dimensional image showing a virtual conference room in the virtual space VS.
  • the three-dimensional image includes a three-dimensional image of the long table T, a three-dimensional image of chairs C1 and C2, a three-dimensional image of the wall W, and a three-dimensional image of the panel P.
  • the space generation unit 211 generates a three-dimensional image showing the virtual conference room using the image information and placement information of these three-dimensional images stored in the image information database PD.
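The lookup-and-place step performed by the space generation unit can be sketched as follows, assuming the image information database maps each object to image data and the placement information maps it to coordinates. All identifiers, the stand-in mesh names, and the coordinates are illustrative.

```python
# Stand-in for the image information database PD: object -> 3-D image data.
image_db = {
    "long table T": "table_mesh",
    "chairs C1/C2": "chair_mesh",
    "wall W":       "wall_mesh",
    "panel P":      "panel_mesh",
}
# Stand-in for the placement location information: object -> (x, y, z).
placement = {
    "long table T": (0, 0, 0),
    "chairs C1/C2": (0, 0, 1),
    "wall W":       (0, 0, -3),
    "panel P":      (0, 1, -3),  # mounted on the wall, above floor level
}

def generate_space():
    """Combine each object's image data with its placement to build the
    virtual conference room."""
    return {obj: {"mesh": image_db[obj], "pos": placement[obj]}
            for obj in image_db}

room = generate_space()
```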
  • the avatar control unit 212 displays avatar A3 and avatar A4 in a virtual conference room in the virtual space VS.
  • FIG. 7 is a functional block diagram of the avatar control unit 212.
  • the avatar control unit 212 includes an avatar generation unit 212-1 and an action control unit 212-2.
  • Avatar generation unit 212-1 generates a three-dimensional image of avatar A3 using image information corresponding to avatar A3 stored in image information database PD. Similarly, avatar generation unit 212-1 generates a three-dimensional image of avatar A4 using image information corresponding to avatar A4 stored in image information database PD. Furthermore, avatar generation unit 212-1 displays the three-dimensional images of avatar A3 and avatar A4 in virtual space VS.
  • the motion control unit 212-2 controls the motion of avatar A3 and avatar A4. As an example, the motion control unit 212-2 controls the motion of avatar A3 based on the operation of user U3 on terminal device 30-3. Similarly, the motion control unit 212-2 controls the motion of avatar A4 based on the operation of user U4 on terminal device 30-4.
  • the movement control unit 213 moves at least one of the first camera M1 and the second camera M2 in the virtual space VS. More specifically, the movement control unit 213 controls the placement locations of the first camera M1 and the second camera M2 in a virtual conference room in the virtual space VS, and the movement direction and movement speed of each of the first camera M1 and the second camera M2. The movement control unit 213 also controls the imaging direction of the first camera M1 and the second camera M2. Furthermore, the movement control unit 213 controls the zoom function of the first camera M1 and the second camera M2, i.e., the magnification ratio of the captured image.
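The camera state managed by the movement control unit (placement, movement direction and speed, imaging direction, and zoom) can be modeled as a small state object. The attribute names and the linear update rule are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    position: tuple                       # placement in the conference room
    direction: tuple = (0.0, 0.0, 1.0)    # imaging direction
    velocity: tuple = (0.0, 0.0, 0.0)     # movement direction times speed
    zoom: float = 1.0                     # magnification ratio of the image

    def step(self, dt):
        # Advance the camera along its velocity vector for dt seconds.
        self.position = tuple(p + v * dt
                              for p, v in zip(self.position, self.velocity))

m1 = VirtualCamera(position=(0.0, 1.5, 1.0))
m1.velocity = (0.0, 0.0, 0.5)  # set a movement direction and speed
m1.step(dt=2.0)                # the camera moves 1.0 along z
m1.zoom = 2.0                  # double the magnification of the captured image
```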
  • the first image generating unit 214 generates image information showing a two-dimensional image obtained by the first camera M1 capturing an image of the conference room in the virtual space VS.
  • the first image generating unit 214 is an example of a "first generation unit."
  • the image information generated by the first image generating unit 214 is an example of "first image information."
  • the display image DP is displayed on the display 38 provided on the terminal device 30-1 and the terminal device 30-2.
  • the first image generating unit 214 generates image information showing a two-dimensional image of the virtual space VS included in the display image DP.
  • the image information shows a two-dimensional image obtained by the first camera M1 capturing an image of the virtual space VS.
  • the image information as the "first image information" does not include the live-action image R1 of the user U1 and the live-action image R2 of the user U2 shown in FIG. 3.
  • the first audio acquisition unit 215 acquires audio in the first web conference held in real space. Furthermore, the "audio in the first web conference held in real space" is an example of the "first audio." Furthermore, the first audio acquisition unit 215 is an example of the "first acquisition unit."
  • the sound pickup device 35 of terminal device 30-1 picks up the voice of user U1.
  • the sound pickup device 35 also supplies the picked up voice of user U1 to the processing device 31B.
  • the processing device 31B transmits voice information indicating the voice of user U1 to the first conference management device 10 via the communication device 34.
  • the processing device 31B of terminal device 30-2 transmits voice information indicating the voice of user U2 to the first conference management device 10 via the communication device 34.
  • the first conference management device 10 transmits voice information indicating the voice of user U1 and voice information indicating the voice of user U2 to the second conference management device 20 via the communication network NET, as described below.
  • the first voice acquisition unit 215 acquires voice information indicating the voice of user U1 and voice information indicating the voice of user U2 via the communication device 24.
  • the second image generating unit 216 generates image information showing a three-dimensional image obtained by capturing an image of the conference room in the virtual space VS with the second camera M2.
  • the second image generating unit 216 is an example of a "second generation unit."
  • the image information generated by the second image generating unit 216 is an example of "second image information."
  • the above three-dimensional image includes a live-action image R1 of user U1 and a live-action image R2 of user U2. That is, the image information as the "second image information" includes the live-action image R1 of user U1 and the live-action image R2 of user U2 shown in FIG. 3.
  • These live-action images R1 and R2 are examples of "virtual objects showing two-dimensional images related to the first web conference.”
  • a three-dimensional image showing the virtual space VS illustrated in FIG. 3 is displayed on the displays 38A and 38B of the terminal devices 30-3 and 30-4.
  • the second image generator 216 generates image information showing the three-dimensional image.
  • the second audio acquisition unit 217 acquires audio in the second web conference held in the virtual space VS. Furthermore, the "audio in the second web conference held in the virtual space VS" is an example of the "second audio." Furthermore, the second audio acquisition unit 217 is an example of the "second acquisition unit."
  • the sound pickup device 35 of terminal device 30-3 picks up the voice of user U3.
  • the sound pickup device 35 also supplies voice information indicating the picked-up voice of user U3 to the processing device 31A.
  • the processing device 31A transmits the voice information indicating the voice of user U3 to the second conference management device 20 via the communication device 34.
  • the processing device 31A of terminal device 30-4 transmits voice information indicating the voice of user U4 to the second conference management device 20 via the communication device 34.
  • the second voice acquisition unit 217 acquires the voice information indicating the voice of user U3 and the voice information indicating the voice of user U4 via the communication device 24.
  • the communication control unit 218 causes the communication device 24 to transmit the first image information generated by the first image generation unit 214 to the first conference management device 10 as an external device.
  • the communication control unit 218 also causes the communication device 24 to receive first imaging information indicating the live-action image R1 from the terminal device 30-1.
  • the communication control unit 218 also causes the communication device 24 to receive second imaging information indicating the live-action image R2 from the terminal device 30-2.
  • the communication control unit 218 also causes the communication device 24 to transmit the second image information generated by the second image generation unit 216 to the terminal devices 30-3 and 30-4.
  • the communication control unit 218 also causes the communication device 24 to transmit first audio information indicating the first audio to the terminal devices 30-3 and 30-4.
  • the communication control unit 218 also causes the communication device 24 to transmit second audio information indicating the second audio to the first conference management device 10 as an external device.
  • the communication control unit 218 is an example of a "communication unit.”
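The routing performed by the communication control unit 218 can be sketched as follows. This is a minimal, hypothetical Python sketch (the names `CommunicationControlUnit218`, `first_mgmt_10`, etc. are illustrative only); the actual unit drives the communication device 24 rather than in-memory lists.

```python
class CommunicationControlUnit218:
    """Hypothetical sketch of the routing done by communication control unit 218.

    Each peer is modeled as a list of received (label, payload) messages.
    """

    def __init__(self):
        # outboxes keyed by destination name
        self.sent = {"first_mgmt_10": [], "terminal_30_3": [], "terminal_30_4": []}

    def send_first_image(self, first_image_info):
        # first image information PI1 goes to the first conference
        # management device 10 (the external device)
        self.sent["first_mgmt_10"].append(("PI1", first_image_info))

    def send_second_image(self, second_image_info):
        # second image information PI2 goes to the terminal devices
        # participating in the second web conference
        for t in ("terminal_30_3", "terminal_30_4"):
            self.sent[t].append(("PI2", second_image_info))

    def send_first_audio(self, first_audio_info):
        # first audio (from the real-space conference) is forwarded to
        # terminal devices 30-3 and 30-4
        for t in ("terminal_30_3", "terminal_30_4"):
            self.sent[t].append(("SI1", first_audio_info))

    def send_second_audio(self, second_audio_info):
        # second audio (from the virtual-space conference) is forwarded
        # to the external device
        self.sent["first_mgmt_10"].append(("SI2", second_audio_info))
```

The sketch only captures the fan-out pattern: image and audio of the virtual space flow outward to the external device, while the second image and first audio flow to the virtual-space terminals.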
  • the first conference management device 10 includes a processing device 11, a storage device 12, an input device 13, a communication device 14, and a display 15.
  • the elements of the first conference management device 10 are connected to each other by one or more buses for communicating information.
  • the processing device 11 is a processor that controls the entire first conference management device 10.
  • the processing device 11 includes, for example, one or more chips.
  • the processing device 11 also includes, for example, a central processing unit (CPU) that includes an interface with peripheral devices, an arithmetic unit, and registers.
  • Some or all of the functions of the processing device 11 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array).
  • the processing device 11 executes various processes in parallel or sequentially.
  • the storage device 12 is a recording medium that can be read from and written to by the processing device 11.
  • the storage device 12 also stores a plurality of programs including a control program PR1 that the processing device 11 executes.
  • the storage device 12 also stores application data AD.
  • the application data AD is data necessary for the terminal device 30-1 and the terminal device 30-2 to use an application for conducting a web conference, such as ZOOM or Teams.
  • the storage device 12 functions as a work area for the processing device 11.
  • the input device 13 accepts operations from the administrator of the first conference management device 10.
  • the input device 13 includes a keyboard, a touchpad, a touch panel, or a pointing device such as a mouse.
  • the input device 13 may also serve as the display 15.
  • the communication device 14 is hardware that functions as a transmitting/receiving device for communicating with other devices.
  • the communication device 14 is also called, for example, a network device, a network controller, a network card, a communication module, etc.
  • the communication device 14 may include a connector and an interface circuit for wired connection, or a wireless communication interface. Examples of connectors and interface circuits for wired connection include products that comply with wired LAN, IEEE 1394, and USB; examples of wireless communication interfaces include products that comply with wireless LAN and Bluetooth (registered trademark).
  • the display 15 is a device that displays images.
  • the display 15 displays various images under the control of the processing device 11.
  • the processing device 11 reads the control program PR1 from the storage device 12.
  • the processing device 11 functions as a communication control unit 111 by executing the read control program PR1.
  • the communication control unit 111 causes the communication device 14 to receive participation data for participating in the second web conference held in the virtual space from the second conference management device 20.
  • the communication control unit 111 also causes the communication device 14 to transmit the participation data received from the second conference management device 20 to the terminal device 30-1 and terminal device 30-2.
  • the communication control unit 111 also causes the communication device 14 to receive the first image information from the second conference management device 20.
  • the communication control unit 111 also causes the communication device 14 to transmit the first image information received from the second conference management device 20 to the terminal devices 30-1 and 30-2.
  • the communication control unit 111 also causes the communication device 14 to receive the first audio information from the terminal devices 30-1 and 30-2.
  • the communication control unit 111 also causes the communication device 14 to transmit the first audio information received from the terminal devices 30-1 and 30-2 to the second conference management device 20.
  • the communication control unit 111 also causes the communication device 14 to receive the second audio information from the second conference management device 20.
  • the communication control unit 111 also causes the communication device 14 to transmit the second audio information received from the second conference management device 20 to the terminal devices 30-1 and 30-2.
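The relay behavior of the communication control unit 111 can be summarized as a routing table. The sketch below is hypothetical (names such as `route_111` and `second_mgmt_20` are illustrative, and RI2 from terminal device 30-2 is assumed to be relayed analogously to RI1): the first conference management device 10 forwards traffic from the second conference management device 20 to the real-space terminals, and vice versa.

```python
def route_111(kind, source):
    """Hypothetical routing table for communication control unit 111 of the
    first conference management device 10, which relays traffic between the
    second conference management device 20 and terminal devices 30-1/30-2."""
    to_terminals = ["terminal_30_1", "terminal_30_2"]
    to_second = ["second_mgmt_20"]
    routes = {
        # participation data, the first image PI1, and the second audio SI2
        # flow toward the real-space terminals
        ("participation_data", "second_mgmt_20"): to_terminals,
        ("PI1", "second_mgmt_20"): to_terminals,
        ("SI2", "second_mgmt_20"): to_terminals,
        # imaging information and the first audio SI1 flow toward the
        # second conference management device
        ("RI1", "terminal_30_1"): to_second,
        ("RI2", "terminal_30_2"): to_second,
        ("SI1", "terminal_30_1"): to_second,
        ("SI1", "terminal_30_2"): to_second,
    }
    return routes.get((kind, source), [])
```

For example, `route_111("PI1", "second_mgmt_20")` yields both real-space terminals, while an unknown (kind, source) pair yields an empty list.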
  • FIG. 9 is a sequence diagram showing the operation of the conference system 1 according to the embodiment.
  • in step S2, the processing device 21 included in the second conference management device 20 functions as the avatar control unit 212.
  • the processing device 21 generates a three-dimensional image showing avatar A3 and avatar A4.
  • in step S3, the processing device 21 included in the second conference management device 20 functions as the first image generation unit 214.
  • the processing device 21 generates the first image information PI1.
  • in step S4, the processing device 21 provided in the second conference management device 20 functions as the communication control unit 218.
  • the processing device 21 causes the communication device 24 to transmit the first image information PI1 to the first conference management device 10.
  • the processing device 11 provided in the first conference management device 10 functions as a communication control unit 111.
  • the processing device 11 causes the communication device 14 to receive the first image information PI1 from the second conference management device 20.
  • in step S5, the processing device 11 provided in the first conference management device 10 functions as the communication control unit 111.
  • the processing device 11 causes the communication device 14 to transmit the first image information PI1 to the terminal device 30-1.
  • the processing device 31B provided in the terminal device 30-1 functions as a communication control unit 311B.
  • the processing device 31B causes the communication device 34 to receive the first image information PI1 from the first conference management device 10.
  • in step S6, the processing device 31B provided in the terminal device 30-1 functions as the communication control unit 311B.
  • the processing device 31B causes the communication device 34 to transmit the first imaging information RI1 to the first conference management device 10.
  • the processing device 11 provided in the first conference management device 10 functions as a communication control unit 111.
  • the processing device 11 causes the communication device 14 to receive the first imaging information RI1 from the terminal device 30-1.
  • in step S7, the processing device 11 provided in the first conference management device 10 functions as the communication control unit 111.
  • the processing device 11 causes the communication device 14 to transmit the first imaging information RI1 to the second conference management device 20.
  • the processing device 21 provided in the second conference management device 20 functions as a communication control unit 218.
  • the processing device 21 causes the communication device 24 to receive the first imaging information RI1 from the first conference management device 10.
  • in step S8, the processing device 21 included in the second conference management device 20 functions as the second image generation unit 216.
  • the processing device 21 generates the second image information PI2.
  • in step S9, the processing device 21 provided in the second conference management device 20 functions as the communication control unit 218.
  • the processing device 21 causes the communication device 24 to transmit the second image information PI2 to the terminal device 30-3.
  • the processing device 31A provided in the terminal device 30-3 functions as the communication control unit 311A.
  • the processing device 31A causes the communication device 34 to receive the second image information PI2 from the second conference management device 20.
  • in step S10, the processing device 31B provided in the terminal device 30-1 functions as the communication control unit 311B.
  • the processing device 31B causes the communication device 34 to transmit first voice information SI1 indicating the first voice to the first conference management device 10.
  • the processing device 11 provided in the first conference management device 10 functions as a communication control unit 111.
  • the processing device 11 causes the communication device 14 to receive the first voice information SI1 from the terminal device 30-1.
  • in step S11, the processing device 11 provided in the first conference management device 10 functions as the communication control unit 111.
  • the processing device 11 causes the communication device 14 to transmit the first audio information SI1 to the second conference management device 20.
  • the processing device 21 provided in the second conference management device 20 functions as a communication control unit 218.
  • the processing device 21 causes the communication device 24 to receive the first audio information SI1 from the first conference management device 10.
  • in step S12, the processing device 31A provided in the terminal device 30-3 functions as the communication control unit 311A.
  • the processing device 31A causes the communication device 34 to transmit second voice information SI2 indicating the second voice to the second conference management device 20.
  • the processing device 21 provided in the second conference management device 20 functions as a communication control unit 218.
  • the processing device 21 causes the communication device 24 to receive the second voice information SI2 from the terminal device 30-3.
  • in step S13, the processing device 21 provided in the second conference management device 20 functions as the communication control unit 218.
  • the processing device 21 causes the communication device 24 to transmit the first audio information SI1 to the terminal device 30-3.
  • the processing device 31A provided in the terminal device 30-3 functions as the communication control unit 311A.
  • the processing device 31A causes the communication device 34 to receive the first audio information SI1 from the second conference management device 20.
  • in step S14, the processing device 21 provided in the second conference management device 20 functions as the communication control unit 218.
  • the processing device 21 causes the communication device 24 to transmit the second audio information SI2 to the first conference management device 10.
  • the processing device 11 provided in the first conference management device 10 functions as a communication control unit 111.
  • the processing device 11 causes the communication device 14 to receive the second audio information SI2 from the second conference management device 20.
  • in step S15, the processing device 11 provided in the first conference management device 10 functions as the communication control unit 111.
  • the processing device 11 causes the communication device 14 to transmit the second audio information SI2 to the terminal device 30-1.
  • the processing device 31B provided in the terminal device 30-1 functions as a communication control unit 311B.
  • the processing device 31B causes the communication device 34 to receive the second audio information SI2 from the first conference management device 10.
  • terminal device 30-2 performs the same operations as terminal device 30-1.
  • Terminal device 30-4 performs the same operations as terminal device 30-3. Furthermore, the order of steps S1 to S15 above may be changed as necessary.
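The message hops of the sequence above can be summarized as a table of (step, sender, receiver, payload) tuples. The sketch below is illustrative only (the peer names are hypothetical, and only the hops involving terminal devices 30-1 and 30-3 are listed; 30-2 and 30-4 behave analogously); the helper traces the path a given payload takes through the relay.

```python
# (step, sender, receiver, payload) for the message hops in steps S4 to S15
SEQUENCE = [
    ("S4",  "second_mgmt_20", "first_mgmt_10",  "PI1"),
    ("S5",  "first_mgmt_10",  "terminal_30_1",  "PI1"),
    ("S6",  "terminal_30_1",  "first_mgmt_10",  "RI1"),
    ("S7",  "first_mgmt_10",  "second_mgmt_20", "RI1"),
    ("S9",  "second_mgmt_20", "terminal_30_3",  "PI2"),
    ("S10", "terminal_30_1",  "first_mgmt_10",  "SI1"),
    ("S11", "first_mgmt_10",  "second_mgmt_20", "SI1"),
    ("S12", "terminal_30_3",  "second_mgmt_20", "SI2"),
    ("S13", "second_mgmt_20", "terminal_30_3",  "SI1"),
    ("S14", "second_mgmt_20", "first_mgmt_10",  "SI2"),
    ("S15", "first_mgmt_10",  "terminal_30_1",  "SI2"),
]

def trace(payload):
    """Return the chain of hops carrying a given payload, in step order."""
    return [(step, src, dst) for (step, src, dst, p) in SEQUENCE if p == payload]
```

Tracing SI1, for instance, shows the first voice traveling from terminal device 30-1 through the first conference management device 10 and the second conference management device 20 to terminal device 30-3.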
  • the second conference management device 20 communicates with the first conference management device 10 as an external device that manages the first web conference held in the real space, and manages the second web conference held in the virtual space VS.
  • the second conference management device 20 includes a first image generation unit 214 as a first generation unit, a second image generation unit 216 as a second generation unit, and a communication control unit 218 as a communication unit.
  • the first image generation unit 214 generates first image information PI1 indicating a two-dimensional image obtained by capturing an image of a conference room in the virtual space VS with a first camera M1 as a first virtual camera.
  • the second image generation unit 216 generates second image information PI2 indicating a three-dimensional image obtained by capturing, with a second camera M2 as a second virtual camera, an image of the conference room in the virtual space VS in which a virtual object indicating a two-dimensional image related to the first web conference output from the first conference management device 10 as an external device is arranged.
  • the communication control unit 218 transmits the first image information PI1 to the first conference management device 10 serving as an external device, and transmits the second image information PI2 to one or more terminal devices 30-3 to 30-4 participating in the second Web conference.
  • the second conference management device 20 has the above configuration, so that both users of a web conference in real space and users of a web conference in virtual space VS can participate in a web conference.
  • in the second web conference, a user in the virtual space VS participates in the web conference using an avatar.
  • conventionally, a user who had not yet generated an avatar could not participate in such a web conference.
  • with the above configuration, a user who has not yet generated an avatar can participate in a web conference held in the virtual space VS, in which avatars are used, as a user of a web conference in real space.
  • the second conference management device 20 includes a first voice acquisition unit 215 as a first acquisition unit, and a second voice acquisition unit 217 as a second acquisition unit.
  • the first voice acquisition unit 215 acquires the first voice in the first web conference.
  • the second voice acquisition unit 217 acquires the second voice in the second web conference.
  • the communication control unit 218 transmits the first voice to one or more terminal devices 30-1 to 30-2, and transmits the second voice to the first conference management device 10 as an external device.
  • the second conference management device 20 has the above configuration, so that users of a web conference in real space and users of a web conference in virtual space VS can converse using voice.
  • the second conference management device 20 includes a movement control unit 213.
  • the movement control unit 213 moves at least one of the first camera M1 as the first virtual camera and the second camera M2 as the second virtual camera in the virtual space VS.
  • images of the web conference from any viewpoint and any angle are displayed on the terminal device 30-1 used by user U1 of the web conference in the real space, and on the terminal device 30-2 used by user U2 of the web conference in the real space.
  • images of the web conference from any viewpoint and any angle are displayed on the terminal device 30-3 used by user U3 of the web conference in the virtual space VS, and on the terminal device 30-4 used by user U4 of the web conference in the virtual space VS.
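One simple way to realize "any viewpoint and any angle" is to move the virtual camera on an orbit around the conference table. The helper below is a hypothetical sketch of what the movement control unit 213 might compute (the function name and coordinate convention are assumptions, not part of the disclosure): it places a camera on a circle of a given radius around a center point at a given height.

```python
import math

def orbit_camera(center, radius, angle_deg, height):
    """Hypothetical helper: place a virtual camera on a circle around the
    conference table so the web conference can be viewed from any angle.

    center: (x, z) position of the table in the virtual space
    returns: (x, y, z) camera position, with y as the height axis
    """
    a = math.radians(angle_deg)
    return (center[0] + radius * math.cos(a),
            height,
            center[1] + radius * math.sin(a))
```

Sweeping `angle_deg` from 0 to 360 while keeping the camera aimed at `center` would yield the arbitrary-angle views described above for both the first camera M1 and the second camera M2.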
  • the second conference management device 20 includes an avatar control unit 212.
  • the avatar control unit 212 displays one or more avatars A3 to A4 in one-to-one correspondence with users U3 to U4 of one or more terminal devices 30-3 to 30-4 in the conference room in the virtual space VS.
  • since the second conference management device 20 has the above configuration, user U3 of the web conference in the virtual space VS can participate in the web conference using avatar A3 corresponding to himself. Similarly, user U4 of the web conference in the virtual space VS can participate in the web conference using avatar A4 corresponding to himself.
  • control program PR2 is stored in the storage device 22 of the second conference management device 20, but the control program PR2 may be manufactured or sold separately.
  • the control program PR2 may be provided to a purchaser, for example, by distributing a computer-readable recording medium such as a flash ROM on which the control program PR2 is written, or by distributing the control program PR2 by downloading it via a telecommunications line.
  • the space generation unit 211, the avatar control unit 212, the movement control unit 213, the first image generation unit 214, the first voice acquisition unit 215, the second image generation unit 216, the second voice acquisition unit 217, and the communication control unit 218 are all software modules.
  • any one, any two, or all of the space generation unit 211, the avatar control unit 212, the movement control unit 213, the first image generation unit 214, the first voice acquisition unit 215, the second image generation unit 216, the second voice acquisition unit 217, and the communication control unit 218 may be hardware modules.
  • the hardware modules are, for example, DSPs (Digital Signal Processors), ASICs (Application Specific Integrated Circuits), PLDs (Programmable Logic Devices), and FPGAs (Field Programmable Gate Arrays). Even if any one, any two, or all of the space generation unit 211, the avatar control unit 212, the movement control unit 213, the first image generation unit 214, the first audio acquisition unit 215, the second image generation unit 216, the second audio acquisition unit 217, and the communication control unit 218 are hardware modules, the same effect as in the above embodiment can be achieved.
  • storage device 12, storage device 22, storage device 32A, and storage device 32B are exemplified by ROM and RAM, but the storage devices may be flexible disks, magneto-optical disks (e.g., compact disks, digital versatile disks, Blu-ray (registered trademark) disks), smart cards, flash memory devices (e.g., cards, sticks, key drives), CD-ROMs (Compact Disc-ROMs), registers, removable disks, hard disks, floppy (registered trademark) disks, magnetic strips, databases, servers, or other suitable storage media.
  • the information, signals, etc. described may be represented using any of a variety of different technologies.
  • data, instructions, commands, information, signals, bits, symbols, chips, etc. that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
  • the input and output information, etc. may be stored in a specific location (e.g., memory) or may be managed using a management table.
  • the input and output information, etc. may be overwritten, updated, or added to.
  • the output information, etc. may be deleted.
  • the input information, etc. may be transmitted to another device.
  • the determination may be made based on a value (0 or 1) represented using one bit, a Boolean value (true or false), or a comparison of numerical values (e.g., a comparison with a predetermined value).
  • each function illustrated in Figures 1 to 9 is realized by any combination of at least one of hardware and software. Furthermore, there are no particular limitations on the method of realizing each functional block. That is, each functional block may be realized using one device that is physically or logically coupled, or may be realized using two or more devices that are physically or logically separated and connected directly or indirectly (e.g., using wires, wirelessly, etc.) and these multiple devices. A functional block may be realized by combining the one device or the multiple devices with software.
  • the programs exemplified in the above-described embodiments should be broadly construed to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, etc., regardless of whether they are called software, firmware, middleware, microcode, hardware description language, or by other names.
  • Software, instructions, information, etc. may also be transmitted and received via a transmission medium.
  • for example, if software is transmitted from a website, a server, or another remote source using at least one of wired technologies (such as coaxial cable, fiber optic cable, twisted pair, or Digital Subscriber Line (DSL)) and wireless technologies (such as infrared or microwave), then at least one of these wired and wireless technologies is included within the definition of a transmission medium.
  • the information, parameters, etc. described in this disclosure may be expressed using absolute values, may be expressed using relative values from a predetermined value, or may be expressed using other corresponding information.
  • the first conference management device 10, the second conference management device 20, and the terminal devices 30-1 to 30-4 may be mobile stations (MS).
  • a mobile station may also be referred to by those skilled in the art as a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a mobile device, a wireless device, a wireless communication device, a remote device, a mobile subscriber station, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, or some other suitable term.
  • the terms "mobile station", “user terminal”, “user equipment (UE)", “terminal”, etc. may be used interchangeably.
  • connection refers to any direct or indirect connection or coupling between two or more elements, and may include the presence of one or more intermediate elements between two elements that are “connected” or “coupled” to each other.
  • the coupling or connection between elements may be a physical coupling or connection, a logical coupling or connection, or a combination thereof. For example, “connected” may be read with "access”.
  • two elements may be considered to be “connected” or “coupled” to each other using at least one of one or more wires, cables, and printed electrical connections, as well as electromagnetic energy having wavelengths in the radio frequency range, microwave range, and light (both visible and invisible) range, as some non-limiting and non-exhaustive examples.
  • the phrase “based on” does not mean “based only on,” unless otherwise specified. In other words, the phrase “based on” means both “based only on” and “based at least on.”
  • "judging" and "determining" as used in this disclosure may encompass a wide variety of actions. "Judging" and "determining" may include, for example, regarding judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (e.g., searching in a table, database, or other data structure), and ascertaining as having "judged" or "determined". "Judging" and "determining" may also include regarding receiving (e.g., receiving information), transmitting (e.g., transmitting information), inputting, outputting, and accessing (e.g., accessing data in memory) as having "judged" or "determined".
  • "judging" and "determining" may further include regarding resolving, selecting, choosing, establishing, comparing, etc. as having "judged" or "determined". In other words, "judging" and "determining" may include regarding some action as having "judged" or "determined". Additionally, "judging (determining)" may be read as "assuming," "expecting," "considering," etc.
  • notification of specific information is not limited to being an explicit notification, but may be performed implicitly (e.g., not notifying the specific information).

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

This management device comprises: a first generation unit that generates first image information indicating a two-dimensional image that is obtained by imaging a conference room in a virtual space by means of a first virtual camera; a second generation unit that generates second image information indicating a three-dimensional image that is obtained by imaging the conference room in the virtual space by means of a second virtual camera; and a communication unit that transmits the first image information to an external device, and transmits the second image information to one or a plurality of terminal devices participating in a second web conference. In the conference room in the virtual space, a virtual object is disposed indicating a two-dimensional image that is output from the external device and that relates to a first web conference.

Description

Management Device

 The present invention relates to a management device.

 In recent years, systems that realize virtual web conferences in virtual spaces such as the metaverse have come into use, similar to web conferences in the real world.

 For example, Patent Document 1 discloses a system in which multiple avatars, which correspond one-to-one with multiple users, can participate in a web conference in a virtual space. In this system, the state of the web conference is distributed to the user terminals used by each of the multiple users.

JP 2020-149400 A

 However, in a web conference realized using the technology disclosed in Patent Document 1, all users had to participate in the web conference as avatars in a virtual space. Users who could participate in this web conference were limited to those who subscribed to the virtual space service. Therefore, users who were not subscribed to the virtual space service could not participate in a web conference held in a virtual space. In other words, users of a web conference in real space could not participate in a web conference held in a virtual space.

 The present disclosure therefore aims to provide a management device that allows users of a web conference in real space and users of a web conference in virtual space to both participate in the web conference.

 A management device according to a preferred embodiment of the present disclosure is a management device that communicates with an external device that manages a first web conference held in a real space and manages a second web conference held in a virtual space, and includes a first generation unit that generates first image information showing a two-dimensional image obtained by capturing an image of a conference room in the virtual space with a first virtual camera, a second generation unit that generates second image information showing a three-dimensional image obtained by capturing an image of a conference room in the virtual space with a second virtual camera, and a communication unit that transmits the first image information to the external device and transmits the second image information to one or more terminal devices participating in the second web conference, and a virtual object showing a two-dimensional image related to the first web conference output from the external device is placed in the conference room in the virtual space.

 According to the present disclosure, users of a web conference in real space and users of a web conference in virtual space can both participate in the web conference.
Brief description of the drawings: a block diagram showing the overall configuration of the conference system 1; a block diagram showing a configuration example of the terminal device 30-3; a block diagram showing a configuration example of the terminal device 30-1; a diagram showing an example of the virtual space VS displayed on the display 38; a diagram showing an example of the display image DP; a diagram showing a method of generating an image showing the virtual space VS displayed on the terminal device 30-1 and an image showing the virtual space VS displayed on the terminal device 30-3; a block diagram showing a configuration example of the second conference management device 20; a functional block diagram of the avatar control unit 212; a block diagram showing a configuration example of the first conference management device 10; and a sequence diagram showing the operation of the conference system 1.
 The following describes the conference system 1 according to the embodiment with reference to FIGS. 1 to 9.
1: Configuration of the Embodiment

1-1: Overall Configuration

 FIG. 1 is a block diagram showing the overall configuration of the conference system 1. As shown in FIG. 1, the conference system 1 includes a first conference management device 10, a second conference management device 20, and terminal devices 30-1 to 30-4. The first conference management device 10 manages a first web conference held in real space. The first conference management device 10 provides a service related to the first web conference to users who subscribe to the service. In the example shown in FIG. 1, a user U1 and a user U2 subscribe to the service related to the first web conference. More specifically, the first conference management device 10 is a server that provides the first web conference to the terminal device 30-1 and the terminal device 30-2. The first web conference is a conference in which users in remote locations in real space can participate on the web.

 The second conference management device 20 manages the second web conference held in a virtual space. The second web conference is held in a virtual space such as a metaverse space. The conference room used in the second web conference is a virtual conference room in the virtual space. The second conference management device 20 provides a service related to the second web conference to users who subscribe to the service. In the example shown in FIG. 1, user U3 and user U4 subscribe to the service related to the second web conference. More specifically, the second conference management device 20 is a server that provides a web conference in a virtual space to the terminal device 30-3 and the terminal device 30-4. The second conference management device 20 is an example of a "management device".
The first conference management device 10 and the second conference management device 20 are communicatively connected to each other via a communication network NET. The first conference management device 10 is communicatively connected to the terminal devices 30-1 and 30-2. The second conference management device 20 is communicatively connected to the terminal devices 30-3 and 30-4.
The terminal device 30-1 is a device through which the user U1 participates in the first web conference, held in real space and provided by the first conference management device 10. Similarly, the terminal device 30-2 is a device through which the user U2 participates in the first web conference. The users U1 and U2 participate in the first web conference using, for example, an application such as ZOOM or Teams. Note that ZOOM and Teams are registered trademarks. The terminal devices 30-1 and 30-2 may each be a PC (Personal Computer), a smartphone, or a tablet. The terminal devices 30-1 and 30-2 display two-dimensional images.
The terminal device 30-3 is a device through which the user U3 participates in the second web conference held in the virtual space. Similarly, the terminal device 30-4 is a device through which the user U4 participates in the second web conference. The terminal devices 30-3 and 30-4 are, for example, goggle-type head-mounted displays (HMD: Head-mounted Display). The terminal devices 30-3 and 30-4 display three-dimensional images.
In the example shown in FIG. 1, two terminal devices 30-1 and 30-2 are connected to the first conference management device 10. However, the number of terminal devices connected to the first conference management device 10 is not limited to two and may be any number. Similarly, two terminal devices 30-3 and 30-4 are connected to the second conference management device 20. However, the number of terminal devices connected to the second conference management device 20 is not limited to two and may be any number.
1-2: Configuration of the terminal devices
FIG. 2A is a block diagram showing a configuration example of the terminal device 30-3. The terminal device 30-3 includes a processing device 31A, a storage device 32A, an input device 33A, a communication device 34, a sound pickup device 35, a speaker 36, a display 38A, and a display 38B. The elements of the terminal device 30-3 are interconnected by one or more buses for communicating information.
The processing device 31A is a processor that controls the entire terminal device 30-3. The processing device 31A includes, for example, one or more chips. The processing device 31A also includes, for example, a central processing unit (CPU: Central Processing Unit) that includes an interface with peripheral devices, an arithmetic unit, and registers. Some or all of the functions of the processing device 31A may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array). The processing device 31A executes various processes in parallel or sequentially.
The storage device 32A is a recording medium that can be read from and written to by the processing device 31A. The storage device 32A stores a plurality of programs, including a control program PR3A executed by the processing device 31A. The storage device 32A also functions as a work area for the processing device 31A.
The input device 33A accepts operations from the user U3. For example, the input device 33A includes a touch panel. The input device 33A may also include an imaging device, for example a camera. When the input device 33A includes an imaging device, the input device 33A detects a gesture of the user U3 based on an image captured by the imaging device. The input device 33A outputs an operation signal indicating the detected gesture to the processing device 31A as input information.
The communication device 34 is hardware serving as a transmitting and receiving device for communicating with other devices. The communication device 34 is also called, for example, a network device, a network controller, a network card, or a communication module. The communication device 34 may further include a wireless communication interface. Examples of connectors and interface circuits for wired connection include products compliant with wired LAN, IEEE 1394, and USB. Examples of wireless communication interfaces include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
The sound pickup device 35 includes, for example, a microphone. The sound pickup device 35 picks up the voice of the user U3 and supplies audio data representing the picked-up voice to the processing device 31A.
The speaker 36 is a device that emits sound. The speaker 36 emits various sounds under the control of the processing device 31A. For example, audio data, which is a digital signal, is converted into an audio signal, which is an analog signal, by a DA converter (not shown). The amplitude of the audio signal is amplified by an amplifier (not shown). The speaker 36 emits the sound represented by the amplified audio signal. In this embodiment, the speaker 36 emits the audio of the web conference.
The displays 38A and 38B are devices that display images. The displays 38A and 38B display various images under the control of the processing device 31A. The display 38A displays an image for the left eye, and the display 38B displays an image for the right eye. The image for the left eye and the image for the right eye are mutually different images that take into account the parallax between the left eye and the right eye. The processing device 31A generates these images based on a three-dimensional image received from the second conference management device 20. In this embodiment, the displays 38A and 38B display an image of the virtual conference room where the second web conference is held, as described below.
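The left-eye and right-eye images described above differ by the horizontal offset between the two eye positions. As a minimal illustrative sketch (the function name, axis convention, and default interpupillary distance are assumptions, not part of the embodiment), the two viewpoints from which the virtual space VS would be rendered can be derived as follows:

```python
# Sketch: deriving the two eye viewpoints for a stereoscopic HMD.
# The names and the 64 mm default IPD are illustrative assumptions.

def eye_positions(head, right_axis, ipd=0.064):
    """Return (left, right) eye positions, offset by half the
    interpupillary distance along the head's right-pointing axis."""
    half = ipd / 2.0
    left = tuple(h - half * r for h, r in zip(head, right_axis))
    right = tuple(h + half * r for h, r in zip(head, right_axis))
    return left, right

# Each display (38A for the left eye, 38B for the right eye) would show
# the virtual space VS rendered from one of these two viewpoints.
left, right = eye_positions(head=(0.0, 1.6, 0.0), right_axis=(1.0, 0.0, 0.0))
```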
With the above configuration, the processing device 31A reads the control program PR3A from the storage device 32A. By executing the read control program PR3A, the processing device 31A functions as a communication control unit 311A and a display control unit 312.
The communication control unit 311A causes the communication device 34 to transmit, to the second conference management device 20, the input information entered by the user U3 operating the input device 33A and the audio information acquired from the sound pickup device 35. The communication control unit 311A also causes the communication device 34 to receive, from the second conference management device 20, conference information for holding the second web conference. The conference information includes image information representing an image of the virtual conference room, image information representing images of the avatars of the users U3 and U4, moving-image information representing moving images obtained by filming the bodies of the users U1 and U2, and audio information of the users U1 to U4.
The display control unit 312 uses the conference information that the communication control unit 311A has caused the communication device 34 to receive to display, on the displays 38A and 38B, a three-dimensional image of the virtual space representing the virtual conference room.
FIG. 3 is a diagram showing an example of the virtual space VS displayed on the displays 38A and 38B. A long table T is placed in the virtual space VS, which serves as a virtual conference room. Two chairs C1 and C2 are arranged on either side of the long table T. The avatar A3 corresponding to the user U3 is displayed seated in the chair C1. Similarly, the avatar A4 corresponding to the user U4 is displayed seated in the chair C2. A panel P, on which a live-action image R1 of the user U1 and a live-action image R2 of the user U2 are displayed, is installed on the wall W in front of the avatars A3 and A4. The live-action images R1 and R2 are two-dimensional images serving as virtual objects.
In the virtual conference room shown in FIG. 3, the user U3 can participate in the second web conference by using the corresponding avatar A3, and the user U4 can participate in the second web conference by using the corresponding avatar A4. Furthermore, each of the users U1 and U2 can participate in the second web conference even without subscribing to the service related to the second web conference.
The configuration of the terminal device 30-3 has been described with reference to FIG. 2A and FIG. 3; the terminal device 30-4 has the same configuration as the terminal device 30-3.
FIG. 2B is a block diagram showing a configuration example of the terminal device 30-1. In the following, to simplify the explanation, components common to the terminal device 30-1 and the terminal device 30-3 are denoted by the same reference signs, and their explanation is omitted. The following mainly describes the points in which the terminal device 30-1 differs from the terminal device 30-3.
The terminal device 30-1 includes an input device 33B instead of the input device 33A of the terminal device 30-3. Unlike the terminal device 30-3, the terminal device 30-1 also includes an imaging device 37, for example a camera. The terminal device 30-1 includes a single display 38 instead of the displays 38A and 38B of the terminal device 30-3. The terminal device 30-1 includes a storage device 32B instead of the storage device 32A of the terminal device 30-3; the storage device 32B stores a control program PR3B instead of the control program PR3A stored in the storage device 32A. The terminal device 30-1 also includes a processing device 31B instead of the processing device 31A. The processing device 31B includes a communication control unit 311B instead of the communication control unit 311A of the processing device 31A.
The input device 33B accepts operations from the user U1. For example, the input device 33B includes a keyboard, a touchpad, a touch panel, or a pointing device such as a mouse. When the input device 33B includes a touch panel, it may also serve as the display 38.
The imaging device 37 outputs imaging information representing a captured image obtained by imaging the outside world. The imaging device 37 includes, for example, a lens, an imaging element, an amplifier, and an AD converter. Light collected through the lens is converted by the imaging element into an imaging signal, which is an analog signal. The amplifier amplifies the imaging signal to generate an amplified imaging signal. The AD converter converts the amplified imaging signal, which is an analog signal, into imaging data, which is a digital signal. The imaging information including the imaging data is supplied to the processing device 31B.
The display 38 is a device that displays images. The display 38 displays various images under the control of the processing device 31B. The display 38 displays a display image DP, described below. The display image DP is a two-dimensional image.
The communication control unit 311B causes the communication device 34 to receive, from the first conference management device 10, participation data for participating in the second web conference held in the virtual space. The participation data is originally data that the first conference management device 10 received from the second conference management device 20. The participation data is, for example, text data with a link for participating in the second web conference. The user U1 participates in the second web conference by operating the input device 33B and clicking on the link. The terminal device 30-1 can participate in the first web conference under the control of the first conference management device 10, and can further participate in the second web conference under the control of the second conference management device 20.
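The handling of the participation data described above can be sketched as follows (the URL shape and function names are hypothetical; the embodiment states only that the data is text carrying a join link relayed from the second conference management device through the first):

```python
# Sketch: relaying participation data and extracting the join link.
# The URL format and function names are illustrative assumptions.
import re

def make_participation_data(conference_id):
    """Text that the second conference management device might issue."""
    return f"Join the web conference: https://example.invalid/join/{conference_id}"

def extract_join_link(participation_data):
    """The link the terminal device 30-1 would open when the user clicks it."""
    match = re.search(r"https://\S+", participation_data)
    if match is None:
        raise ValueError("no join link in participation data")
    return match.group(0)

data = make_participation_data("conf-42")   # device 20 -> device 10
link = extract_join_link(data)              # device 10 -> terminal device 30-1
```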
FIG. 4 is a diagram showing an example of the display image DP that the display control unit 312 of the terminal device 30-1 displays on the display 38. The display image DP includes an image of the virtual space VS. The image of the virtual space VS included in the display image DP is an image of the virtual space VS viewed from the direction opposite to the viewing direction that yields the image shown in FIG. 3. Specifically, in the virtual space VS shown in FIG. 4, the fronts of the avatars A3 and A4 are displayed. As a result, the user U1 of the terminal device 30-1 can visually recognize the facial expressions and movements of the avatars A3 and A4. In addition, the live-action image R1 of the user U1 and the live-action image R2 of the user U2 are displayed in the lower right portion of the display image DP. The display image DP is a two-dimensional image.
FIG. 5 is a diagram showing a method of generating the image of the virtual space VS displayed on the terminal device 30-1 and the image of the virtual space VS displayed on the terminal device 30-3. As shown in FIG. 5, in the virtual space VS, a virtual first camera M1 is placed in front of the avatars A3 and A4, while a virtual second camera M2 is placed behind the avatars A3 and A4. The first camera M1 is an example of a "first virtual camera". The second camera M2 is an example of a "second virtual camera".
The image generated by the first camera M1 is displayed on the terminal device 30-1 as the image of the virtual space VS. As shown in FIG. 4, the image generated by the first camera M1 is a two-dimensional image obtained by the first camera M1 imaging the conference room in the virtual space VS.
Likewise, the image generated by the second camera M2 is displayed on the terminal device 30-3 as the image of the virtual space VS. As shown in FIG. 3, the image generated by the second camera M2 is a three-dimensional image obtained by the second camera M2 imaging the conference room in the virtual space VS. Note that the live-action image R1 of the user U1 and the live-action image R2 of the user U2 included in the three-dimensional image shown in FIG. 3 are two-dimensional images.
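The two renderings described with reference to FIG. 3 to FIG. 5 amount to a per-terminal dispatch: the first camera M1 yields one flat image for the terminal devices 30-1 and 30-2, while the second camera M2 yields a stereo pair for the terminal devices 30-3 and 30-4. A minimal sketch of that dispatch (the type labels and the render stub are illustrative assumptions):

```python
# Sketch: choosing which virtual camera's output a terminal receives.
# The "2d"/"3d" labels and the render stub are illustrative assumptions.

def render(camera, eye=None):
    """Stand-in for rendering the virtual space VS from a camera."""
    return f"{camera}:{eye}" if eye else camera

def frame_for_terminal(terminal_type):
    if terminal_type == "2d":           # terminal devices 30-1, 30-2
        return render("M1")             # one flat image from camera M1
    if terminal_type == "3d":           # terminal devices 30-3, 30-4
        return (render("M2", "left"),   # stereo pair from camera M2
                render("M2", "right"))
    raise ValueError(terminal_type)
```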
The configuration of the terminal device 30-1 has been described with reference to FIG. 2B, FIG. 4, and FIG. 5; the terminal device 30-2 has the same configuration as the terminal device 30-1.
1-3: Configuration of the second conference management device
FIG. 6 is a block diagram showing a configuration example of the second conference management device 20. The second conference management device 20 includes a processing device 21, a storage device 22, an input device 23, a communication device 24, and a display 25. The elements of the second conference management device 20 are interconnected by one or more buses for communicating information.
The processing device 21 is a processor that controls the entire second conference management device 20. The processing device 21 includes, for example, one or more chips. The processing device 21 also includes, for example, a central processing unit (CPU: Central Processing Unit) that includes an interface with peripheral devices, an arithmetic unit, and registers. Some or all of the functions of the processing device 21 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array). The processing device 21 executes various processes in parallel or sequentially.
The storage device 22 is a recording medium that can be read from and written to by the processing device 21. The storage device 22 stores a plurality of programs, including a control program PR2 executed by the processing device 21.
The storage device 22 also stores an image information database PD. The image information database PD stores image information for generating an image of the virtual space VS. The image information includes image information representing an image of the virtual conference room in the virtual space VS, image information representing an image of the avatar A3, and image information representing an image of the avatar A4. The image of the virtual conference room includes a three-dimensional image of the long table T, three-dimensional images of the chairs C1 and C2, a three-dimensional image of the wall W, and a three-dimensional image of the panel P. The image information database PD also stores placement-location information indicating where each of these three-dimensional images is placed in the virtual conference room in the virtual space VS shown in FIG. 3. The storage device 22 functions as a work area for the processing device 21.
The input device 23 accepts operations from an administrator of the second conference management device 20. For example, the input device 23 includes a keyboard, a touchpad, a touch panel, or a pointing device such as a mouse.
The communication device 24 is hardware serving as a transmitting and receiving device for communicating with other devices. The communication device 24 is also called, for example, a network device, a network controller, a network card, or a communication module. The communication device 24 may further include a wireless communication interface. Examples of connectors and interface circuits for wired connection include products compliant with wired LAN, IEEE 1394, and USB. Examples of wireless communication interfaces include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
The display 25 is a device that displays images. The display 25 displays various images under the control of the processing device 21.
With the above configuration, the processing device 21 reads the control program PR2 from the storage device 22. By executing the read control program PR2, the processing device 21 functions as a space generation unit 211, an avatar control unit 212, a movement control unit 213, a first image generation unit 214, a first audio acquisition unit 215, a second image generation unit 216, a second audio acquisition unit 217, and a communication control unit 218.
The space generation unit 211 generates a three-dimensional image representing the virtual conference room in the virtual space VS. The three-dimensional image includes a three-dimensional image of the long table T, three-dimensional images of the chairs C1 and C2, a three-dimensional image of the wall W, and a three-dimensional image of the panel P. The space generation unit 211 generates the three-dimensional image representing the virtual conference room using the image information and the placement-location information of these three-dimensional images stored in the image information database PD.
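The relationship between the image information database PD and the space generation unit 211 can be sketched as follows (the record fields and coordinates are illustrative assumptions; the embodiment states only that the database pairs 3D image data with placement-location information):

```python
# Sketch: records the image information database PD might hold, and how the
# space generation unit 211 could assemble the room from them.
# Field names and coordinates are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PlacedObject:
    name: str        # e.g. long table T, chair C1
    asset_id: str    # key of the 3D image data
    position: tuple  # placement-location information (x, y, z)

room_pd = [
    PlacedObject("long table T", "table_3d", (0.0, 0.0, 0.0)),
    PlacedObject("chair C1", "chair_3d", (-0.8, 0.0, 0.5)),
    PlacedObject("chair C2", "chair_3d", (0.8, 0.0, 0.5)),
    PlacedObject("wall W", "wall_3d", (0.0, 0.0, -3.0)),
    PlacedObject("panel P", "panel_3d", (0.0, 1.5, -2.9)),
]

def build_room(records):
    """Space generation unit 211, in miniature: name -> (asset, position)."""
    return {r.name: (r.asset_id, r.position) for r in records}

room = build_room(room_pd)
```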
The avatar control unit 212 displays the avatars A3 and A4 in the virtual conference room in the virtual space VS. FIG. 7 is a functional block diagram of the avatar control unit 212. The avatar control unit 212 includes an avatar generation unit 212-1 and a motion control unit 212-2.
The avatar generation unit 212-1 generates a three-dimensional image of the avatar A3 using the image information corresponding to the avatar A3 stored in the image information database PD. Similarly, the avatar generation unit 212-1 generates a three-dimensional image of the avatar A4 using the image information corresponding to the avatar A4 stored in the image information database PD. The avatar generation unit 212-1 further displays the three-dimensional images of the avatars A3 and A4 in the virtual space VS.
The motion control unit 212-2 controls the motions of the avatars A3 and A4. As an example, the motion control unit 212-2 controls the motion of the avatar A3 based on an operation of the user U3 on the terminal device 30-3. Similarly, the motion control unit 212-2 controls the motion of the avatar A4 based on an operation of the user U4 on the terminal device 30-4.
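The mapping just described, in which each user's operation drives only that user's avatar, can be sketched as follows (the operation names and pose states are illustrative assumptions):

```python
# Sketch: routing user operations to avatar motions.
# The operation names and pose states are illustrative assumptions.

class Avatar:
    def __init__(self, name):
        self.name = name
        self.pose = "sitting"   # avatars A3 and A4 start seated (FIG. 3)

    def apply(self, operation):
        if operation == "raise_hand":
            self.pose = "hand_raised"
        elif operation == "sit":
            self.pose = "sitting"

# Motion control unit 212-2, in miniature: user id -> that user's avatar.
avatars = {"U3": Avatar("A3"), "U4": Avatar("A4")}

def on_operation(user_id, operation):
    avatars[user_id].apply(operation)

on_operation("U3", "raise_hand")   # U3's operation moves only avatar A3
```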
The movement control unit 213 moves at least one of the first camera M1 and the second camera M2 in the virtual space VS. More specifically, the movement control unit 213 controls the placement locations of the first camera M1 and the second camera M2 in the virtual conference room in the virtual space VS, as well as the movement direction and movement speed of each camera. The movement control unit 213 also controls the imaging directions of the first camera M1 and the second camera M2. Furthermore, the movement control unit 213 controls the zoom function of the first camera M1 and the second camera M2, that is, the magnification of the captured image.
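The state managed by the movement control unit 213 for each virtual camera (placement location, imaging direction, movement speed, and zoom) can be sketched as follows (the field names and the update rule are illustrative assumptions):

```python
# Sketch: per-camera state the movement control unit 213 manages.
# Field names and the update rule are illustrative assumptions.

class VirtualCamera:
    def __init__(self, position, direction):
        self.position = list(position)  # placement location in the room
        self.direction = direction      # imaging direction (unit vector)
        self.speed = 0.0                # movement speed (units per step)
        self.zoom = 1.0                 # magnification of the captured image

    def step(self):
        """Advance along the imaging direction at the current speed."""
        self.position = [p + self.speed * d
                         for p, d in zip(self.position, self.direction)]

m1 = VirtualCamera(position=(0.0, 1.2, 2.0), direction=(0.0, 0.0, -1.0))
m1.speed = 0.5
m1.step()       # camera M1 moves 0.5 units toward the avatars
m1.zoom = 2.0   # double the magnification of the captured image
```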
The first image generation unit 214 generates image information representing a two-dimensional image obtained by the first camera M1 imaging the conference room in the virtual space VS. The first image generation unit 214 is an example of a "first generation unit", and the image information it generates is an example of "first image information". As described above, the display image DP is displayed on the display 38 of each of the terminal devices 30-1 and 30-2. In other words, the first image generation unit 214 generates the image information representing the two-dimensional image of the virtual space VS included in the display image DP. As described above, this image information represents the two-dimensional image obtained by the first camera M1 imaging the virtual space VS. Therefore, this image information, as the "first image information", does not include the live-action image R1 of the user U1 and the live-action image R2 of the user U2 shown in FIG. 3.
The first audio acquisition unit 215 acquires audio in the first web conference held in real space. The "audio in the first web conference held in real space" is an example of "first audio". The first audio acquisition unit 215 is an example of a "first acquisition unit".
When the user U1 participates in the first web conference using the terminal device 30-1, the sound pickup device 35 of the terminal device 30-1 picks up the voice of the user U1 and supplies it to the processing device 31B. The processing device 31B transmits audio information representing the voice of the user U1 to the first conference management device 10 via the communication device 34. Similarly, the processing device 31B of the terminal device 30-2 transmits audio information representing the voice of the user U2 to the first conference management device 10 via the communication device 34. As described below, the first conference management device 10 transmits the audio information representing the voice of the user U1 and the audio information representing the voice of the user U2 to the second conference management device 20 via the communication network NET. The first audio acquisition unit 215 acquires, via the communication device 24, the audio information representing the voice of the user U1 and the audio information representing the voice of the user U2.
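The relay path just described, from a terminal device through the first conference management device 10 and the communication network NET to the second conference management device 20, can be traced with the following sketch (the packet layout and function name are illustrative assumptions):

```python
# Sketch: the relay path of one first-web-conference utterance.
# The hop labels follow the description; the packet layout is an assumption.

def relay_first_conference_audio(utterance, speaker):
    """Trace one utterance from a terminal to the first audio acquisition unit 215."""
    hops = [
        "terminal",                             # 30-1 or 30-2 picks up the voice
        "first_conference_management_device",   # device 10 receives it
        "NET",                                  # relayed over the communication network
        "second_conference_management_device",  # device 20, where unit 215 acquires it
    ]
    return {"speaker": speaker, "audio": utterance, "hops": hops}

packet = relay_first_conference_audio("hello", "U1")
```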
The second image generating unit 216 generates image information representing a three-dimensional image obtained by the second camera M2 capturing the conference room in the virtual space VS. The second image generating unit 216 is an example of a "second generating unit," and the image information it generates is an example of "second image information."
 As shown in FIG. 3, this three-dimensional image includes the live-action image R1 of the user U1 and the live-action image R2 of the user U2. That is, the image information serving as the "second image information" includes the live-action images R1 and R2 shown in FIG. 3. The live-action images R1 and R2 are examples of a "virtual object representing a two-dimensional image related to the first web conference."
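Purely for illustration, the contrast with the first image information can be sketched in a similar hypothetical scene representation: the live-action images R1 and R2 are first placed in the virtual space VS as flat virtual objects, so the three-dimensional image rendered by the second camera M2 does include them. All names below are illustrative assumptions, not terms of the embodiment:

```python
# Hypothetical sketch of the second image generating unit 216: the
# live-action images R1 and R2 received from the terminal devices are
# placed in the virtual space VS as flat virtual objects (e.g., anchored
# to the panel P) before the second camera M2 renders the 3D scene, so
# the "second image information" does include them.

def place_live_images(scene_objects, live_image_ids):
    """Return a new scene with the live-action images added as virtual objects."""
    billboards = [
        {"id": img_id, "is_live_action": True, "anchor": "panel_P"}
        for img_id in live_image_ids
    ]
    return scene_objects + billboards

room = [{"id": "long_table_T", "is_live_action": False}]
scene_for_m2 = place_live_images(room, ["live_image_R1", "live_image_R2"])

# Everything in the augmented scene appears in the second image information.
second_image = [obj["id"] for obj in scene_for_m2]
```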
As described above, the three-dimensional image showing the virtual space VS illustrated in FIG. 3 is displayed on the displays 38A and 38B of the terminal devices 30-3 and 30-4. The second image generating unit 216 generates image information representing this three-dimensional image.
 The second audio acquisition unit 217 acquires the audio of the second web conference held in the virtual space VS. The "audio of the second web conference held in the virtual space VS" is an example of "second audio," and the second audio acquisition unit 217 is an example of a "second acquisition unit."
 When the user U3 participates in the second web conference using the terminal device 30-3, the sound pickup device 35 of the terminal device 30-3 picks up the voice of the user U3 and supplies audio information representing the picked-up voice to the processing device 31A. The processing device 31A transmits the audio information representing the voice of the user U3 to the second conference management device 20 via the communication device 34. Likewise, the processing device 31A of the terminal device 30-4 transmits audio information representing the voice of the user U4 to the second conference management device 20 via the communication device 34. The second audio acquisition unit 217 acquires both pieces of audio information via the communication device 24.
The communication control unit 218 causes the communication device 24 to transmit the first image information generated by the first image generating unit 214 to the first conference management device 10 serving as an external device. The communication control unit 218 also causes the communication device 24 to receive, from the terminal device 30-1, first imaging information representing the live-action image R1, and to receive, from the terminal device 30-2, second imaging information representing the live-action image R2. Furthermore, the communication control unit 218 causes the communication device 24 to transmit the second image information generated by the second image generating unit 216 to the terminal devices 30-3 and 30-4, to transmit first audio information representing the first audio to the terminal devices 30-3 and 30-4, and to transmit second audio information representing the second audio to the first conference management device 10 serving as an external device. The communication control unit 218 is an example of a "communication unit."
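Purely as a reading aid (not part of the claimed embodiment), the routing performed by the communication control unit 218 via the communication device 24 can be summarized as a fixed mapping from each kind of information to its peer devices; the string labels below are hypothetical shorthand:

```python
# Hypothetical routing table for the communication control unit 218 of
# the second conference management device 20: each kind of information
# is transmitted to, or received from, a fixed set of peers via the
# communication device 24.

ROUTING_218 = {
    # (information, direction): peer device(s)
    ("first_image_info_PI1", "send"): ["first_conference_mgmt_10"],
    ("first_imaging_info_RI1", "receive"): ["terminal_30-1"],
    ("second_imaging_info_RI2", "receive"): ["terminal_30-2"],
    ("second_image_info_PI2", "send"): ["terminal_30-3", "terminal_30-4"],
    ("first_audio_info_SI1", "send"): ["terminal_30-3", "terminal_30-4"],
    ("second_audio_info_SI2", "send"): ["first_conference_mgmt_10"],
}

def peers(info, direction="send"):
    """Look up the peer devices for one kind of information."""
    return ROUTING_218.get((info, direction), [])
```

The table makes the asymmetry visible: image and audio originating in the virtual space VS flow toward the first conference management device 10, while material originating in the real space flows toward the terminal devices 30-3 and 30-4.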
1-4: Configuration of the first conference management device
 FIG. 8 is a block diagram showing a configuration example of the first conference management device 10. The first conference management device 10 includes a processing device 11, a storage device 12, an input device 13, a communication device 14, and a display 15. The elements of the first conference management device 10 are connected to one another by one or more buses for communicating information.
The processing device 11 is a processor that controls the entire first conference management device 10 and includes, for example, one or more chips. The processing device 11 includes, for example, a central processing unit (CPU) having an interface with peripheral devices, an arithmetic unit, and registers. Some or all of the functions of the processing device 11 may be realized by hardware such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array). The processing device 11 executes various processes in parallel or sequentially.
The storage device 12 is a recording medium that can be read from and written to by the processing device 11, and stores a plurality of programs including the control program PR1 executed by the processing device 11.
 The storage device 12 also stores application data AD. The application data AD is data needed by the terminal devices 30-1 and 30-2 to use an application for conducting a web conference, such as ZOOM or Teams.
 The storage device 12 also functions as a work area for the processing device 11.
The input device 13 accepts operations from an administrator of the first conference management device 10. For example, the input device 13 includes a keyboard, a touchpad, a touch panel, or a pointing device such as a mouse. When the input device 13 includes a touch panel, it may also serve as the display 15.
 The communication device 14 is hardware serving as a transmitting and receiving device for communicating with other devices, and is also called, for example, a network device, a network controller, a network card, or a communication module. The communication device 14 may include a wireless communication interface. Examples of connectors and interface circuits for wired connection include products compliant with wired LAN, IEEE 1394, and USB; examples of wireless communication interfaces include products compliant with wireless LAN, Bluetooth (registered trademark), and the like.
 The display 15 is a device that displays images. The display 15 displays various images under the control of the processing device 11.
 In the above configuration, the processing device 11 reads the control program PR1 from the storage device 12 and, by executing the read control program PR1, functions as a communication control unit 111.
The communication control unit 111 causes the communication device 14 to receive, from the second conference management device 20, participation data for participating in the second web conference held in the virtual space, and to transmit the participation data received from the second conference management device 20 to the terminal devices 30-1 and 30-2.
 The communication control unit 111 also causes the communication device 14 to receive the first image information from the second conference management device 20, and to transmit the first image information received from the second conference management device 20 to the terminal devices 30-1 and 30-2. The communication control unit 111 also causes the communication device 14 to receive the first audio information from the terminal devices 30-1 and 30-2, and to transmit the first audio information received from the terminal devices 30-1 and 30-2 to the second conference management device 20. Furthermore, the communication control unit 111 causes the communication device 14 to receive the second audio information from the second conference management device 20, and to transmit the second audio information received from the second conference management device 20 to the terminal devices 30-1 and 30-2.
2: Operation of the embodiment
 FIG. 9 is a sequence diagram showing the operation of the conference system 1 according to the embodiment.
In step S1, the processing device 21 of the second conference management device 20 functions as the space generation unit 211 and generates a three-dimensional image showing the virtual space VS. The virtual space VS includes a virtual conference room. The three-dimensional image showing the virtual conference room includes a three-dimensional image showing the long table T, three-dimensional images showing the chairs C1 and C2, a three-dimensional image showing the wall W, and a three-dimensional image showing the panel P.
 In step S2, the processing device 21 of the second conference management device 20 functions as the avatar control unit 212 and generates three-dimensional images showing the avatars A3 and A4.
 In step S3, the processing device 21 of the second conference management device 20 functions as the first image generating unit 214 and generates the first image information PI1.
 In step S4, the processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to transmit the first image information PI1 to the first conference management device 10. The processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to receive the first image information PI1 from the second conference management device 20.
 In step S5, the processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to transmit the first image information PI1 to the terminal device 30-1. The processing device 31B of the terminal device 30-1 functions as the communication control unit 311B and causes the communication device 34 to receive the first image information PI1 from the first conference management device 10.
 In step S6, the processing device 31B of the terminal device 30-1 functions as the communication control unit 311B and causes the communication device 34 to transmit the first imaging information RI1 to the first conference management device 10. The processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to receive the first imaging information RI1 from the terminal device 30-1.
 In step S7, the processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to transmit the first imaging information RI1 to the second conference management device 20. The processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to receive the first imaging information RI1 from the first conference management device 10.
 In step S8, the processing device 21 of the second conference management device 20 functions as the second image generating unit 216 and generates the second image information PI2.
 In step S9, the processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to transmit the second image information PI2 to the terminal device 30-3. The processing device 31A of the terminal device 30-3 functions as the communication control unit 311A and causes the communication device 34 to receive the second image information PI2 from the second conference management device 20.
 In step S10, the processing device 31B of the terminal device 30-1 functions as the communication control unit 311B and causes the communication device 34 to transmit first audio information SI1 representing the first audio to the first conference management device 10. The processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to receive the first audio information SI1 from the terminal device 30-1.
 In step S11, the processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to transmit the first audio information SI1 to the second conference management device 20. The processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to receive the first audio information SI1 from the first conference management device 10.
 In step S12, the processing device 31A of the terminal device 30-3 functions as the communication control unit 311A and causes the communication device 34 to transmit second audio information SI2 representing the second audio to the second conference management device 20. The processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to receive the second audio information SI2 from the terminal device 30-3.
 In step S13, the processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to transmit the first audio information SI1 to the terminal device 30-3. The processing device 31A of the terminal device 30-3 functions as the communication control unit 311A and causes the communication device 34 to receive the first audio information SI1 from the second conference management device 20.
 In step S14, the processing device 21 of the second conference management device 20 functions as the communication control unit 218 and causes the communication device 24 to transmit the second audio information SI2 to the first conference management device 10. The processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to receive the second audio information SI2 from the second conference management device 20.
 In step S15, the processing device 11 of the first conference management device 10 functions as the communication control unit 111 and causes the communication device 14 to transmit the second audio information SI2 to the terminal device 30-1. The processing device 31B of the terminal device 30-1 functions as the communication control unit 311B and causes the communication device 34 to receive the second audio information SI2 from the first conference management device 10.
 The terminal device 30-2 performs the same operations as the terminal device 30-1, and the terminal device 30-4 performs the same operations as the terminal device 30-3. The order of steps S1 to S15 above may be changed as appropriate.
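Purely as a reading aid, the relay sequence of FIG. 9 (restricted to the terminal devices 30-1 and 30-3, per the note above) can be condensed into an ordered trace; the actor labels below are hypothetical shorthand, not terms of the embodiment:

```python
# Hypothetical condensed trace of the sequence in FIG. 9 (steps S1-S15),
# restricted to the terminal devices 30-1 and 30-3. Each entry is
# (step, actor, action); "mgmt_10" and "mgmt_20" stand for the first and
# second conference management devices.

SEQUENCE = [
    ("S1", "mgmt_20", "generate virtual space VS"),
    ("S2", "mgmt_20", "generate avatars A3 and A4"),
    ("S3", "mgmt_20", "generate first image information PI1"),
    ("S4", "mgmt_20", "send PI1 to mgmt_10"),
    ("S5", "mgmt_10", "send PI1 to terminal_30-1"),
    ("S6", "terminal_30-1", "send first imaging information RI1 to mgmt_10"),
    ("S7", "mgmt_10", "send RI1 to mgmt_20"),
    ("S8", "mgmt_20", "generate second image information PI2"),
    ("S9", "mgmt_20", "send PI2 to terminal_30-3"),
    ("S10", "terminal_30-1", "send first audio information SI1 to mgmt_10"),
    ("S11", "mgmt_10", "send SI1 to mgmt_20"),
    ("S12", "terminal_30-3", "send second audio information SI2 to mgmt_20"),
    ("S13", "mgmt_20", "send SI1 to terminal_30-3"),
    ("S14", "mgmt_20", "send SI2 to mgmt_10"),
    ("S15", "mgmt_10", "send SI2 to terminal_30-1"),
]

def actions_of(actor):
    """List the step numbers performed by one actor, in order."""
    return [step for step, who, _ in SEQUENCE if who == actor]
```

The trace makes the relay role of the first conference management device 10 explicit: it only forwards information between the second conference management device 20 and the real-space terminal device.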
3: Effects of the embodiment
 As described above, the second conference management device 20 communicates with the first conference management device 10, an external device that manages the first web conference held in the real space, and manages the second web conference held in the virtual space VS. The second conference management device 20 includes the first image generating unit 214 as a first generating unit, the second image generating unit 216 as a second generating unit, and the communication control unit 218 as a communication unit. The first image generating unit 214 generates first image information PI1 representing a two-dimensional image obtained by the first camera M1, as a first virtual camera, capturing the conference room in the virtual space VS. The second image generating unit 216 generates second image information PI2 representing a three-dimensional image obtained by the second camera M2, as a second virtual camera, capturing the conference room in the virtual space VS in which a virtual object representing a two-dimensional image related to the first web conference, output from the first conference management device 10 serving as an external device, is arranged. The communication control unit 218 transmits the first image information PI1 to the first conference management device 10 serving as an external device, and transmits the second image information PI2 to the one or more terminal devices 30-3 and 30-4 participating in the second web conference.
With the above configuration, the second conference management device 20 allows users of the web conference in the real space and users of the web conference in the virtual space VS to participate in a web conference together. For example, when users of the web conference in the virtual space VS participate using avatars, a user who had not yet generated an avatar conventionally could not participate in that web conference. With the second conference management device 20 according to the embodiment, a user who has not yet generated an avatar can, as a user of the web conference in the real space, participate in the web conference in the virtual space VS in which avatars are used.
Furthermore, as described above, the second conference management device 20 includes the first audio acquisition unit 215 as a first acquisition unit and the second audio acquisition unit 217 as a second acquisition unit. The first audio acquisition unit 215 acquires the first audio of the first web conference, and the second audio acquisition unit 217 acquires the second audio of the second web conference. The communication control unit 218 transmits the first audio to the one or more terminal devices 30-3 and 30-4, and transmits the second audio to the first conference management device 10 serving as an external device.
 With the above configuration, the second conference management device 20 allows users of the web conference in the real space and users of the web conference in the virtual space VS to converse by voice.
Furthermore, as described above, the second conference management device 20 includes the movement control unit 213. The movement control unit 213 moves, in the virtual space VS, at least one of the first camera M1 as the first virtual camera and the second camera M2 as the second virtual camera.
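As an illustrative sketch only, moving a virtual camera amounts to updating its pose (position and orientation) in the virtual space VS, which changes the viewpoint of the images generated afterwards; the `Camera` structure and its fields are hypothetical, not part of the claimed embodiment:

```python
# Hypothetical sketch of the movement control unit 213: each virtual
# camera (M1, M2) carries a pose in the virtual space VS, and "moving"
# a camera replaces that pose, so subsequent renders by the first or
# second image generating unit use the new viewpoint and angle.

from dataclasses import dataclass

@dataclass(frozen=True)
class Camera:
    name: str
    position: tuple  # (x, y, z) in virtual-space coordinates
    yaw_deg: float   # horizontal viewing angle, in degrees

def move_camera(camera, new_position, new_yaw_deg):
    """Return a camera with an updated pose (the original is unchanged)."""
    return Camera(camera.name, new_position, new_yaw_deg)

m1 = Camera("M1", position=(0.0, 1.5, 3.0), yaw_deg=180.0)
m1_moved = move_camera(m1, (2.0, 1.5, 3.0), 150.0)
```

Keeping the original camera immutable and returning a new pose is only one design choice; an in-place update would serve equally well for the effect described above.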
With the above configuration, images of the web conference from an arbitrary viewpoint and an arbitrary angle are displayed on the terminal device 30-1 used by the user U1 of the web conference in the real space and on the terminal device 30-2 used by the user U2 of the web conference in the real space. Similarly, images of the web conference from an arbitrary viewpoint and an arbitrary angle are displayed on the terminal device 30-3 used by the user U3 of the web conference in the virtual space VS and on the terminal device 30-4 used by the user U4 of the web conference in the virtual space VS.
 Furthermore, as described above, the second conference management device 20 includes the avatar control unit 212. The avatar control unit 212 displays, in the conference room in the virtual space VS, one or more avatars A3 and A4 corresponding one-to-one with the users U3 and U4 of the one or more terminal devices 30-3 and 30-4.
 With the above configuration, the user U3 of the web conference in the virtual space VS can participate in the web conference using the avatar A3 corresponding to the user U3. Similarly, the user U4 of the web conference in the virtual space VS can participate in the web conference using the avatar A4 corresponding to the user U4.
4: Modifications
 The present disclosure is not limited to the embodiment exemplified above. Specific modifications are exemplified below. Two or more aspects arbitrarily selected from the following examples may be combined.
4-1: Modification 1
 In the above embodiment, the control program PR2 is stored in the storage device 22 of the second conference management device 20, but the control program PR2 may be manufactured or sold on its own. The control program PR2 may be provided to a purchaser, for example, by distributing a computer-readable recording medium, such as a flash ROM, on which the control program PR2 is written, or by distributing the control program PR2 through download via a telecommunication line.
4-2: Modification 2
 In the above embodiment, the space generation unit 211, the avatar control unit 212, the movement control unit 213, the first image generating unit 214, the first audio acquisition unit 215, the second image generating unit 216, the second audio acquisition unit 217, and the communication control unit 218 are all software modules. However, any one, any two, or all of these units may be hardware modules, such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), or an FPGA (Field Programmable Gate Array). Even when any one, any two, or all of these units are hardware modules, the same effects as in the above embodiment are achieved.
5: Others
 (1) In the above embodiment, the storage device 12, the storage device 22, the storage device 32A, and the storage device 32B are exemplified by a ROM, a RAM, and the like, but each may be a flexible disk, a magneto-optical disk (e.g., a compact disc, a digital versatile disc, or a Blu-ray (registered trademark) disc), a smart card, a flash memory device (e.g., a card, a stick, or a key drive), a CD-ROM (Compact Disc-ROM), a register, a removable disk, a hard disk, a floppy (registered trademark) disk, a magnetic strip, a database, a server, or any other suitable storage medium.
(2)上述した実施形態において、説明した情報、信号などは、様々な異なる技術の何れかを使用して表されてもよい。例えば、上記の説明全体に渡って言及され得るデータ、命令、コマンド、情報、信号、ビット、シンボル、チップなどは、電圧、電流、電磁波、磁界若しくは磁性粒子、光場若しくは光子、又はこれらの任意の組み合わせによって表されてもよい。 (2) In the above-described embodiments, the information, signals, etc. described may be represented using any of a variety of different technologies. For example, data, instructions, commands, information, signals, bits, symbols, chips, etc. that may be referred to throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or magnetic particles, optical fields or photons, or any combination thereof.
(3)上述した実施形態において、入出力された情報等は特定の場所(例えば、メモリ)に保存されてもよいし、管理テーブルを用いて管理してもよい。入出力される情報等は、上書き、更新、又は追記され得る。出力された情報等は削除されてもよい。入力された情報等は他の装置へ送信されてもよい。 (3) In the above-described embodiment, the input and output information, etc. may be stored in a specific location (e.g., memory) or may be managed using a management table. The input and output information, etc. may be overwritten, updated, or added to. The output information, etc. may be deleted. The input information, etc. may be transmitted to another device.
(4) In the above-described embodiment, a determination may be made based on a value represented by one bit (0 or 1), on a Boolean value (true or false), or on a comparison of numerical values (e.g., a comparison with a predetermined value).
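The three determination styles above (a one-bit value, a Boolean value, and a numerical comparison) are interchangeable in practice. A minimal Python sketch illustrating the equivalence; all function names and the threshold value are illustrative, not taken from the specification:

```python
THRESHOLD = 0.5  # illustrative "predetermined value" for the comparison case

def decide_by_bit(flag_bit: int) -> bool:
    # Determination from a value represented by one bit (0 or 1).
    return flag_bit == 1

def decide_by_boolean(flag: bool) -> bool:
    # Determination from a Boolean value (true or false).
    return flag

def decide_by_comparison(value: float) -> bool:
    # Determination by comparing a numerical value with a predetermined value.
    return value > THRESHOLD

# All three representations express the same yes/no determination:
assert decide_by_bit(1) == decide_by_boolean(True) == decide_by_comparison(0.9)
```

Which representation is used is an implementation detail; the embodiment treats them as equivalent ways of expressing one determination.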
(5) The order of the processing procedures, sequences, flowcharts, etc. illustrated in the above-described embodiments may be rearranged as long as no contradiction arises. For example, the methods described in this disclosure present the elements of the various steps in an example order and are not limited to the specific order presented.
(6) Each function illustrated in Figures 1 to 9 is realized by any combination of hardware and/or software. The method of realizing each functional block is not particularly limited: each functional block may be realized using one physically or logically coupled device, or using two or more physically or logically separate devices connected directly or indirectly (e.g., by wire or wirelessly). A functional block may also be realized by combining software with the one device or the multiple devices described above.
(7) The programs exemplified in the above-described embodiments should be broadly construed to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executable files, threads of execution, procedures, functions, etc., regardless of whether they are called software, firmware, middleware, microcode, hardware description language, or by other names.
Software, instructions, information, etc. may also be transmitted and received via a transmission medium. For example, if software is transmitted from a website, server, or other remote source using wired technology (coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), etc.) and/or wireless technology (infrared, microwave, etc.), then that wired and/or wireless technology is included within the definition of a transmission medium.
(8) In each of the above embodiments, the terms "system" and "network" are used interchangeably.
(9) The information, parameters, etc. described in this disclosure may be expressed using absolute values, using relative values from a predetermined value, or using other corresponding information.
(10) In the above-described embodiment, the first conference management device 10, the second conference management device 20, and the terminal devices 30-1 to 30-4 may be mobile stations (MS: Mobile Station). A mobile station may also be referred to by those skilled in the art as a subscriber station, mobile unit, subscriber unit, wireless unit, remote unit, mobile device, wireless device, wireless communication device, remote device, mobile subscriber station, access terminal, mobile terminal, wireless terminal, remote terminal, handset, user agent, mobile client, client, or some other suitable term. In this disclosure, the terms "mobile station," "user terminal," "user equipment (UE)," and "terminal" may be used interchangeably.
(11) In the above-described embodiments, the terms "connected" and "coupled," and any variations thereof, mean any direct or indirect connection or coupling between two or more elements, and may include the presence of one or more intermediate elements between two elements that are "connected" or "coupled" to each other. The coupling or connection between elements may be physical, logical, or a combination thereof. For example, "connected" may be read as "accessed." As used in this disclosure, two elements may be considered to be "connected" or "coupled" to each other using one or more electrical wires, cables, and/or printed electrical connections, and, as some non-limiting and non-exhaustive examples, using electromagnetic energy having wavelengths in the radio frequency region, the microwave region, and the optical (both visible and invisible) region.
(12) In the above-described embodiments, the phrase "based on" does not mean "based only on" unless otherwise specified. In other words, "based on" means both "based only on" and "based at least on."
(13) The terms "judging" and "determining" as used in this disclosure may encompass a wide variety of actions. "Judging" and "determining" may include, for example, regarding the act of judging, calculating, computing, processing, deriving, investigating, looking up, searching, or inquiring (e.g., searching a table, database, or other data structure), or of ascertaining, as having "judged" or "determined." They may also include regarding the act of receiving (e.g., receiving information), transmitting (e.g., transmitting information), inputting, outputting, or accessing (e.g., accessing data in a memory) as having "judged" or "determined," and regarding the act of resolving, selecting, choosing, establishing, comparing, etc. as having "judged" or "determined." In other words, "judging" and "determining" may include regarding any action as having been "judged" or "determined." "Judging (determining)" may also be read as "assuming," "expecting," "considering," etc.
(14) In the above-described embodiments, when the terms "include" and "including" and variations thereof are used, these terms, like the term "comprising," are intended to be inclusive. Furthermore, the term "or" as used in this disclosure is not intended to be an exclusive or.
(15) In this disclosure, where articles such as a, an, and the in English have been added by translation, this disclosure may include the case where a noun following such an article is plural.
(16) In this disclosure, the phrase "A and B are different" may mean "A and B are different from each other." The phrase may also mean "A and B are each different from C." Terms such as "separated" and "coupled" may be interpreted in the same way as "different."
(17) Each aspect/embodiment described in this disclosure may be used alone, may be used in combination, or may be switched between during execution. In addition, notification of predetermined information (e.g., notification of "being X") is not limited to explicit notification and may be performed implicitly (e.g., by not notifying the predetermined information).
Although the present disclosure has been described in detail above, it is clear to those skilled in the art that the present disclosure is not limited to the embodiments described herein. The present disclosure can be implemented in modified and altered forms without departing from the spirit and scope of the present disclosure as defined by the claims. Therefore, the description of the present disclosure is intended as an illustrative example and has no limiting meaning with respect to the present disclosure.
1...conference system, 10...first conference management device, 11...processing device, 12...storage device, 13...input device, 14...communication device, 15...display, 20...second conference management device, 21...processing device, 22...storage device, 23...input device, 24...communication device, 25...display, 30...terminal device, 31A, 31B...processing device, 32A, 32B...storage device, 33...input device, 34...communication device, 35...sound collection device, 36...speaker, 37...imaging device, 38, 38A, 38B...display, 111...communication control unit, 211...space generation unit, 212...avatar control unit, 212-1...avatar generation unit, 212-2...motion control unit, 213...movement control unit, 214...first image generation unit, 215...first voice acquisition unit, 216...second image generation unit, 217...second voice acquisition unit, 218...communication control unit, 311...communication control unit, 312...display control unit, A3, A4...avatar, C1, C2...chair, M1...first camera, M2...second camera, PI1...first image information, PI2...second image information, PR1, PR2, PR3...control program, R1, R2...real image, RI1...first imaging information, SI1...first voice information, SI2...second voice information, U1, U2, U3, U4...user

Claims (4)

  1.  A management device that communicates with an external device that manages a first web conference held in a real space, and that manages a second web conference held in a virtual space, the management device comprising:
      a first generation unit configured to generate first image information indicating a two-dimensional image obtained by a first virtual camera capturing an image of a conference room in the virtual space;
      a second generation unit configured to generate second image information indicating a three-dimensional image obtained by a second virtual camera capturing an image of the conference room in the virtual space; and
      a communication unit configured to transmit the first image information to the external device and to transmit the second image information to one or more terminal devices participating in the second web conference,
      wherein a virtual object showing a two-dimensional image related to the first web conference, output from the external device, is placed in the conference room in the virtual space.
  2.  The management device according to claim 1, further comprising:
      a first acquisition unit configured to acquire first voice in the first web conference; and
      a second acquisition unit configured to acquire second voice in the second web conference,
      wherein the communication unit transmits the first voice to the one or more terminal devices and transmits the second voice to the external device.
  3.  The management device according to claim 1, further comprising a movement control unit configured to move, in the virtual space, the first virtual camera, the second virtual camera, or both the first virtual camera and the second virtual camera.
  4.  The management device according to claim 1, further comprising an avatar control unit configured to display, in the conference room in the virtual space, one or more avatars corresponding one-to-one to users of the one or more terminal devices.
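As a non-authoritative illustration of the architecture recited in claims 1 and 2, the management device can be sketched as an object that renders two views of the same virtual conference room and relays images and voice between the external device and the terminal devices. All class names, method names, and string payloads below are hypothetical; the specification does not prescribe any particular implementation:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VirtualCamera:
    # Position of the camera in the virtual conference room (claim 3 allows moving it).
    position: tuple = (0.0, 0.0, 0.0)

    def move(self, position: tuple) -> None:
        self.position = position

@dataclass
class ManagementDevice:
    # Camera producing the 2D view sent to the external device (first web conference).
    first_camera: VirtualCamera = field(default_factory=VirtualCamera)
    # Camera producing the 3D view sent to terminals joining the second web conference.
    second_camera: VirtualCamera = field(default_factory=VirtualCamera)
    terminal_devices: List[str] = field(default_factory=list)
    # Log of (destination, payload) pairs standing in for actual transmissions.
    sent: List[Tuple[str, str]] = field(default_factory=list)

    def generate_first_image(self) -> str:
        # First generation unit: 2D image of the room from the first virtual camera.
        return f"2D-image@{self.first_camera.position}"

    def generate_second_image(self) -> str:
        # Second generation unit: 3D image of the room from the second virtual camera.
        return f"3D-image@{self.second_camera.position}"

    def relay(self, first_voice: str, second_voice: str) -> None:
        # Communication unit (claims 1 and 2): the first image information goes to the
        # external device; the second image information and the first voice go to every
        # participating terminal device; the second voice goes back to the external device.
        self.sent.append(("external-device", self.generate_first_image()))
        for terminal in self.terminal_devices:
            self.sent.append((terminal, self.generate_second_image()))
        for terminal in self.terminal_devices:
            self.sent.append((terminal, first_voice))
        self.sent.append(("external-device", second_voice))

device = ManagementDevice(terminal_devices=["terminal-30-3", "terminal-30-4"])
device.relay(first_voice="voice-from-first-conference",
             second_voice="voice-from-second-conference")
```

The virtual object of claim 1 (the 2D screen of the first web conference placed inside the room) would appear in both rendered images, so participants in either conference see the other; rendering itself is omitted from this sketch.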
PCT/JP2023/040286 2022-12-01 2023-11-08 Management device WO2024116763A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-193026 2022-12-01
JP2022193026 2022-12-01

Publications (1)

Publication Number Publication Date
WO2024116763A1 true WO2024116763A1 (en) 2024-06-06

Family

ID=91323501

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/040286 WO2024116763A1 (en) 2022-12-01 2023-11-08 Management device

Country Status (1)

Country Link
WO (1) WO2024116763A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170339372A1 (en) * 2014-11-14 2017-11-23 Pcms Holdings, Inc. System and method for 3d telepresence
JP2022089616A (en) * 2020-12-04 2022-06-16 パナソニックIpマネジメント株式会社 Conference system, video conference device, and video processing method



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23897417

Country of ref document: EP

Kind code of ref document: A1