EP4231632A1 - Display system, display method, and carrier medium - Google Patents


Info

Publication number
EP4231632A1
Authority
EP
European Patent Office
Prior art keywords
information
display
application
screen
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23153948.7A
Other languages
German (de)
English (en)
Inventor
Ryutarou Ono
Takahiro Kamekura
Shigeyuki Ishii
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2022188591A (JP2023120142A)
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Publication of EP4231632A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H04N 7/155 Conference systems involving storage of or access to video conference sessions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F 16/78 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/783 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F 16/7844 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using original textual content or text extracted from visual content or transcript of audio data
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/26 Speech to text systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H04N 7/152 Multipoint control units therefor

Definitions

  • Embodiments of the present disclosure relate to a display system, a display method, and a carrier medium.
  • Known teleconference systems transmit images and audio from one site to one or more other sites in real time to allow users at remote sites to conduct a meeting using the images and the audio.
  • Japanese Unexamined Patent Application Publication No. 2021-105688 discloses displaying, based on information obtained by recording an image and sound in 360 degrees, a minutes screen that includes an area for displaying participants in a conference room and an utterance history area for displaying speeches in chronological order for each utterance.
  • In the related art, however, a record of the telecommunication (remote communication) is not displayed that is created based on both the content displayed by an application being executed in the telecommunication (screen information) and image information of the surroundings.
  • Even if information obtained by capturing a conference room is displayed, that information is not displayed together with the screen (window) of the application, such as a teleconference application, used in the telecommunication.
  • Consequently, the related art does not enable a user to view the situation of the telecommunication in which the inside of the conference room is captured in a video.
  • An object of the present disclosure is to display a record of a communication based on the screen information having been displayed by an application executed in the communication and image information of the surroundings of a device.
  • a display system includes a display unit to display, on a display, a record of a communication in a form of a video created based on screen information and surrounding image information.
  • the screen information has been displayed by a teleconference application on a communication terminal participating in the communication and acquired by an information recording application.
  • the surrounding image information has been acquired by a device and represents an image of surroundings around the device.
  • the display unit displays the surrounding image information, talker image information cut out from the surrounding image information, representing a person speaking in the communication, and the screen information.
  • In another aspect, a display method includes displaying, on a display, a record of a communication in a form of a video created based on screen information and surrounding image information.
  • the screen information has been displayed on a communication terminal participating in the communication and acquired by an information recording application.
  • the surrounding image information has been acquired by a device and represents an image of surroundings around the device.
  • the displaying includes displaying the surrounding image information, talker image information cut out from the surrounding image information, representing a person speaking in the communication, and the screen information.
  • a carrier medium carries computer readable codes for controlling a computer system to carry out the method described above.
  • According to one or more aspects of the present disclosure, a record of a communication created based on the screen information having been displayed by the application executed in the communication and image information of surroundings around the device is displayed, so that a user can view a situation of the communication during which scenes in a conference room have been recorded in a video.
  • a record display system and a display method carried out by the record display system will be described below as example embodiments of the present disclosure.
  • FIG. 1 is a diagram illustrating an overview of creation of a record in which a screen display of the application executed in a teleconference is stored together with a panoramic image of the surroundings.
  • a user at a first site 102 uses a teleconference service system 90 to host a teleconference with another user at a second site 101.
  • a record display system 100 includes a meeting device 60 and a communication terminal 10.
  • the meeting device 60 includes an image-capturing device that captures an image of 360-degree surroundings, a microphone, and a speaker.
  • the meeting device 60 processes information of the captured image of the surroundings to obtain a horizontal panoramic image (hereinafter referred to as a panoramic image).
  • the record display system 100 uses the panoramic image and a screen displayed by an application executed on the communication terminal 10, to create a record (meeting minutes).
  • the record display system 100 combines audio received by a teleconference application 42 ( FIG. 2 ) with audio received by the meeting device 60 and includes the resultant audio data in the record.
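The audio-combining step described above can be sketched as follows. This is an illustrative mix of two 16-bit PCM streams; the function name, the summing strategy, and the clipping behavior are assumptions for illustration, not the patent's actual implementation.

```python
# Illustrative sketch only: mixing audio received by the teleconference
# application with audio received by the meeting device into one track.
# Names and the mixing strategy are hypothetical.

def mix_audio(app_samples, device_samples):
    """Mix two 16-bit PCM sample sequences by summing with clipping."""
    n = max(len(app_samples), len(device_samples))
    mixed = []
    for i in range(n):
        a = app_samples[i] if i < len(app_samples) else 0
        b = device_samples[i] if i < len(device_samples) else 0
        s = a + b
        # Clip to the signed 16-bit range to avoid wrap-around distortion.
        mixed.append(max(-32768, min(32767, s)))
    return mixed

print(mix_audio([1000, 20000], [500, 20000]))  # [1500, 32767]
```

The resultant samples would then be encoded and stored as the audio track of the record.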
  • the record display system 100 may display (replay) the record to enable the user to view the record, without creating the record. The overview will be described below.
  • the meeting device 60 transmits the audio data directly to the information processing system 50.
  • the information processing system 50 transmits the text data (text file) to the storage service system 70 to be stored in addition to the video.
  • the text data is a part of the record.
  • the information processing system 50 is capable of charging a user with a fee according to the service used. For example, the fee is calculated based on the amount of the text data, the file size of the video, or the processing time.
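The fee calculation mentioned above could look like the following sketch. The rates and the linear formula are invented for illustration; only the three factors (amount of text data, video file size, processing time) come from the description.

```python
# Hypothetical usage-based charging sketch. Rates are illustrative only.

def calculate_fee(text_chars, video_bytes, processing_seconds,
                  per_char=0.001, per_mb=0.05, per_second=0.01):
    """Charge by amount of text data, video file size, and processing time."""
    video_mb = video_bytes / (1024 * 1024)
    return round(text_chars * per_char
                 + video_mb * per_mb
                 + processing_seconds * per_second, 2)

print(calculate_fee(10000, 100 * 1024 * 1024, 300))  # 18.0
```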
  • the information recording application 41 can replay the video together with the audio data.
  • the information recording application 41 can also display text data corresponding to the video.
  • When the video is replayed, the panoramic image 203 and the talker image 204 are displayed on the left, and the screen 103 of the teleconference application 42 is displayed on the right.
  • the panoramic image that is a surrounding image including the user, the talker image, and the screen of the application displayed in the teleconference, such as the teleconference application 42, are displayed. Therefore, when a participant of the teleconference or a person who is not a participant views the video as the minutes of the teleconference, the panoramic image, the talker image, and the window of the application are displayed on one screen. Thus, scenes of the teleconference are reproduced with a sense of presence.
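A minimal sketch of such a single-screen layout is shown below: the panoramic image and the talker images occupy the left portion and the application screen the right. The coordinates, the split ratio, and the rectangle convention are assumptions; the patent does not specify the layout at this level of detail.

```python
# Illustrative layout computation: (x0, y0, x1, y1) rectangles within
# one composite video frame. The 40/60 split is an assumed value.

def layout(frame_w, frame_h, left_ratio=0.4):
    left_w = int(frame_w * left_ratio)
    return {
        "panorama": (0, 0, left_w, frame_h // 3),        # top-left strip
        "talkers": (0, frame_h // 3, left_w, frame_h),    # below the panorama
        "app_screen": (left_w, 0, frame_w, frame_h),      # right-hand side
    }

print(layout(1920, 1080))
```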
  • The term "application" (app) refers to software developed or used for a specific function or purpose. Types of such applications include native applications and web applications.
  • a web application (a cloud application that provides a cloud service) may operate in cooperation with a native application or a web browser.
  • The term "application being executed" refers to an application in a state from activation to termination of the application.
  • An application is not necessarily active (an application in the foreground) and may operate in the background.
  • the term "device” refers to a device having capabilities of capturing an image of the surroundings of the device and collecting audio from the surroundings.
  • The device may be used while connected to the communication terminal, may be built into the communication terminal, or may be connected to a cloud service instead of being directly connected to the communication terminal.
  • the device is referred to as a "meeting device.”
  • the meeting device captures an image of the surroundings thereof (for example, in an area or space of 180 to 360 degrees in the horizontal direction) to acquire image information and performs predetermined processing on image information of the curved surface acquired by the meeting device.
  • the resultant image information is referred to as the image information of the surroundings acquired by the meeting device.
  • Examples of the predetermined processing include various kinds of processing for generating, from the information of a captured image, the image information of the surroundings. An example is flattening on a curved-surface captured image.
  • Examples of the predetermined processing may further include, in addition to creating a peripheral image, cutting out an image of a talker, and combining the image of the surroundings with the talker image.
  • the image of the surroundings is referred to as "panoramic image.”
  • the panoramic image is an image having an angle of view of 180 degrees to 360 degrees in substantially the horizontal direction.
  • the panoramic image is not necessarily captured by a single meeting device, and may be captured by a combination of a plurality of imaging devices each having an ordinary angle of view.
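The flattening of a curved-surface captured image into a panorama, mentioned above as an example of the predetermined processing, can be sketched with a generic projection mapping. The equidistant fisheye model and all parameter names below are assumptions for illustration; the patent does not disclose the specific projection used.

```python
import math

# For each output pixel of an equirectangular panorama, find the source
# pixel in an equidistant fisheye image whose lens looks along +z.
# This is a generic projection sketch, not the patent's exact algorithm.

def equirect_to_fisheye(lon, lat, cx, cy, radius, fov=math.pi):
    """Map an equirectangular direction (radians) to fisheye pixel coords."""
    # Direction vector on the unit sphere.
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    theta = math.acos(max(-1.0, min(1.0, z)))   # angle off the optical axis
    phi = math.atan2(y, x)                      # azimuth around the axis
    r = radius * theta / (fov / 2)              # equidistant projection
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# The centre of the panorama maps to the lens centre.
print(equirect_to_fisheye(0.0, 0.0, cx=320, cy=320, radius=320))  # (320.0, 320.0)
```

Iterating this mapping over every output pixel (and sampling the fisheye image at the returned coordinates) produces the flattened panorama.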
  • Although the meeting device in use is placed, for example, on a table for grasping scenes of a teleconference at a site or the surroundings, aspects of the present disclosure are also applicable to a device used for surveillance (security, disaster prevention, etc.), watching over (childcare, nursing, etc.), or analyzing scenes of a site (solutions, marketing, etc.).
  • record refers to information recorded by the information recording application 41 and stored in a viewable manner in association with identification information of a certain conference (meeting). Examples of contents of the record are as follows:
  • the record may serve as the minutes of the conference.
  • the minutes are examples of the record.
  • the name of the record may vary depending on the contents of the teleconference or contents carried out at the site.
  • the record may be a record of communication, a record of a scene (situation) at a site, or a record of an event.
  • The record includes files of a plurality of formats, such as a video file (a composite moving image or the like), an audio file, a text data file (obtained by performing speech recognition on audio), a document file, an image file, and a tabular form file.
  • Such files are associated with the identification information of the conference and can be viewed collectively or selectively in time series.
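A hypothetical data model for such a record is sketched below: files of several formats are associated with one conference identifier and can be retrieved in time order, optionally filtered by kind. All class and field names are illustrative, not from the patent.

```python
from datetime import datetime

# Illustrative record model: files keyed to a conference ID, viewable
# collectively or selectively in time series.

class ConferenceRecord:
    def __init__(self, conference_id):
        self.conference_id = conference_id
        self.files = []  # (timestamp, kind, name)

    def add(self, timestamp, kind, name):
        self.files.append((timestamp, kind, name))

    def in_time_series(self, kind=None):
        """Return files sorted by time, optionally filtered by kind."""
        selected = [f for f in self.files if kind is None or f[1] == kind]
        return sorted(selected)

rec = ConferenceRecord("conf-001")
rec.add(datetime(2023, 2, 1, 10, 5), "text", "minutes.txt")
rec.add(datetime(2023, 2, 1, 10, 0), "video", "composite.mp4")
print([name for _, _, name in rec.in_time_series()])
# ['composite.mp4', 'minutes.txt']
```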
  • The term "tenant" refers to a group of users (such as a company, a local government, or an organization that is a part of such a company or local government) that has a contract to receive a service from a service provider. In the present embodiment, creation of the record and conversion into text data are performed on the assumption that the tenant has a contract with the service provider.
  • telecommunication refers to audio-and-video-based communication with a counterpart at a physically remote site, using software and communication terminals.
  • a teleconference is an example of telecommunication.
  • a conference may also be referred to as an assembly, a meeting, an arrangement, a consultation, an application for a contract or the like, a gathering, a meet, a meet-up, a seminar, a workshop, a study meeting, a study session, a training session, or the like.
  • site refers to a place where an activity is performed.
  • a conference room is an example of the site.
  • the conference room is a room set up to be used primarily for a conference.
  • site may also refer to various places such as a home, a reception desk, a store, a warehouse, and an outdoor site, and may refer to any place or space where a communication terminal, a device, or the like is installable.
  • audio refers to an utterance made by a person, a surrounding sound, or the like.
  • audio data refers to data to which the audio is converted. However, in the present embodiment, the audio and the audio data are not strictly distinguished from each other.
  • FIG. 2 illustrates an example of the configuration of the record display system 100.
  • FIG. 2 illustrates one site (the first site 102) among multiple sites at which participants of a teleconference are present.
  • the communication terminal 10 at the first site 102 communicates via a network with the information processing system 50, the storage service system 70, and the teleconference service system 90.
  • the meeting device 60 is placed at the first site 102.
  • the communication terminal 10 is connected via a universal serial bus (USB) cable or the like to the meeting device 60 to communicate therewith.
  • At least the information recording application 41 and the teleconference application 42 operate on the communication terminal 10.
  • the teleconference application 42 can communicate with communication terminals at the second site 101 via the teleconference service system 90 that resides on the network to allow users at the remote sites to participate in the teleconference.
  • the information recording application 41 uses functions of the information processing system 50 and the meeting device 60 to generate the record of the teleconference hosted by the teleconference application 42.
  • the conference is not necessarily held among remote sites. That is, aspects of the present embodiment are applicable to a conference held among the participants present at one site.
  • In this case, audio collected by the meeting device 60 is stored without being combined.
  • the rest of the process performed by the information recording application 41 is the same.
  • the communication terminal 10 includes a built-in (or external) camera having an ordinary angle of view.
  • the camera of the communication terminal 10 captures an image of a front space including a user 107 who operates the communication terminal 10. Images captured by the camera having an ordinary angle of view are not panoramic images.
  • the built-in camera having the ordinary angle of view primarily captures planar images that are not curved like spherical images.
  • the communication terminal 10 includes a microphone built therein (or may include a microphone externally attached thereto). The microphone collects audio from the surroundings, such as from the user 107 operating the communication terminal 10. Thus, the user can participate in a teleconference using the teleconference application 42 as usual without paying attention to the information recording application 41.
  • the information recording application 41 and the meeting device 60 do not affect the teleconference application 42 except for an increase in the processing load of the communication terminal 10.
  • the teleconference application 42 can transmit a panoramic image or a talker image captured by the meeting device 60 to the teleconference service system 90.
  • the information recording application 41 is an application that communicates with the meeting device 60 and records the information, to generate the record of the conference.
  • the meeting device 60 is a device including an imaging device that can capture a panoramic image, a microphone, and a speaker and is used for a conference.
  • the camera of the communication terminal 10 can capture an image of only a limited range of the front space.
  • the meeting device 60 can capture an image of the entire surroundings (not necessarily the entire surroundings) around the meeting device 60.
  • the meeting device 60 can always keep a plurality of participants 106 illustrated in FIG. 2 within the angle of view.
  • the meeting device 60 clips a talker image from a panoramic image and combines audio data received by the meeting device 60 and audio data output by the communication terminal 10 (including audio data received by the teleconference application 42).
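The talker-clipping step can be sketched as a crop window within the panorama. Here we assume (hypothetically) that the talker's horizontal direction is known, for example from the microphone, as an angle in [0, 360); the crop width and the wrap-around handling are likewise assumptions.

```python
# Illustrative sketch of clipping a talker image from a panoramic image.
# The panorama's x axis spans 360 degrees, so one degree corresponds to
# pano_width / 360 pixels.

def talker_crop(direction_deg, pano_width, crop_width=200):
    """Return the (left, right) x-range of a crop centred on the talker."""
    centre_x = int(direction_deg / 360.0 * pano_width)
    left = (centre_x - crop_width // 2) % pano_width   # wrap at the seam
    right = (left + crop_width) % pano_width
    return left, right

print(talker_crop(90.0, 3600))  # (800, 1000)
```

A crop whose `right` value is smaller than `left` straddles the panorama's seam and would be assembled from the two edge strips.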
  • the place where the meeting device 60 is placed is not limited to on a desk or a table, and the meeting device 60 may be disposed at any place of the first site 102. Since the meeting device 60 can capture a spherical image, the meeting device 60 may be disposed on a ceiling, for example.
  • the meeting device 60 may be installed at the second site 101 or any of the sites where the participants are present.
  • The information recording application 41 displays a list of applications operating on the communication terminal 10, stores the above-described record (video), replays the video, and receives editing operations. Further, the information recording application 41 displays a list of teleconferences that have already been held or are to be held in the future. The list of teleconferences is used in information on the record to allow the user to link a teleconference with the record.
  • The teleconference application 42 is an application that establishes a connection to and communicates with other communication terminals at the second site 101, transmits and receives images and audio, and displays the images and outputs the audio, thus allowing the communication terminal 10 to perform telecommunication with the other communication terminals.
  • the teleconference application 42 may be referred to as a telecommunication app, a remote information sharing application, or the like.
  • the information recording application 41 and the teleconference application 42 each may be a web application or a native application.
  • A web application is an application in which a program on a web server cooperates with a program on a web browser or a native application to perform processing, and does not need to be installed on the communication terminal 10.
  • a native application is an application that is installed and used on the communication terminal 10. In the present embodiment, both the information recording application 41 and the teleconference application 42 are described as native applications.
  • the communication terminal 10 may be a general-purpose information processing apparatus having a communication function, such as a personal computer (PC), a smartphone, or a tablet terminal, for example.
  • the communication terminal 10 is, for example, an electronic whiteboard, a game console, a personal digital assistant (PDA), a wearable PC, a car navigation system, an industrial machine, a medical device, or a networked home appliance.
  • the communication terminal 10 may be any apparatus on which at least the information recording application 41 and the teleconference application 42 operate.
  • the information processing system 50 is implemented by one or more information processing apparatuses deployed over a network.
  • the information processing system 50 includes one or more server applications that perform processing in cooperation with the information recording application 41, and provides infrastructure services.
  • the server applications manage, for example, a list of teleconferences, records of teleconferences, and various settings and storage paths.
  • The infrastructure services perform user authentication, contract management, charging processing, and the like.
  • the information processing system 50 may reside in a cloud environment or in an on-premises environment.
  • the information processing system 50 may be implemented by a plurality of server apparatuses or a single information processing apparatus.
  • the server applications and the infrastructure service may be provided by separate information processing apparatuses.
  • each function of the server applications may be provided by an individual information processing apparatus.
  • the information processing system 50 may be integral with a conference management system 9 described later, the storage service system 70, and the speech recognition service system 80.
  • the conference management system 9 is a system that manages information on a conference hosted by a tenant that uses the information processing system 50.
  • the information processing system 50 acquires conference information from the conference management system 9 and manages the conference information in association with the record.
  • the storage service system 70 is a storage on a network and provides a storage service for receiving files and the like to be stored. Examples of the storage service system 70 include MICROSOFT ONEDRIVE, GOOGLE WORKSPACE, and DROPBOX. The storage service system 70 may be on-premises network-attached storage (NAS) or the like.
  • the speech recognition service system 80 provides a service of performing speech recognition on audio data and converting the audio data into text data.
  • the speech recognition service system 80 may be a general-purpose commercial service or part of the functions of the information processing system 50.
  • the service system set for and used as the speech recognition service system 80 may be different for each user, each tenant, or each conference.
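The per-user, per-tenant, or per-conference selection of a speech recognition service described above could be implemented as a most-specific-first lookup, as sketched below. The service names, the settings keys, and the lookup order are all invented for illustration.

```python
# Hypothetical settings lookup: the conference-level setting wins over
# the user-level setting, which wins over the tenant-level setting.

DEFAULT_SERVICE = "built-in"

def select_service(settings, tenant_id, conference_id=None, user_id=None):
    """Pick the most specifically configured speech recognition service."""
    for key in (("conference", conference_id),
                ("user", user_id),
                ("tenant", tenant_id)):
        service = settings.get(key)
        if service:
            return service
    return DEFAULT_SERVICE

settings = {("tenant", "t1"): "service-A", ("conference", "c9"): "service-B"}
print(select_service(settings, "t1", conference_id="c9"))  # service-B
print(select_service(settings, "t1", conference_id="c2"))  # service-A
```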
  • FIG. 3 is a diagram illustrating an example of a hardware configuration of the information processing system 50 and the communication terminal 10 according to the present embodiment.
  • the information processing system 50 and the communication terminal 10 each are implemented by a computer and each include a central processing unit (CPU) 501, a read-only memory (ROM) 502, a random access memory (RAM) 503, a hard disk (HD) 504, a hard disk drive (HDD) controller 505, a display 506, an external device interface (I/F) 508, a network I/F 509, a bus line 510, a keyboard 511, a pointing device 512, an optical drive 514, and a medium I/F 516.
  • the CPU 501 controls entire operations of the information processing system 50 and the communication terminal 10.
  • the ROM 502 stores programs such as an initial program loader (IPL) to boot the CPU 501.
  • the RAM 503 is used as a work area for the CPU 501.
  • the HD 504 stores various kinds of data such as a program.
  • the HDD controller 505 controls reading or writing of various kinds of data from or to the HD 504 under the control of the CPU 501.
  • the display 506 displays various information such as a cursor, a menu, a window, characters, and images.
  • the external device I/F 508 is an interface for connecting various external devices. Examples of the external devices in this case include, but are not limited to, a USB memory and a printer.
  • the network I/F 509 is an interface for performing data communication via a network.
  • the bus line 510 is, for example, an address bus or a data bus for electrically connecting the components such as the CPU 501 illustrated in FIG. 3 with each other.
  • the keyboard 511 is an example of an input device including a plurality of keys for a user to input characters, numerical values, various instructions, and the like.
  • the pointing device 512 is an example of an input device for a user to select or execute various instructions, select an item for processing, or move a cursor being displayed.
  • the optical drive 514 controls reading or writing of various kinds of data from or to an optical recording medium 513 that is an example of a removable recording medium.
  • the optical recording medium 513 may be a compact disc (CD), a digital versatile disc (DVD), a BLU-RAY disc, or the like.
  • the medium I/F 516 controls reading or writing (storing) of data from or to a recording medium 515 such as a flash memory.
  • FIG. 4 is a block diagram illustrating an example of the hardware configuration of the meeting device 60 capable of capturing a 360-degree moving image of the surroundings.
  • the following description is based on the assumption that the meeting device 60 uses an imaging element to capture a 360-degree moving image of the surroundings of the device at a predetermined height.
  • the number of imaging elements may be one or two or more.
  • the meeting device 60 is not necessarily a dedicated device and may be a PC, a digital camera, a smartphone, or the like to which an imaging unit for a 360-degree moving image is externally attached so as to implement substantially the same functions as the meeting device 60.
  • the meeting device 60 includes an imaging unit 601, an image processing unit 604, an image capture control unit 605, a microphone 608, an audio processing unit 609, a CPU 611, a ROM 612, a static random access memory (SRAM) 613, a dynamic random access memory (DRAM) 614, an operation device 615, an external device I/F 616, a communication unit 617, an antenna 617a, an audio sensor 618, and a micro-USB socket terminal having a recess.
  • The imaging unit 601 includes wide-angle lenses 602a and 602b (so-called fisheye lenses), each having an angle of view of 180 degrees or more so as to form a hemispherical image, and imaging elements 603a and 603b (image sensors) provided for the wide-angle lenses 602a and 602b, respectively.
  • the lenses 602a and 602b may be collectively referred to as "lenses 602," and the imaging elements 603a and 603b may be collectively referred to as "imaging elements 603.”
  • Each of the imaging elements 603 includes an image sensor such as a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device (CCD) sensor, a timing generation circuit, and a group of registers.
  • the image sensor converts an optical image formed by the corresponding fisheye lens 602 into an electric signal to output image data.
  • the timing generation circuit generates horizontal or vertical synchronization signals, pixel clocks, and the like for this image sensor.
  • Various commands, parameters, and the like for operations of the corresponding imaging element are set in the group of registers.
  • the imaging unit 601 may be a 360-degree camera and is an example of an image capturer that captures an image of a 360-degree space around the meeting device 60.
  • The image capturer combines multiple pieces of data respectively obtained by multiple imaging elements (e.g., two imaging elements each having an angle of view of 180 degrees) to obtain an angle of view of 360 degrees.
  • Each of the imaging elements 603 (image sensors) of the imaging unit 601 is connected to the image processing unit 604 via a parallel I/F bus.
  • each of the imaging elements 603 of the imaging unit 601 is connected to the image capture control unit 605 via a serial I/F bus such as an inter-integrated circuit (I2C) bus.
  • the image processing unit 604, the image capture control unit 605, and the audio processing unit 609 are connected to the CPU 611 via a bus 610.
  • In addition, the ROM 612, the SRAM 613, the DRAM 614, the operation device 615, the external device I/F 616, the communication unit 617, the audio sensor 618, and the like are connected to the bus 610.
  • the image processing unit 604 obtains image data output from each of the imaging elements 603 through the parallel I/F bus and performs predetermined processing on the image data to create data of a panoramic image and data of a talker image from a fisheye image. Further, the image processing unit 604 combines the panoramic image and the talker image to output one moving image.
  • the image capture control unit 605 usually serves as a master device, whereas the imaging elements 603 usually serve as slave devices.
  • the image capture control unit 605 sets commands and the like in the groups of registers of the imaging elements 603 through the I2C bus.
  • the image capture control unit 605 receives the commands and the like from the CPU 611.
  • the image capture control unit 605 obtains status data and the like in the groups of registers of the imaging elements 603 through the I2C bus.
  • the image capture control unit 605 then sends the obtained data to the CPU 611.
  • the image capture control unit 605 instructs the imaging elements 603 to output image data in response to pressing of an image-capturing start button of the operation device 615 or when the image capture control unit 605 receives an image-capturing start instruction from the communication terminal 10.
  • the meeting device 60 supports a preview display function and a moving image display function of a display (e.g., a display of a PC or a smartphone).
  • the image data is continuously output from the imaging elements 603 at a predetermined frame rate (frames per second).
  • the image capture control unit 605 operates in cooperation with the CPU 611 to synchronize the output timings of the image data from the respective imaging elements 603.
  • in the present embodiment, the meeting device 60 does not include a display. Alternatively, in another embodiment, the meeting device 60 may include a display.
  • the microphone 608 converts sound into audio (signal) data.
  • the audio processing unit 609 receives the audio data output from the microphone 608 via an I/F bus and performs predetermined processing on the audio data.
  • the CPU 611 controls entire operations of the meeting device 60 and performs desirable processing.
  • the ROM 612 stores various programs to be executed by the CPU 611.
  • Each of the SRAM 613 and the DRAM 614 is a work memory and stores programs being executed by the CPU 611 or data being processed.
  • the DRAM 614 stores image data being processed by the image processing unit 604 and processed data of an equirectangular projection image.
  • the operation device 615 collectively refers to various operation buttons such as the image-capturing start button.
  • the user operates the operation device 615 to start image-capturing or recording, power on or off the meeting device 60, establish a connection, perform communication, and input settings such as various image-capturing modes and image-capturing conditions.
  • the external device I/F 616 is an interface for connection with various external devices. Examples of the external devices in this case include, but are not limited to, a PC, a display, a projector, and an electronic whiteboard.
  • the external device I/F 616 may include a USB terminal and a High-Definition Multimedia Interface (HDMI) terminal.
  • the moving image data or still image data stored in the DRAM 614 is transmitted to an external communication terminal or recorded in an external medium via the external device I/F 616. Further, a plurality of external device I/Fs 616 may be used.
  • an image from the PC (for example, screen information displayed by a teleconference application) may be transmitted to the meeting device 60 and further transmitted from the meeting device 60 to another external device (a display, a projector, an electronic whiteboard, etc.) via HDMI to be displayed.
  • the terminal communication unit 617 is implemented by, for example, a network interface circuit.
  • the terminal communication unit 617 may communicate with a cloud server via the Internet using a wireless communication technology such as Wireless Fidelity (Wi-Fi) via an antenna 617a of the meeting device 60 and transmit the moving image data and the image data stored in the DRAM 614 to the cloud server.
  • the terminal communication unit 617 may be able to communicate with nearby devices using a short-range wireless communication technology such as BLUETOOTH LOW ENERGY (BLE) or the near field communication (NFC).
  • the audio sensor 618 receives audio from 360 degrees in order to identify the direction from which loud audio is input in the 360-degree space around the meeting device 60 (on a horizontal plane).
  • the audio processing unit 609 determines a direction from which the audio of the highest volume is input in the 360-degree surroundings, based on a 360-degree audio parameter input in advance, and outputs the determined audio input direction.
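The determination above reduces, in principle, to selecting the azimuth with the highest volume level among the 360-degree audio inputs. The following is a minimal sketch under that assumption; the function name, the input format, and the sampled azimuths are illustrative and are not taken from the embodiment.

```python
# Illustrative sketch: pick the loudest input direction from
# per-azimuth volume levels. The 60-degree sampling below is an
# assumption for the example only.

def loudest_direction(levels_by_azimuth):
    """Return the azimuth (degrees) whose volume level is highest.

    levels_by_azimuth maps an azimuth in degrees to a volume level.
    """
    if not levels_by_azimuth:
        raise ValueError("no audio levels supplied")
    return max(levels_by_azimuth, key=levels_by_azimuth.get)

# Example input: the 120-degree direction is the loudest.
levels = {0: 0.10, 60: 0.35, 120: 0.80, 180: 0.25, 240: 0.05, 300: 0.15}
```

In practice the audio processing unit 609 would apply such a selection to levels derived from the 360-degree audio parameter rather than to a fixed dictionary.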
  • an azimuth and acceleration sensor or a global positioning system (GPS) sensor may be used to calculate an azimuth, a position, an angle, an acceleration, and the like for image correction or addition of position information.
  • the image processing unit 604 also performs processing described below.
  • the CPU 611 creates a panoramic image according to a method below.
  • the CPU 611 performs predetermined camera image processing such as Bayer conversion (RGB interpolation processing) on raw data input from the image sensor that inputs a spherical video, and creates a fisheye image (a video including curved-surface images).
  • the CPU 611 performs flattening processing such as dewarping processing (distortion correction processing) on the created fisheye video (curved-surface video) to create a panoramic image (video including flat-surface images) of a 360-degree surroundings of the meeting device 60.
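The dewarping step above maps each panorama viewing direction back to a source pixel of the fisheye image. A minimal sketch follows, assuming an ideal equidistant fisheye model (image radius proportional to the incident angle); the function name and parameters are illustrative, and a real lens would require a calibrated distortion model.

```python
import math

# Sketch of distortion correction: map an output panorama direction
# (longitude, latitude) to (x, y) in an equidistant fisheye image whose
# 180-degree field of view fills a circle of `fisheye_radius` pixels
# centred at (cx, cy). Assumption: the optical axis points along +z.

def fisheye_source_pixel(lon_deg, lat_deg, fisheye_radius, cx, cy):
    lon = math.radians(lon_deg)
    lat = math.radians(lat_deg)
    # Unit direction vector for the viewing direction.
    x, y, z = (math.cos(lat) * math.sin(lon),
               math.sin(lat),
               math.cos(lat) * math.cos(lon))
    theta = math.acos(max(-1.0, min(1.0, z)))   # angle from optical axis
    r = fisheye_radius * theta / (math.pi / 2)  # equidistant: r is proportional to theta
    phi = math.atan2(y, x)                      # angle around the axis
    return cx + r * math.cos(phi), cy + r * math.sin(phi)
```

Evaluating this mapping for every panorama pixel, then sampling the fisheye image at the returned coordinates, yields the flattened (dewarped) image.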
  • the CPU 611 creates a talker image according to a method below.
  • the CPU 611 cuts out a portion including a speaking person (talker) from the panoramic image (video including flat-surface images) of the 360-degree surroundings, to create a talker image.
  • the CPU 611 cuts out, from the panoramic image, a talker image corresponding to the direction of the talker, which is the input direction of the audio determined from 360 degrees using the audio sensor 618 and the audio processing unit 609.
  • for cutting out an image of a person based on the input direction of the audio, specifically, the CPU 611 cuts out a 30-degree portion around the input direction of the audio identified from 360 degrees, and performs face detection on the 30-degree portion to cut out the talker image.
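The 30-degree cut-out above can be expressed as a horizontal pixel range of the panoramic image, taking care of the wrap-around at the 0/360-degree seam. The sketch below is illustrative; the function and parameter names are assumptions, not the embodiment's implementation.

```python
# Sketch: convert the audio input direction into a horizontal pixel
# range of the panoramic image, 15 degrees on each side (30 degrees in
# total), wrapping around the 360-degree seam.

def talker_crop_range(audio_dir_deg, pano_width, sector_deg=30):
    """Return (start_x, end_x) pixel columns of the sector centred on
    audio_dir_deg. end_x may be smaller than start_x when the sector
    wraps across the 0/360-degree seam."""
    px_per_deg = pano_width / 360.0
    start = ((audio_dir_deg - sector_deg / 2) % 360) * px_per_deg
    end = ((audio_dir_deg + sector_deg / 2) % 360) * px_per_deg
    return int(start), int(end)
```

Face detection would then be run only on the columns inside this range before the final talker image is cut out.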
  • the CPU 611 further identifies talker images of a predetermined number of persons (e.g., three persons) who have most recently spoken, among talker images cut out from the panoramic image.
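Keeping the talker images of the most recent speakers amounts to a small recency-ordered collection with a fixed capacity. The following sketch uses an ordered mapping for that purpose; the class and method names are assumptions made for illustration.

```python
from collections import OrderedDict

# Sketch of tracking the talker images of the N persons (three by
# default, matching the example above) who spoke most recently.

class RecentTalkers:
    def __init__(self, limit=3):
        self.limit = limit
        self._talkers = OrderedDict()  # talker id -> latest image

    def on_utterance(self, talker_id, image):
        # Re-inserting moves an existing talker to the most-recent slot.
        self._talkers.pop(talker_id, None)
        self._talkers[talker_id] = image
        while len(self._talkers) > self.limit:
            self._talkers.popitem(last=False)  # drop the least recent

    def current(self):
        return list(self._talkers)  # talker ids, oldest to newest
```

With a limit of three, a fourth new speaker displaces the person who spoke least recently, which matches the behaviour described above.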
  • the panoramic image and one or more talker images are individually transmitted to the information recording application 41.
  • the meeting device 60 may generate a single image from the panoramic image and the one or more talker images to be transmitted to the information recording application 41.
  • in the present embodiment, a description is given on the assumption that the panoramic image and the one or more talker images are individually transmitted from the meeting device 60 to the information recording application 41.
  • FIG. 5A and FIG. 5B are diagrams illustrating an image capture range of the meeting device 60.
  • the meeting device 60 captures a 360-degree image in the horizontal direction.
  • the meeting device 60 has an image capture range extending predetermined angles up and down from a 0-degree direction that is horizontal to the height of the meeting device 60.
  • FIG. 6 is a schematic diagram illustrating a panoramic image and cutting out talker images from the panoramic image.
  • an image captured by the meeting device 60 is a portion 110 of a sphere, and thus has a three-dimensional shape.
  • the meeting device 60 divides the field of view into segments of a predetermined angle vertically and a predetermined angle horizontally, and performs perspective projection conversion on each segment.
  • a predetermined number of planar images are obtained by performing the perspective projection conversion on the entire 360-degree range in the horizontal direction without gaps.
  • a panoramic image 111 is obtained by laterally connecting the predetermined number of planar images.
  • the meeting device 60 performs face detection in a predetermined range centered around the input direction of audio in the panoramic image, and cuts out an image extending 15 degrees to each of the left and right (30 degrees in total) from the center of the detected face, to generate each talker image 112.
  • FIG. 7 is a block diagram illustrating functional configurations of the communication terminal 10, the meeting device 60, and the information processing system 50 of the record display system 100 according to the present embodiment.
  • the information recording application 41 operating on the communication terminal 10 implements a communication unit 11, an operation reception unit 12, a display control unit 13, an app screen acquisition unit 14, an audio reception unit 15, a device communication unit 16, a video storing unit 17, an audio data processing unit 18, a video replay unit 19 (display unit), an upload unit 20, an editing unit 21, and a search unit 22.
  • These units of functions on the communication terminal 10 are implemented by or caused to function by one or more of the components illustrated in FIG. 3 operating in accordance with instructions from the CPU 501 according to the information recording application 41 loaded from the HD 504 to the RAM 503.
  • the communication terminal 10 further includes a storage unit 1000 implemented by, for example, the HD 504 illustrated in FIG. 3 .
  • the storage unit 1000 includes an information storage area 1001, which is implemented by a database.
  • the communication unit 11 transmits and receives various types of information to and from the information processing system 50 via a communication network.
  • the communication unit 11 receives a list of teleconferences from the information processing system 50 and transmits an audio data recognition request to the information processing system 50.
  • the display control unit 13 controls display of various screens serving as user interfaces in the information recording application 41 in accordance with screen transitions set in the information recording application 41.
  • the operation reception unit 12 receives various operations input to the information recording application 41.
  • the app screen acquisition unit 14 acquires screen information displayed by an application selected by the user or screen information of a desktop screen from the OS or the like.
  • in a case where the application selected by the user is the teleconference application 42, the app screen acquisition unit 14 acquires a screen generated by the teleconference application 42 (an image including a captured image of a user of the communication terminal 10 captured by a camera of the communication terminal 10 at each site, a display image of a shared material, and participant icons, participant names, and the like).
  • the screen information (application screen) displayed by the application is information displayed as a window by one or more applications (including the teleconference application) executed in the conference and acquired as an image by the information recording application.
  • the window of the application is rendered into a region of the entire desktop screen and displayed on a monitor or the like.
  • the screen information displayed by the application can be acquired by another application (e.g., the information recording application) as an image file or a moving image file formed of a plurality of consecutive images via an API of the OS or an API of the application displaying the screen.
  • the screen information of the desktop screen is information including an image of the desktop screen generated by the OS, and is similarly acquirable as an image file or a moving image file via an API of the OS.
  • the format of these image files is, for example, bitmap, Portable Network Graphics (PNG), or any other format.
  • the format of the moving image file is, for example, MP4 or any other format.
  • the audio reception unit 15 acquires sound (including audio data received from the teleconference application 42 during the teleconference) output from a microphone or an earphone of the communication terminal 10. Even when the output of sound is muted, the audio reception unit 15 can receive the sound.
  • the audio reception unit 15 can acquire audio data to be output by the communication terminal 10 via an API of the OS or an API of the app without a user intervention such as the selection of the teleconference application 42.
  • the audio data received by the teleconference application 42 from the second site 101 is also acquired.
  • the information recording application 41 may fail to acquire the audio data.
  • the sound acquired by the audio reception unit 15 may be the audio data to be output, without including the sound collected by the communication terminal 10. This is because the meeting device 60 separately collects the sound at the site.
  • the device communication unit 16 communicates with the meeting device 60 using a USB cable or the like. Alternatively, the device communication unit 16 may use a wireless local area network (LAN) or BLUETOOTH to communicate with the meeting device 60.
  • the device communication unit 16 receives a panoramic image and a talker image from the meeting device 60, and transmits the audio data acquired by the audio reception unit 15 to the meeting device 60.
  • the device communication unit 16 receives, from the meeting device 60, the audio data combined by the meeting device 60.
  • the video storing unit 17 stores, as an individual moving image (video), the panoramic image and the talker image received by the device communication unit 16 and the screen information displayed by the application, acquired by the app screen acquisition unit 14.
  • the video storing unit 17 separately stores the combined audio data.
  • the video storing unit 17 may combine the audio data with the panoramic image to create a panoramic image with audio.
  • the audio data processing unit 18 requests the information processing system 50 to convert, into text data, the combined audio data received from the meeting device 60.
  • the video replay unit 19 displays (plays) a recorded video.
  • the recorded video is stored in the communication terminal 10 during the recording and then uploaded to the information processing system 50.
  • the upload unit 20 transmits the recorded video and the audio data to the information processing system 50.
  • the editing unit 21 edits (e.g., deletes a portion of the video or combines a plurality of videos) the recorded video in accordance with a user operation.
  • the search unit 22 searches the text data for a keyword input by a user.
  • the time of utterance of a matched text retrieved, as a match with the search keyword, from the text data is indicated as the time elapsed from the start of the conference (start of the recording).
  • FIG. 8 is a table illustrating an example of items of information on the recorded video stored in the information storage area 1001 according to the present embodiment.
  • the information on the recorded video includes items such as "conference ID,” “recording ID,” “update date/time,” “title,” “upload,” and “storage location.”
  • the information recording application 41 can download the conference information acquired by the information processing system 50 from the conference management system 9 as well as app-side conference information held by the information processing system 50.
  • the conference ID or the like included in the conference information is reflected in the information on the recorded video.
  • the information on the recorded video in FIG. 8 is stored by the communication terminal 10 operated by a certain user.
  • the item "conference ID” represents an identifier identifying a teleconference that has been held.
  • the conference ID is assigned when a schedule of the teleconference is registered in the conference management system 9, or is assigned by the information processing system 50 in response to a request from the information recording application 41.
  • the conference management system 9 is a system in which a schedule of a conference or a teleconference, a uniform resource locator (URL) such as a link for starting the teleconference, and reservation information of devices to be used in the conference or the teleconference are registered.
  • the conference management system 9 is, for example, a scheduler to which the communication terminal 10 connects via a network.
  • the conference management system 9 can transmit the registered schedule and the like to the information processing system 50.
  • the item "recording ID” represents an identifier identifying a video recorded in the teleconference.
  • the recording ID is assigned by the meeting device 60.
  • the recording ID may be assigned by the information recording application 41 or the information processing system 50. Different recording IDs are assigned to the same conference ID in a case where the recording is suspended in the middle of the teleconference but is started again for some reason.
  • the item "update date/time” represents the date and time when the recorded video is updated (or recording is ended).
  • in a case where the recorded video is edited, the update date and time is the date and time of editing.
  • the item "title” represents a name of the conference (or a teleconference).
  • the title may be set when the conference is registered to the conference management system 9, or may be set by the user in any manner.
  • the item "upload” indicates whether a recorded video has been uploaded to the information processing system 50.
  • the item "storage location” indicates a location (a URL or a file path) where the recorded video and text data are stored in the storage service system 70.
  • the user can view the recorded video uploaded to the information processing system 50 as desired.
  • the panoramic image, the talker images, application screens, and the text data are stored with different file names starting with the URL, for example.
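The items in FIG. 8 can be mirrored by a simple record structure. The sketch below is hypothetical: the field names are derived from the item names above, and the sample values are invented for illustration; neither reflects the actual schema.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the items of the recorded-video
# information ("conference ID", "recording ID", "update date/time",
# "title", "upload", "storage location").

@dataclass
class RecordedVideoInfo:
    conference_id: str
    recording_id: str
    update_datetime: str   # date/time the video was updated
    title: str
    uploaded: bool         # whether the video reached the information processing system 50
    storage_location: str  # URL or file path in the storage service system 70

# Invented sample values, for illustration only.
info = RecordedVideoInfo(
    conference_id="conf-001", recording_id="rec-001",
    update_datetime="2023-01-30 10:00", title="Weekly sync",
    uploaded=True, storage_location="https://storage.example.com/rec-001/")
```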
  • the meeting device 60 includes a terminal communication unit 61, a panoramic image generation unit 62, a talker image generation unit 63, a sound collection unit 64, and an audio synthesis unit 65. These functional units of the meeting device 60 are implemented by or caused to function by one or more of the components illustrated in FIG. 4 operating in accordance with instructions from the CPU 611 according to the control program loaded from the ROM 612 to the DRAM 614.
  • the terminal communication unit 61 communicates with the communication terminal 10 using a USB cable or the like.
  • the terminal communication unit 61 may communicate with the communication terminal 10 via a wireless LAN, BLUETOOTH, or the like.
  • the panoramic image generation unit 62 generates a panoramic image.
  • the talker image generation unit 63 generates a talker image. The method of generating a panoramic image and a talker image has been described with reference to FIGS. 5A to 6 .
  • the sound collection unit 64 converts a sound signal received by the microphone 608 of the meeting device 60 into (digital) sound data (or audio data). Thus, the contents of utterances made by the user and the participants at the site where the communication terminal 10 is installed are collected.
  • the audio synthesis unit 65 combines the audio data transmitted from the communication terminal 10 and the sound collected by the sound collection unit 64. Accordingly, the speeches uttered at the second site 101 and those uttered at the first site 102 are combined.
  • the information processing system 50 includes a communication unit 51, an authentication unit 52, a screen generation unit 53, a conference information acquisition unit 54, and a text conversion unit 55. These functional units of the information processing system 50 are implemented by or caused to function by one or more of the components illustrated in FIG. 3 operating in accordance with instructions from the CPU 501 according to the control program loaded from the HD 504 to the RAM 503.
  • the information processing system 50 further includes a storage unit 5000 implemented by, for example, the HD 504 illustrated in FIG. 3 .
  • the storage unit 5000 includes an app-side conference information storage area 5001 and a record information storage area 5002 each of which is implemented by a database, for example.
  • the communication unit 51 transmits and receives various kinds of information to and from the communication terminal 10. For example, the communication unit 51 transmits a list of teleconferences to the communication terminal 10 and receives an audio data recognition request from the communication terminal 10.
  • the authentication unit 52 authenticates a user who operates the communication terminal 10.
  • the authentication unit 52 authenticates the user by, for example, determining whether authentication information (a user ID and a password) included in a request for authentication received by the communication unit 51 matches authentication information stored in advance.
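The matching step described above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the stored table and names are assumptions, a constant-time comparison is used as good practice, and a real system would store salted password hashes rather than plaintext.

```python
import hmac

# Sketch of the credential check: compare the received authentication
# information against information stored in advance. Plaintext storage
# here is for brevity only.

STORED = {"user01": "s3cret"}  # assumed pre-registered authentication info

def authenticate(user_id, password):
    expected = STORED.get(user_id)
    if expected is None:
        return False
    # compare_digest avoids leaking match length via timing.
    return hmac.compare_digest(expected, password)
```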
  • the authentication information may be a card number of an integrated circuit (IC) card, biometric authentication information of a face, a fingerprint, or the like.
  • the authentication unit 52 may authenticate the user by using an external authentication system or an authentication method such as Open Authentication (OAuth).
  • the screen generation unit 53 generates screen information representing a screen to be displayed by the communication terminal 10.
  • in a case where the communication terminal 10 executes a native application, the communication terminal 10 stores the screen information, and the information to be displayed is transmitted in a form of Extensible Markup Language (XML) or the like.
  • in a case where the communication terminal 10 executes a web application, the screen information is generated in a format of hypertext markup language (HTML), XML, cascading style sheets (CSS), JAVASCRIPT, or the like.
  • the conference information acquisition unit 54 acquires the conference information from the conference management system 9 using an account of each user or a system account assigned by the information processing system 50.
  • the conference information acquisition unit 54 can acquire a list of teleconferences for which a user belonging to the tenant has a viewing authority.
  • the viewing authority may be added directly from the information recording application 41 operating on the communication terminal 10 to the conference information managed by the conference information acquisition unit 54.
  • the list of teleconferences for which the user belonging to the tenant has the viewing authority includes information on conferences set by the user and information on conferences for which another user has given the viewing authority to the user. Since the conference ID is set for a teleconference, the teleconference is associated with the record thereof by the conference ID.
  • the text conversion unit 55 uses an external speech recognition service to convert, into text data, audio data requested to be converted into text data by the communication terminal 10. In another example, the text conversion unit 55 itself performs the conversion.
  • FIG. 9 illustrates an example of the app-side conference information stored in the app-side conference information storage area 5001.
  • the app-side conference information is information that the information recording application 41 generates and stores, separately from the conference information, when the information recording application 41 generates the record of the teleconference.
  • the item "conference ID" is information identifying a conference.
  • the conference ID associates the conference information managed by the conference management system 9 with the app-side conference information.
  • the item "application ID” is identification information identifying the information recording application 41.
  • the item "user ID of access token receiver” is identification information identifying the user to which an access token is issued.
  • the item "URL of user ID of access token issuer” is the URL of the storage service system 70 that issues the access token.
  • the item "issue date and time" is the date and time when the access token is issued. The issue date and time is determined at the time of issuance.
  • the item "effective date and time” is a date and time when the access token becomes valid.
  • the effective date and time is determined by the issuer.
  • the item "expiration date” is the expiration date of the access token.
  • the expiration date is determined by the issuer.
  • the item "authority information" is authority of processing permitted to the user using the access token.
  • the authority information is determined by the issuer.
  • the item "display name” is a display name of the user in the information recording application 41.
  • the item "surname” is the surname of the user.
  • the item "name” is the last name of the user.
  • FIG. 10 is a table of the information on records stored in the record information storage area 5002.
  • the information on the records includes a list of the videos recorded by all users belonging to one tenant.
  • the information on the records includes items of "conference ID,” “recording ID,” “update date/time,” “title,” “upload,” and “storage location.” These items may be the same as those in FIG. 8 .
  • the user may input a desired storage location on a user setting screen of the information recording application 41 operating on the communication terminal 10, so that the storage location (a path such as a URL of a cloud storage system) is stored in the record information storage area 5002.
  • the conference management system 9 includes a conference information management unit 31.
  • the functional units of the conference management system 9 are implemented by or caused to function by one or more of the components illustrated in FIG. 3 operating in accordance with instructions from the CPU 501 according to the control program loaded from the HD 504 to the RAM 503.
  • the conference management system 9 further includes a storage unit 3000 implemented by, for example, the HD 504 illustrated in FIG. 3 .
  • the storage unit 3000 includes a conference information storage area 3001, which is implemented by a database.
  • the conference information management unit 31 manages conference information, that is, information on conferences to be held by the tenant.
  • the conference information is stored in the conference information storage area 3001.
  • FIG. 11 illustrates an example of items of the conference information stored in the conference information storage area 3001.
  • the conference information is managed by the conference ID and includes items presented in FIG. 11 .
  • the item "conference ID” is information identifying a conference.
  • the item "tenant ID” is the identification information identifying the tenant.
  • the item "title” is a name of the conference.
  • the item "organizer” is the organizer of the conference.
  • the item "participant” is a list of participants invited to the conference.
  • the item "accessible user list” is a list of users who can access the conference resource including the recorded video.
  • the item "ad-hoc participant” is a list of guest participants.
  • the item "location" is information on a conference room, such as the name of the conference room.
  • the item "start time" is a scheduled time at which the conference is to start.
  • the item "end time” is a scheduled time at which the conference is to end.
  • the item "conference creator” is an ID of a user who has registered the conference information.
  • the item "password” is a password for the participant to log in to the conference.
  • the storage service system 70 may be any service system that stores information. A description is given of a data structure of text data stored in the storage service system 70.
  • FIG. 12 is a diagram illustrating the structure of text data stored in the storage service system 70.
  • items of "ID,” "time,” “user,” and “text” are associated with each other, for example, in a database in a table format.
  • the item “ID” is identification information that is assigned when text data is divided into units of "text,” that is, multiple speeches (character strings), according to a predetermined rule.
  • the predetermined rule is set in the speech recognition service system 80. For example, the rule specifies dividing the text data when a silence continues for a certain period of time, dividing the text data by elapse of a certain period regardless of presence of silence, or dividing the text data by units of sentence detected by morphological analysis.
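The first rule above, dividing the text data where silence continues, can be sketched as follows. The input format (time-stamped fragments) and the two-second threshold are assumptions for this illustration, not values taken from the speech recognition service system 80.

```python
# Sketch: start a new "text" unit whenever the gap between consecutive
# utterance fragments exceeds a silence threshold.

def split_by_silence(fragments, silence_sec=2.0):
    """fragments: list of (start_time_sec, word) pairs, in order.
    Returns a list of character strings, one per delimited unit."""
    units, current, last_t = [], [], None
    for t, word in fragments:
        if last_t is not None and t - last_t > silence_sec:
            units.append(" ".join(current))
            current = []
        current.append(word)
        last_t = t
    if current:
        units.append(" ".join(current))
    return units
```

The other rules (fixed-period division, or sentence units found by morphological analysis) would replace the gap test with a different boundary condition.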
  • the item “time” is time information representing the utterance time of a specific "text” (speech) as a time elapsed from the start of the recording to the utterance. Since the so-called time of day is also recorded at the start of recording, the time (absolute time) of utterance of the character string being "text” is also known.
  • the item “user” indicates whether the utterance is made at the first site (where the meeting device 60 is located) or another site, which is determined by sound pressure or the like.
  • the item "text" is one or more character strings that are part of the delimited text data divided according to the predetermined rule.
  • the "time” is associated with the "text.” Accordingly, when the text data includes a specific text matching a search keyword, the information recording application 41 can display the video from the "time” (timing of the utterance represented as the elapsed time from the start of the recording) associated with the specific text in the data structure, for example, illustrated in FIG. 12 .
  • FIG. 13 is a diagram illustrating an example of an initial screen 200 displayed by the information recording application 41 operating on the communication terminal 10 after a login.
  • the user of the communication terminal 10 connects to the information processing system 50 on the information recording application 41.
  • the user inputs authentication information, and when the login is successful, the initial screen 200 of FIG. 13 is displayed.
  • the initial screen 200 includes a fixed display button 201, a change front button 202, the panoramic image 203, one or more talker images 204a to 204c, and a start recording button 205.
  • each of the talker images 204a to 204c may be simply referred to as a "talker image 204," when not distinguished from each other.
  • the panoramic image 203 and the talker images 204 created by the meeting device 60 are displayed on the initial screen 200. This allows the user to decide whether to start recording while viewing the panoramic image 203 and the talker images 204.
  • in a case where the communication terminal 10 is not connected to the meeting device 60, the panoramic image 203 and the talker images 204 are not displayed.
  • the information recording application 41 may display the talker images 204 of all participants based on all faces detected from the panoramic image 203, or may display the talker images 204 of certain number (N) of persons who have made an utterance most recently.
  • the talker images 204 of up to three persons are displayed. Display of the talker image 204 of a participant may be omitted until one of the participants makes an utterance (in this case, the number of the talker images 204 increases by one in response to an utterance).
  • the talker images 204 of three participants in a predetermined direction may be displayed (the talker images 204 are switched in response to an utterance).
  • in this case, an image in a predetermined direction (such as 0 degrees, 120 degrees, or 240 degrees) out of 360 degrees in the horizontal direction is generated as the talker image 204.
  • in a case where the fixed display is set, the setting of the fixed display is prioritized.
  • the fixed display button 201 is a button for the user to perform an operation of fixing a certain area of the panoramic image 203 as the talker image 204 in close-up.
  • FIG. 14 is a diagram illustrating an operation to be performed when the fixed display button 201 is on.
  • the user moves a rectangular window 206 over the panoramic image 203 with a pointing device such as a mouse or a touch panel.
  • the user overlays the window 206 on an image of, for example, the electronic whiteboard or a podium included in the panoramic image 203.
  • the user's operation is transmitted to the meeting device 60.
• the meeting device 60 creates an image of the area selected with the window 206 from 360 degrees in the horizontal direction, in the same size as the talker image 204, and transmits the created image to the communication terminal 10. This enables continuous display of an object other than a talker, such as a whiteboard, as the talker image 204.
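The fixed-display cropping described above amounts to cutting a strip of the 360-degree panorama at the selected direction; a minimal sketch, assuming the panorama is given as rows of pixels and that a crop near the seam wraps around:

```python
def crop_fixed_area(panorama, center_deg, crop_width):
    """Crop a vertical strip of crop_width pixels centered at
    center_deg from a 360-degree panorama given as a list of rows.
    The crop wraps around the seam, since both ends of the panorama
    correspond to the same direction (representation is an assumption)."""
    width = len(panorama[0])
    center_px = int(center_deg / 360.0 * width)
    start = center_px - crop_width // 2
    # Take columns modulo the width so a crop across the seam works.
    cols = [(start + i) % width for i in range(crop_width)]
    return [[row[c] for c in cols] for row in panorama]
```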
• the change front button 202 is a button for the user to perform an operation of changing the front of the panoramic image 203. Since the panoramic image presents the 360-degree surroundings in the horizontal direction, the right end and the left end correspond to the same direction. The user slides the panoramic image 203 leftward or rightward with a pointing device to set a particular participant to the front. The user's operation is transmitted to the meeting device 60. The meeting device 60 changes the angle set as the front in 360 degrees in the horizontal direction, creates the panoramic image 203, and transmits the panoramic image 203 to the communication terminal 10.
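Because the right and left ends of the panorama correspond to the same direction, changing the front reduces to a circular horizontal shift; a minimal sketch under the same row-of-pixels assumption:

```python
def change_front(panorama, front_deg):
    """Rotate the panorama horizontally so that front_deg becomes
    the horizontal center of the image (a sketch of the change-front
    operation; the pixel representation is an assumption)."""
    width = len(panorama[0])
    # Pixel offset so the chosen direction moves to the horizontal center.
    shift = int(front_deg / 360.0 * width) - width // 2
    return [[row[(shift + i) % width] for i in range(width)] for row in panorama]
```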
  • the information recording application 41 displays a recording setting screen 210 illustrated in FIG. 16 .
  • FIG. 15 illustrates an example of the device unrecognized screen 250.
  • the device unrecognized screen 250 displays a message 251 stating "Device is not recognized. Please turn on the device for connection.” The user viewing this message checks the power supply and the connection state of the meeting device 60.
  • FIG. 16 is a diagram illustrating an example of the recording setting screen 210 displayed by the information recording application 41.
  • the recording setting screen 210 allows the user to set whether to record (whether to include in a recorded video) the panoramic image 203 and the talker images 204 created by the meeting device 60 and the desktop screen of the communication terminal 10 or the screen of the application operating on the communication terminal 10.
  • the information recording application 41 records only audio (audio output by the communication terminal 10 and audio collected by the meeting device 60).
  • a camera toggle button 211 is a button for switching on and off of recording of the panoramic image and the talker image generated by the meeting device 60.
  • the camera toggle button 211 may allow settings for switching on and off of recording of the panoramic image and the talker image individually.
  • a PC screen toggle button 212 is a button for switching on and off of recording of the desktop screen of the communication terminal 10 or a screen of an application operating on the communication terminal 10. When the PC screen toggle button 212 is on, the desktop screen is recorded.
• when the user desires to record the screen of an application, the user further selects the application in an application selection field 213.
• in the application selection field 213, the names of applications operating on the communication terminal 10 are displayed in a pull-down format.
  • the information recording application 41 acquires the names of the applications from the OS.
  • the information recording application 41 can display names of applications that have a user interface (screen) among applications operating on the communication terminal 10.
  • the teleconference application 42 may be included in the applications to be selected.
  • the information recording application 41 can record materials displayed by the teleconference application 42 and participants at each site in a video.
  • names of various applications operating on the communication terminal 10 are displayed in the application selection field 213 in the pull-down format.
  • the user can flexibly select a screen of an application to be included in the video (the record of the teleconference).
  • the information recording application 41 can record the screens of all the applications selected by the user.
  • the audio in this case includes audio output from the communication terminal 10 (audio received by the teleconference application 42 from the second site 101) and audio collected by the meeting device 60.
  • the user settings may be set such that the user can selectively stop recording the audio of the teleconference application 42 and the audio of the meeting device 60.
  • the video is recorded in the following manner. Further, the video is displayed in real time in the recorded content confirmation window 214.
  • the panoramic image and the talker images captured by the meeting device 60 are displayed in the recorded content confirmation window 214.
  • the desktop screen or the screen of the selected application is displayed in the recorded content confirmation window 214.
  • the panoramic image and the talker images captured by the meeting device 60 and the desktop screen or the screen of the selected application are displayed side by side in the recorded content confirmation window 214.
  • an image generated by the information recording application 41 is referred to as a video or a record of a teleconference.
  • FIG. 17 is a diagram illustrating a display example of the recorded content confirmation window 214 when the camera toggle button 211 is on and the PC screen toggle button 212 is off.
  • the panoramic image 203 and the talker image 204 are displayed in large size in the recorded content confirmation window 214.
  • FIG. 18 illustrates a display example of the recorded content confirmation window 214 when the camera toggle button 211 is on and the PC screen toggle button 212 is on.
  • the panoramic image 203 and the talker image 204 are displayed on the left side, and the application screen 217 is displayed on the right side in the recorded content confirmation window 214.
  • the recorded content confirmation window 214 allows the user to confirm the content to be recorded (particularly, the image by the meeting device 60) in the video according to the setting on the recording setting screen 210 before starting the recording.
  • FIG. 18 is a display example of the video when only one application is selected, but when two or more applications are selected, the screens of the second and subsequent applications are sequentially connected to the right side.
  • the screens of the second and subsequent applications may be arranged vertically and horizontally in two dimensions.
  • the recording setting screen 210 further includes a check box 215 with a message "Automatically transcribe after uploading the record.”
  • the recording setting screen 210 further includes a button 216 labeled as "start recording now.” If the user checks the check box 215, text data converted from utterances made during the teleconference is attached to the recorded video. In this case, after the end of recording, the information recording application 41 uploads audio data to the information processing system 50 together with a text data conversion request.
• when the user presses the button 216 labeled as "start recording now," a recording-in-progress screen 220 is displayed as illustrated in FIG. 19.
  • FIG. 19 is an example of the recording-in-progress screen 220 displayed by the information recording application 41 during recording.
  • the recording-in-progress screen 220 displays, in real time, the video recorded according to the conditions set by the user in the recording setting screen 210.
  • the recording-in-progress screen 220 in FIG. 19 corresponds to the case where the camera toggle button 211 is on and the PC screen toggle button 212 is off, and displays the panoramic image 203 and the talker images 204 (both are moving images) created by the meeting device 60.
  • the recording-in-progress screen 220 displays a recording icon 225, a pause button 226, and a recording end button 227.
  • the pause button 226 is a button for pausing the recording.
  • the pause button 226 also receives an operation of resuming the recording after the recording is paused.
  • the recording end button 227 is a button for ending the recording.
  • the recording ID does not change when the pause button 226 is pressed, whereas the recording ID changes when the recording end button 227 is pressed. After pausing or temporarily stopping the recording, the user can set the recording conditions set on the recording setting screen 210 again before resuming the recording or starting recording again.
  • the information recording application 41 may generate multiple video files each time the recording is stopped (e.g., when the recording end button 227 is pressed), or may consecutively connect the plurality of video files to generate a single video (e.g., when the pause button 226 is pressed).
  • the information recording application 41 may replay the multiple video files sequentially as one video.
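Replaying multiple video files sequentially as one video can be sketched by mapping a global playback time onto a segment and a local time within it; the (filename, duration) representation below is an assumption:

```python
def locate_frame(segments, t):
    """Map a global playback time t (seconds) onto one of several
    recorded segments so that multiple video files can be replayed
    sequentially as one video. `segments` is a list of
    (filename, duration_seconds) pairs; names are assumptions."""
    offset = 0.0
    for name, duration in segments:
        if t < offset + duration:
            return name, t - offset  # local time within this segment
        offset += duration
    return None, None  # past the end of the combined video
```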
  • the recording-in-progress screen 220 includes a button 221 labeled as "get information from calendar,” a conference name field 222, a time field 223, and a location field 224.
  • the button 221 labeled as “get information from calendar” allows the user to acquire conference information from the conference management system 9.
  • the information recording application 41 acquires a list of conferences for which the user has a viewing authority from the information processing system 50 and displays the acquired list of conferences. The user selects a teleconference to be held at that time from the list of conferences.
  • the information recording application 41 can also acquire the app-side conference information.
  • the conference information is reflected in the conference name field 222, the time field 223, and the location field 224.
  • the title, the start time and the end time, and the place included in the conference information are reflected in the conference name field 222, the time field 223, and the location field 224, respectively.
  • the information on the teleconference in the conference management system 9 is associated with the record by the conference ID.
  • FIG. 20 is an example of a conference list screen 230 displayed by the information recording application 41.
  • the conference list screen 230 presents a list of conferences, specifically, a list of the records (videos) recorded during teleconferences.
  • the list of conferences includes conferences held in a certain conference room as well as teleconferences.
• the conference list screen 230 presents, in an organized manner, the conference information stored in the conference information storage area 3001 that the login user is authorized to view, the app-side conference information, and the information associated with the teleconference and stored in the record information storage area 5002.
  • the information on the recorded video stored in the information storage area 1001 may be further organized on the conference list screen 230.
  • the conference list screen 230 is displayed when the user selects a conference list tab 231 on the initial screen 200 of FIG. 13 .
  • the conference list screen 230 displays a list 236 of the videos (records) for which the user has the viewing authority.
  • a person who schedules a conference (a person who creates minutes of the conference) can set the viewing authority for a participant of the conference.
  • the list of conferences may be a list of stored records, a list of scheduled conferences, or a list of conference data.
  • the conference list screen 230 includes items of a check box 232, an update date/time 233, a title 234, and a status 235.
  • the check box 232 receives selection of a video file.
  • the check box 232 is used when the user desires to collectively delete video files.
  • the update date/time 233 indicates a recording start time or a recording end time of the video. In a case where the video is edited, the update date/time 233 indicates the date and time of the editing.
  • the title 234 indicates the title (such as a subject) of the conference.
• the title may be taken over from the conference information or set by the user.
  • the status 235 indicates whether the video has been uploaded to the information processing system 50. If the video has not been uploaded, "local PC" is displayed, whereas if the video has been uploaded, "uploaded” is displayed. In the case where the video has not been uploaded, an upload button is displayed. In a case where there is a video that has not yet been uploaded, it is desirable that the information recording application 41 automatically uploads the video when the user logs into the information processing system 50.
  • the information recording application 41 displays a video replay screen 240 (record replay screen) of FIG. 21 .
  • a recorded video can be displayed.
  • FIG. 21 is an example of the video replay screen 240 displayed by the information recording application 41 after the recorded video is selected.
  • the video replay screen 240 includes a display field 241, a transcription button 242, one or more text display fields 243, an automatic scroll button 244, and a search button 245.
• the display field 241 includes a replay button 241a, a rewind button 241b, a fast forward button 241c, a time indicator 241d, a replay speed button 241e, and a volume button 241f.
  • the display field 241 displays a recorded video.
  • the panoramic image and the talker image are on the left side
  • the screen of the teleconference application 42 is on the right side.
  • the screen of the teleconference application 42 transitions between an image representing the site and an image of a document during the teleconference.
  • the user can view a screen of a desired scene by operating various buttons.
• the transcription button 242 is a button that allows the user to switch whether to display the text data in the text display fields 243 in accordance with the display time of the video.
  • the automatic scroll button 244 is a button that allows the user to switch whether to automatically scroll the text data irrespective of the display time.
  • the search button 245 is a button that allows the user to designate a keyword and search the text data using a keyword. A detailed description thereof will be given later.
  • a recorded video may be downloaded.
  • FIG. 22 is an example of an edit screen 260 for editing a video.
  • the edit screen 260 transitions from the recording-in-progress screen 220 automatically or in response to a predetermined operation by the user on the video replay screen 240.
  • the edit screen 260 has a first display field 261 and a second display field 262. A certain moment in the video being replayed is displayed in the first display field 261, and frames of the video are displayed in time series in the second display field 262.
  • the user can select one or more frames to delete unwanted frames.
  • the user can also extract a part of the frames and insert the part of the frames after a desired frame.
  • the editing unit 21 edits the video in accordance with a user's operation, and overwrites the existing video with the edited video or stores the edited video separately.
  • FIG. 23 is a sequence chart illustrating an example of recording a panoramic image, a talker image, and an application screen by the information recording application 41.
• S1 The user of the communication terminal 10 activates the information recording application 41 and connects the communication terminal 10 (the information recording application 41) to the information processing system 50. If the access token has expired, the display control unit 13 displays the login screen.
• S6 The user inputs authentication information (e.g., a user ID and a password) for logging into a tenant, to the information recording application 41.
  • the communication unit 11 implemented by the information recording application 41 transmits, to the information processing system 50, a login request with designation of the authentication information.
  • the communication unit 51 of the information processing system 50 receives the login request, and the authentication unit 52 authenticates the user on the basis of the authentication information.
  • the following description of the present embodiment is given on the assumption that the authentication is successful.
  • the communication unit 51 of the information processing system 50 transmits an access token 1 to the information recording application 41.
• the communication unit 11 attaches the access token 1 to the subsequent communication with the information processing system 50.
  • the access token 1 is associated with the authority of the user who has logged in.
  • the user also logs into the storage service system 70 in a similar manner since the record (recorded video) is stored in the storage service system 70.
• the user inputs authentication information (e.g., a user ID and a password) for logging into the storage service system 70.
  • the operation reception unit 12, which is implemented by instructions from the CPU 501 operating according to the information recording application 41, receives the input.
  • the communication unit 11 implemented by the information recording application 41 transmits, to the information processing system 50, a login request with designation of the authentication information.
  • the communication unit 51 of the information processing system 50 receives the login request and transfers the login request to the storage service system 70 because the login request is for logging into the storage service system 70.
  • the storage service system 70 authenticates the user based on the authentication information. The following description of the present embodiment is given on the assumption that the authentication is successful.
  • the storage service system 70 transmits an access token 2 to the information processing system 50.
  • the communication unit 51 of the information processing system 50 receives the access token 2 and transmits the access token 2 to the information recording application 41.
  • the communication unit 51 attaches the access token 2 to the subsequent communication with the storage service system 70.
  • the access token 2 is associated with the authority of the user who has logged in.
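Attaching access token 1 and access token 2 to subsequent communication can be sketched as follows; the bearer-token header and the class name are common conventions assumed here, not details given in the text:

```python
class ApiSession:
    """Minimal sketch of attaching access tokens to subsequent
    requests: token 1 for communication with the information
    processing system 50 and token 2 for communication with the
    storage service system 70 (header format is an assumption)."""

    def __init__(self, token1, token2=None):
        self.tokens = {"info_system": token1, "storage": token2}

    def headers_for(self, target):
        # Build the request headers carrying the token for the target system.
        token = self.tokens.get(target)
        if token is None:
            raise ValueError(f"not logged in to {target}")
        return {"Authorization": f"Bearer {token}"}
```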
  • the user operates the teleconference application 42 to start the teleconference.
  • the teleconference application 42 of the first site 102 transmits an image captured by the camera of the communication terminal 10 and audio collected by the microphone of the communication terminal 10 to the teleconference application 42 of the second site 101.
  • the teleconference application 42 of the second site 101 displays the received image on the display and outputs the received audio from the speaker.
  • the teleconference application 42 of the second site 101 transmits an image captured by the camera of the communication terminal 10 and audio collected by the microphone of the communication terminal 10 to the teleconference application 42 of the first site 102.
  • the teleconference application 42 of the first site 102 displays the received image on the display and outputs the received audio from the speaker.
  • Each teleconference application 42 repeats these processes to implement the teleconference.
  • S22 The user inputs settings relating to recording on the recording setting screen 210 illustrated in FIG. 16 , provided by the information recording application 41.
  • the operation reception unit 12 implemented by the information recording application 41 receives the settings.
  • a description is given on the assumption that both the camera toggle button 211 and the PC screen toggle button 212 are set to on.
  • a list of teleconferences is displayed in response to pressing of the button 221 labeled as "get information from calendar" illustrated in FIGS. 19 and 21 by the user.
  • the user selects a desired teleconference to be associated with the recorded video. Since the user has already logged into the information processing system 50, the information processing system 50 identifies teleconferences for which the user who has logged in has the viewing authority. Since the information processing system 50 transmits the list of the identified teleconferences to the communication terminal 10, the user selects a teleconference that is being held or to be held in the future. Thus, information on the teleconference such as the conference ID is determined.
  • the user can create a conference when creating a video.
  • the information recording application 41 creates a conference when creating a video and acquires a conference ID from the information processing system 50.
  • S23 The user instructs the information recording application 41 to start recording. For example, the user presses the button 216 labeled as "start recording now.”
  • the operation reception unit 12 implemented by the information recording application 41 receives the instruction.
  • the display control unit 13 displays the recording-in-progress screen 220.
  • the conference information acquisition unit 54 acquires a unique conference ID assigned by the conference management system 9.
  • the communication unit 51 transmits the conference ID to the information recording application 41.
• the conference information acquisition unit 54 transmits, to the information recording application 41 via the communication unit 51, the storage location (a URL of the storage service system 70) in which the video file is to be stored.
  • the app screen acquisition unit 14 implemented by the information recording application 41 sends, to an application selected by the user, a request of a screen of the selected application. More specifically, the app screen acquisition unit 14 acquires the screen of the application via the OS.
  • the description given with reference to FIG. 23 is on the assumption that the user selects the teleconference application 42.
  • the video storing unit 17 implemented by the information recording application 41 notifies the meeting device 60 of the start of recording via the device communication unit 16. With the notification, the video storing unit 17 preferably sends information indicating that the camera toggle button 211 is on (a request for a panoramic image and a talker image). The meeting device 60 transmits the panoramic image and the talker image to the information recording application 41 regardless of the presence or absence of the request.
  • a unique recording ID is assigned.
  • the terminal communication unit 61 transmits the assigned recording ID to the information recording application 41.
  • the information recording application 41 assigns the recording ID.
  • the recording ID is acquired from the information processing system 50.
  • the audio reception unit 15 implemented by the information recording application 41 acquires audio data output by the communication terminal 10 (audio data received by the teleconference application 42).
  • the device communication unit 16 transmits the audio data acquired by the audio reception unit 15 and a combining request of audio to the meeting device 60.
• in response to receiving the audio data and the combining request via the terminal communication unit 61 of the meeting device 60, the audio synthesis unit 65 combines (synthesizes) the received audio data with the audio of the surroundings collected by the sound collection unit 64. For example, the audio synthesis unit 65 adds the two audio data items together. Since clear audio around the meeting device 60 is recorded, the accuracy of text conversion of audio especially around the meeting device 60 (in the conference room) increases.
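The addition of the two audio data items can be sketched as sample-wise mixing with clipping; representing the audio as 16-bit PCM samples is an assumption:

```python
def mix_audio(a, b):
    """Combine two 16-bit PCM sample sequences by sample-wise
    addition with clipping, as in "adds the two audio data items
    together" above (the PCM format is an assumption)."""
    n = max(len(a), len(b))
    out = []
    for i in range(n):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
        out.append(max(-32768, min(32767, s)))  # clip to the int16 range
    return out
```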
  • the communication terminal 10 is also capable of performing the audio synthesis. However, by distributing the recording function to the communication terminal 10 and the audio processing to the meeting device 60, load on each of the communication terminal 10 and the meeting device 60 is reduced. Alternatively, the recording function may be distributed to the meeting device 60, and the audio processing may be distributed to the communication terminal 10.
  • the panoramic image generation unit 62 of the meeting device 60 generates a panoramic image
  • the talker image generation unit 63 generates a talker image
  • the device communication unit 16 of the information recording application 41 repeatedly acquires the panoramic image and the talker image from the meeting device 60. Further, the device communication unit 16 repeatedly requests the meeting device 60 for the synthesized audio data to acquire the synthesized audio data. The device communication unit 16 may send a request to the meeting device 60 to acquire such images and data. Alternatively, the meeting device 60 that has received information that the camera toggle button 211 is on may automatically transmit the panoramic image and the talker image. The meeting device 60 that has received the combining request of audio may automatically transmit the synthesized audio data to the information recording application 41.
  • the display control unit 13 implemented by the information recording application 41 displays the application screen, the panoramic image, and the talker image side by side on the recording-in-progress screen 220.
  • the video storing unit 17 implemented by the information recording application 41 stores the application screen, the panoramic image, and the talker image acquired from the teleconference application 42 as different videos.
• the video storing unit 17 assigns the repeatedly received application screens, panoramic images, and talker images as the frames constituting the respective videos, so as to create each video.
  • the video storing unit 17 stores the audio data received from the meeting device 60.
  • the information recording application 41 repeats the above steps S31 to S36.
  • the device communication unit 16 implemented by the information recording application 41 notifies the meeting device 60 of the end of recording.
• even after this notification, the meeting device 60 continues the generation of the panoramic image and the talker image and the synthesis of the audio.
  • the meeting device 60 may change the processing load by, for example, changing the resolution or frames per second depending on whether or not recording is being performed.
• the video storing unit 17 implemented by the information recording application 41 combines the video with the audio data, to create the video combined with audio. If no recorded video is stored, the audio data may be kept independent. There are three types of recorded videos, i.e., an application screen video, a panoramic video, and a talker video. The video with which the audio data is combined is determined in advance according to a priority order. In addition, the video and the audio data may not necessarily be combined.
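Selecting, according to a predetermined priority order, which of the three videos the audio data is combined with can be sketched as follows; the concrete order shown is an illustrative assumption, as the text only states that a priority order is determined in advance:

```python
def pick_video_for_audio(available, priority=("app_screen", "panorama", "talker")):
    """Choose which recorded video the audio data is combined with,
    using a predetermined priority order (the order shown is an
    assumption). `available` is the set of video types recorded."""
    for kind in priority:
        if kind in available:
            return kind
    return None  # no video recorded: the audio data stays independent
```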
  • the audio data processing unit 18 requests the information processing system 50 to convert the audio data into text data.
• the audio data processing unit 18 transmits, to the information processing system 50 via the communication unit 11, a conversion request for the audio data combined with the video, together with the conference ID and the recording ID, with designation of the URL of the storage location.
  • the communication unit 51 of the information processing system 50 receives the conversion request for converting the audio data, and the text conversion unit 55 converts the audio data into text data using the speech recognition service system 80.
  • the communication unit 51 stores the text data in the same storage location (the URL of the storage service system 70) as the storage location of the video.
  • the text data is associated with the video by the conference ID and the recording ID in the record information storage area 5002.
  • the text data may be managed by the conference information acquisition unit 54 of the information processing system 50 and stored in the storage unit 5000.
  • the communication terminal 10 may request the speech recognition service system 80 to perform speech recognition, and may store text data received from the speech recognition service system 80 in the storage location.
  • the speech recognition service system 80 returns the converted text data to the information processing system 50.
  • the speech recognition service system 80 directly transmits the text data to the URL of the storage location.
  • the speech recognition service system 80 may be selected or switched among multiple services according to the user settings in the information processing system 50.
  • the upload unit 20 implemented by the information recording application 41 stores the video (record of teleconference) in the designated storage location via the communication unit 11.
• the video is associated with the conference ID and the recording ID. The fact that the video has been uploaded is recorded for the video.
• since the user is notified of the storage location, the user can share the video with other participants by sending the storage location via e-mail or the like. Even when the video, the audio data, and the text data are created by different devices or apparatuses, these data are stored in one storage location, so that the user can view the collected image or data later in a simple manner.
  • the meeting device 60 or the communication terminal 10 transmits audio data to the information processing system 50 in real time.
  • the communication terminal 10 displays the text data transmitted from the meeting device 60 or returned from the information processing system 50 on the recording-in-progress screen 220 and stores the text data.
• steps S31 to S36 do not have to be performed in the order presented in FIG. 23.
  • the order of the audio data synthesis and the storing of the video may be switched.
  • FIG. 24 is a sequence diagram illustrating an example of a process in which the information recording application 41 downloads and displays a recorded video.
• a user who wants to replay a recorded video inputs an operation of displaying the conference list screen 230 ( FIG. 20 ) on the information recording application 41. To do so, the user selects the conference list tab 231 on the initial screen 200 of FIG. 13.
  • the operation reception unit 12 implemented by the information recording application 41 receives the selection.
  • S52 The communication unit 11 implemented by the information recording application 41 designates the access token 1 and transmits the conference list request to the information processing system 50.
  • the communication unit 51 of the information processing system 50 receives the conference list request, and the conference information acquisition unit 54 transmits the conference list request to the conference management system 9.
  • the conference information management unit 31 of the conference management system 9 specifies conference information for which the user of the access token 1 has the viewing authority. Further, the conference information management unit 31 acquires, from the information processing system 50, the app-side conference information associated with the specified conference information by the conference ID.
  • the conference information management unit 31 organizes the conference information stored by the conference management system 9 and the app-side conference information.
  • the conference management system 9 transmits the organized conference list to the information processing system 50. This organizing may be performed by the information processing system 50.
  • the communication unit 51 of the information processing system 50 receives the conference list, and the screen generation unit 53 generates the conference list screen 230.
  • the communication unit 51 of the information processing system 50 transmits the screen information of the conference list screen 230 to the information recording application 41.
  • the communication unit 11 implemented by the information recording application 41 receives the screen information of the conference list screen 230, and the display control unit 13 displays the conference list screen 230 on the display.
  • S59 The user selects the conference corresponding to the video to be displayed.
  • the operation reception unit 12 implemented by the information recording application 41 receives the selection (the conference ID is specified).
  • S60 The communication unit 11 implemented by the information recording application 41 transmits a request for the record to the information processing system 50, with designation of the conference ID.
  • the communication unit 51 of the information processing system 50 receives the request for the record.
  • the conference information acquisition unit 54 checks the presence of the conference in the conference management system 9. It is assumed that the conference is present. Note that this check may be omitted.
  • the screen generation unit 53 acquires the information on the record (the conference ID, the recording ID, update date and time, title, storage location, and the like) associated with the conference ID from the record information storage area 5002.
  • the screen generation unit 53 generates the video replay screen 240 using the information on the record.
  • the display field 241 may be a blank, or the title of the record may be displayed. If a thumbnail image of the video is stored in the record information storage area 5002, the thumbnail image may be displayed in the display field 241.
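The fallback described for the display field 241 (thumbnail if one is stored, otherwise the title, otherwise blank) can be sketched as follows; the record dictionary keys below are assumptions for illustration only:

```python
# Hypothetical selection of what to show in the display field 241,
# following the order described above: thumbnail, then title, then blank.
def display_field_content(record):
    if record.get("thumbnail"):
        return ("thumbnail", record["thumbnail"])
    if record.get("title"):
        return ("title", record["title"])
    return ("blank", None)
```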
  • the communication unit 51 of the information processing system 50 transmits the screen information of the video replay screen 240 to the information recording application 41.
  • the communication unit 11 implemented by the information recording application 41 receives the screen information of the video replay screen 240, and the display control unit 13 displays the video replay screen 240.
  • S65 The user inputs an operation of starting replay of the video (the replay button 241a is turned on).
  • the operation reception unit 12 implemented by the information recording application 41 receives the operation.
  • S66 The communication unit 11 implemented by the information recording application 41 designates the access token 2 and accesses the storage location in the information storage area 1001.
  • the storage service system 70 returns, to the information recording application 41, a URL of the storage location of the record (the video and the text data).
  • the communication unit 11 implemented by the information recording application 41 receives the URL and connects to the storage location, to download the record.
  • the video replay unit 19 displays the recorded video on the video replay screen 240 and displays the text data corresponding to the recorded video.
  • the user can delete the conference information from the information recording application 41.
  • the user selects the conference to be deleted, on the conference list screen 230 (see FIG. 20 ).
  • the operation reception unit 12 implemented by the information recording application 41 receives the operation (the conference ID is specified).
  • the communication unit 11 implemented by the information recording application 41 connects to the storage location of the storage service system 70, designating the access token 2, and deletes the record.
  • the storage location is stored in the information storage area 1001.
  • the communication unit 11 implemented by the information recording application 41 connects to the conference management system 9, using the access token 1, and transmits a conference information deletion request, specifying the conference ID.
  • the conference management system 9 receives the deletion request.
  • the conference information management unit 31 deletes the conference information specified by the conference ID and the app-side conference information.
  • the record may include:
  • the audio data is combined with any one of a to c.
  • the items a to c may be absent depending on the recording setting made by the user.
  • the item d may also be absent if audio data is not converted into text data.
  • the user can switch the content to be displayed on the video replay screen 240 to one or more contents selected from the items a to c each of which is in the form of video (moving image).
  • FIG. 25 is a diagram of an example of the video replay screen 240 on which the panoramic image 203 and the talker image 204 are displayed.
  • the video replay screen 240 includes a video selection menu 249.
  • the video selection menu 249 includes a video button 246, a content button 247, and a video & content button 248.
  • the video button 246 is a button for displaying a panoramic image and a talker image.
  • the content button 247 is a button for displaying an application screen.
  • the video & content button 248 is a button for displaying the talker image and the application screen.
  • in FIG. 25 , the video button 246 is pressed, and the panoramic image 203 and the talker image 204 are displayed in the display field 241.
  • FIG. 26 is a diagram of the video replay screen 240 in which the video & content button 248 is pressed and the talker image 204 and the application screen 217 are displayed.
  • in the display field 241, the talker image 204 and the application screen 217 are displayed.
  • images of the talkers are vertically arranged.
  • although the panoramic image is not displayed in this example, the panoramic image may also be displayed.
  • FIG. 27 is a diagram of the video replay screen 240 in which the content button 247 is pressed and the application screen 217 is displayed.
  • the application screen 217 is displayed in the display field 241.
  • FIG. 28 is a diagram illustrating switching of the video (e.g., among the items a to c) displayed in the display field 241 in response to an operation on the video selection menu 249. As illustrated in FIG. 28 , the user can switch the video to be displayed, by operating the video selection menu 249.
  • FIG. 29 is a sequence diagram of an example of a process in which the information recording application 41 switches the video to be displayed in the display field 241 in response to a user operation.
  • S81 The user inputs an operation for switching the video to be displayed, that is, operates the video selection menu 249.
  • the operation reception unit 12 implemented by the information recording application 41 receives the switching operation.
  • the video replay unit 19 implemented by the information recording application 41 acquires the current display time of the video displayed in the display field 241 at the reception of the switching operation.
  • the video replay unit 19 displays the video (as a switched content to be displayed) designated by the switching operation, in the display field 241 from a scene corresponding to the display time acquired at the reception of the switching operation.
  • the target video can be displayed from the scene corresponding to the display time at which the switching operation is received.
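The switching behavior of steps S81 to S83 can be sketched as follows: the replay unit notes the display time when the switching operation is received and resumes the newly selected content from the same scene. The class name and the view labels "a" to "c" are illustrative assumptions, not the embodiment's identifiers:

```python
from dataclasses import dataclass

@dataclass
class VideoReplayUnit:
    current_view: str = "a"    # "a": panorama + talker, "b": app screen, "c": talker + app screen
    display_time: float = 0.0  # seconds elapsed from the start of the recording

    def on_switch_operation(self, new_view: str):
        """Switch the displayed video, preserving the current display time."""
        if new_view not in ("a", "b", "c"):
            raise ValueError(f"unknown view: {new_view}")
        resume_at = self.display_time  # acquired at reception of the operation (S82)
        self.current_view = new_view
        return new_view, resume_at     # replay the new view from resume_at (S83)
```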
  • in the example described above, the communication terminal 10 switches the video using the record downloaded by the information recording application 41.
  • the information processing system 50 may switch the video to be displayed. For example, when a web application provided by the information processing system 50 displays the record, the information processing system 50 performs such processing.
  • FIG. 30 is a sequence diagram illustrating a process in which the information recording application 41 displays the text data corresponding to the display time of the video displayed in the display field 241, based on the time information "time" (time elapsed from the start of the recording to the utterance of a specific speech) stored in the data structure illustrated in FIG. 12 .
  • the operation reception unit 12 implemented by the information recording application 41 receives the operation.
  • the video replay unit 19 compares the item "time" (illustrated in FIG. 12 ) included in the text data with the display time. When there is a character string of the "text" within a predetermined period from the display time, the video replay unit 19 displays the character string in the text display field 243. Thus, the character strings of the "text" are displayed in time series from top to bottom in accordance with the display time.
  • since the text data is displayed in correspondence with the scenes recorded in the video in this way, the user can easily confirm the remarks made at the time of a specific scene in the video.
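The matching of text data to the display time can be sketched as below, assuming each entry carries the "time" and "text" fields of the data structure in FIG. 12; the length of the "predetermined period" is an assumption:

```python
def texts_for_display_time(entries, display_time, window=5.0):
    """Return the "text" strings whose "time" falls within `window`
    seconds up to the display time, ordered top to bottom by time."""
    hits = [e for e in entries
            if display_time - window <= e["time"] <= display_time]
    return [e["text"] for e in sorted(hits, key=lambda e: e["time"])]
```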
  • the information recording application 41 associates the video with the text data.
  • the information processing system 50 may perform such synchronization processing.
  • the information recording application 41 transmits the display time to the information processing system 50.
  • the information processing system 50 returns text data corresponding to the display time to the information recording application 41.
  • FIGS. 31A and 31B are schematic diagrams illustrating search of text data and the corresponding display of a video.
  • the user inputs "keyword" in a search window 265 and searches the text data for this keyword.
  • the search unit 22 searches the text data for "keyword" and displays "It keyword it", "that keyword it", and "this keyword it" as search results 266.
  • the user selects "this keyword it" with a pointing device.
  • the video replay unit 19 acquires the time associated with "this keyword it" from the text data.
  • the video replay unit 19 displays the currently displayed video in the display field 241 from the acquired time (the time elapsed from the start of recording to the utterance).
  • the corresponding scene of the video may be displayed as a still image.
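Under the same assumed data shape (entries with "time" and "text"), the search-and-seek behavior of FIGS. 31A and 31B can be sketched as follows; the function names are illustrative assumptions:

```python
def search_text(entries, keyword):
    """Return (index, text) pairs for entries whose text contains the keyword."""
    return [(i, e["text"]) for i, e in enumerate(entries)
            if keyword in e["text"]]

def seek_time_for_result(entries, index):
    """Time (seconds from the start of recording) from which to replay the video."""
    return entries[index]["time"]
```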
  • FIG. 32 is a sequence diagram illustrating an example of a process in which the information recording application 41 displays the video in the display field 241 in association with the retrieved text data.
  • S101 The user inputs a keyword of the search and instructs the search.
  • the search unit 22, which is implemented by instructions from the CPU 501 operating according to the information recording application 41, searches the text data for the keyword.
  • S103 The display control unit 13 displays a list of character strings of the text as search results (retrieved text).
  • S104 The user selects one of the character strings of the "text.”
  • the operation reception unit 12 implemented by the information recording application 41 receives the selection.
  • the video replay unit 19 implemented by the information recording application 41 displays the video from the time associated with the selected character string of the "text.”
  • the record display system 100 saves the user from searching for the scene at which a specific utterance has occurred.
  • although the information recording application 41 searches the text data and displays the video in FIG. 32 , this processing may be performed by the information processing system 50.
  • in this case, the information recording application 41 transmits a keyword to the information processing system 50, and the information processing system 50 searches the text data for the keyword.
  • the information processing system 50 transmits the search result to the information recording application 41.
  • the information recording application 41 transmits the character string of the "text" selected by the user to the information processing system 50.
  • the information processing system 50 requests the information recording application 41 to perform display from the "time" associated with the character string of the "text."
  • when the information processing system 50 displays the video, the video is transmitted from the "time" associated with the selected character string of the "text."
  • the record display system 100 simultaneously or selectively displays a panoramic image of the surroundings including the user, a talker image, and the screen of an application (such as the teleconference application 42) displayed in a teleconference.
  • the record display system 100 displays both the screen information displayed by the application (e.g., the teleconference application) selected to be included in the record and the image information of the surroundings of the device at the site (e.g., in a conference room) recorded from the start of the recording to the end of the recording.
  • the record display system 100 can display the record of the content of the teleconference (telecommunication) and scenes of the site (e.g., scenes in a conference room) recorded thoroughly.
  • the information recording application 41 can display the video and the text data in association with each other.
  • the information recording application 41 can selectively display a video in accordance with a user operation.
  • the information recording application 41 can display the video in association with a character string (matched text) retrieved in the search.
  • the communication terminal 10 and the meeting device 60 may be integral with each other.
  • in the embodiment described above, the meeting device 60 is externally attached to the communication terminal 10.
  • the meeting device 60 may be implemented by a spherical camera, a microphone, and a speaker connected to one another by cables.
  • the meeting device 60 may be disposed also at the second site 101.
  • the meeting device 60 at the second site 101 separately creates a video (combined with audio) and text data.
  • Multiple meeting devices 60 may be provided at a single site. In this case, multiple records are created, one for each meeting device 60.
  • the arrangement of the panoramic image 203, the talker image 204, and the screen of the application in the video in the present embodiment is merely an example.
  • the panoramic image 203 may be displayed below the talker images 204.
  • the record display system 100 may allow the user to change the arrangement or allow the user to individually turn on and off the display of the panoramic image 203 and the talker images 204 during replay of the video.
  • the functional configurations illustrated in, for example, FIG. 7 are divided according to main functions in order to facilitate understanding of processing executed by the communication terminal 10, the meeting device 60, and the information processing system 50.
  • the way of dividing processing in units or the name of the processing unit do not limit the scope of the present invention.
  • the processes performed by the communication terminal 10, the meeting device 60, and the information processing system 50 may be divided into a greater number of processing units in accordance with the content of the processing.
  • a single processing unit can be further divided into a plurality of processing units. For example, a system for creating a video may be separate from a system for displaying the video.
  • the apparatuses or devices described in one embodiment are just one example of multiple computing environments that implement the one embodiment in this specification.
  • the information processing system 50 includes multiple computing devices, such as a server cluster.
  • the multiple computing devices communicate with one another through any type of communication link including, for example, a network or a shared memory, and perform the operations described in the present disclosure.
  • the information processing system 50 may share the processing steps disclosed herein, for example, steps in FIG. 23 or the like, in various combinations. For example, a process performed by a predetermined unit may be performed by multiple information processing apparatuses included in the information processing system 50. Further, the elements of the information processing system 50 may be combined into one server apparatus or allocated to multiple apparatuses.
  • processing circuit or circuitry includes a programmed processor to execute each function by software, such as a processor implemented by an electronic circuit, and devices, such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), and conventional circuit modules arranged to perform the recited functions.
  • the present disclosure includes the following aspects.
  • a record display system includes an information recording application executed by a communication terminal.
  • the information recording application includes a display unit that displays a video created based on screen information and surrounding image information.
  • the screen information is information displayed by a teleconference application selected on the information recording application.
  • the surrounding image information represents an image of surroundings of a device captured by the device.
  • the display unit displays the surrounding image information, talker image information cut out from the surrounding image information, and the screen information.
  • the screen information is information that is displayed in a form of a window by an application being executed and is acquired as an image by the information recording application, and the application being executed includes the teleconference application.
  • the surrounding image information around the device is image information acquired by capturing, with the device, a 360-degree area surrounding the device.
  • the surrounding image information around the device is transmitted from the device installed at a site to the information recording application.
  • the display unit further displays text data on the same screen on which the video is displayed, simultaneously with the video.
  • the text data is based on audio data collected by the device and audio data output by the communication terminal.
  • the text data is based on data obtained by combining the audio data collected by the device and the audio data output by the communication terminal, and the display unit displays the text data obtained by converting the combined audio data by speech recognition.
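A minimal sketch of the combining step, assuming both streams are already time-aligned, same-rate 16-bit PCM sample sequences (the sample format and the clipping behavior are assumptions; the actual mixing performed is not specified here):

```python
def mix_audio(device_samples, terminal_samples):
    """Mix two 16-bit PCM streams sample-by-sample, clipping at the 16-bit range."""
    n = max(len(device_samples), len(terminal_samples))
    # Pad the shorter stream with silence so both have equal length.
    a = list(device_samples) + [0] * (n - len(device_samples))
    b = list(terminal_samples) + [0] * (n - len(terminal_samples))
    return [max(-32768, min(32767, x + y)) for x, y in zip(a, b)]
```

The combined stream would then be passed to speech recognition to produce the text data.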
  • the display unit displays the text data in a scrolling manner in association with a display time of the video.
  • the record display system further includes a search unit to search the text data for a keyword, and a display control unit to display a part of the text data matching the keyword.
  • when the part of the text data is selected, the display unit displays a scene of the video at an utterance time associated with the part of the text data.
  • the video includes the surrounding image information, the talker image information cut out from the surrounding image information, and the screen information.
  • the display unit switches a content to be displayed among a) the surrounding image information and the talker image information, b) the screen information, and c) the talker image information and the screen information in accordance with a switching operation of a user.
  • the display unit displays a) the surrounding image information and the talker image information, b) the screen information, or c) the talker image information and the screen information from a display time at a switching operation by the user.
  • the present invention can be implemented in any convenient form, for example using dedicated hardware, or a mixture of dedicated hardware and software.
  • the present invention may be implemented as computer software implemented by one or more networked processing apparatuses.
  • the processing apparatuses include any suitably programmed apparatuses such as a general purpose computer, a personal digital assistant, a Wireless Application Protocol (WAP) or third-generation (3G)-compliant mobile telephone, and so on. Since the present invention can be implemented as software, each and every aspect of the present invention thus encompasses computer software implementable on a programmable device.
  • the computer software can be provided to the programmable device using any conventional carrier medium (carrier means).
  • the carrier medium includes a transient carrier medium such as an electrical, optical, microwave, acoustic or radio frequency signal carrying the computer code.
  • transient medium is a Transmission Control Protocol/Internet Protocol (TCP/IP) signal carrying computer code over an IP network, such as the Internet.
  • the carrier medium may also include a storage medium for storing processor readable code such as a floppy disk, a hard disk, a compact disc read-only memory (CD-ROM), a magnetic tape device, or a solid state memory device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Acoustics & Sound (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
EP23153948.7A 2022-02-17 2023-01-30 Système d'affichage, procédé d'affichage et support Pending EP4231632A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022023067 2022-02-17
JP2022188591A JP2023120142A (ja) 2022-02-17 2022-11-25 記録情報表示システム、プログラム、記録情報表示方法

Publications (1)

Publication Number Publication Date
EP4231632A1 true EP4231632A1 (fr) 2023-08-23

Family

ID=85222136

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23153948.7A Pending EP4231632A1 (fr) 2022-02-17 2023-01-30 Système d'affichage, procédé d'affichage et support

Country Status (2)

Country Link
US (1) US20230262200A1 (fr)
EP (1) EP4231632A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040263636A1 (en) * 2003-06-26 2004-12-30 Microsoft Corporation System and method for distributed meetings
US20170041570A1 (en) * 2015-08-03 2017-02-09 Ricoh Company, Ltd. Communication apparatus, communication method, and communication system
US20190341050A1 (en) * 2018-05-04 2019-11-07 Microsoft Technology Licensing, Llc Computerized intelligent assistant for conferences
JP2021105688A (ja) 2019-12-27 2021-07-26 株式会社イトーキ 会議支援装置
US20210320953A1 (en) * 2020-04-10 2021-10-14 Microsoft Technology Licensing, Llc Content Recognition while Screen Sharing

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040263636A1 (en) * 2003-06-26 2004-12-30 Microsoft Corporation System and method for distributed meetings
US20170041570A1 (en) * 2015-08-03 2017-02-09 Ricoh Company, Ltd. Communication apparatus, communication method, and communication system
US20190341050A1 (en) * 2018-05-04 2019-11-07 Microsoft Technology Licensing, Llc Computerized intelligent assistant for conferences
JP2021105688A (ja) 2019-12-27 2021-07-26 株式会社イトーキ 会議支援装置
US20210320953A1 (en) * 2020-04-10 2021-10-14 Microsoft Technology Licensing, Llc Content Recognition while Screen Sharing

Also Published As

Publication number Publication date
US20230262200A1 (en) 2023-08-17

Similar Documents

Publication Publication Date Title
US20180052837A1 (en) Information processing apparatus, information processing method, and information processing system
JP2008293219A (ja) コンテンツ管理システム、コンテンツ管理システムにおける情報処理装置、情報処理装置におけるリンク情報生成方法、情報処理装置におけるリンク情報生成プログラム、及びリンク情報生成プログラムを記録した記録媒体
EP4064647A1 (fr) Système de gestion, système de communication, procédé de traitement d'informations et support
US20230292011A1 (en) Information processing system, image-capturing device, and display method
JP2016063477A (ja) 会議システム、情報処理方法、及びプログラム
EP4231632A1 (fr) Système d'affichage, procédé d'affichage et support
EP4221194A1 (fr) Système de création d'informations d'enregistrement, procédé de création d'informations d'enregistrement et moyens de support
EP4243409A1 (fr) Système et procédé d'affichage d'image, dispositif de capture d'image et moyens de support
JP2023120142A (ja) 記録情報表示システム、プログラム、記録情報表示方法
US20230289126A1 (en) System, method for adjusting audio volume, and apparatus
US20230297313A1 (en) Device management system, information processing method, information processing server, and non-transitory recording medium
US20230280961A1 (en) Device management system, information processing system, information processing device, device management method, and non-transitory recording medium
US20240004921A1 (en) Information processing system, information processing method, and non-transitory recording medium
JP2023120068A (ja) 音声処理システム、デバイス、音声処理方法
US20240194226A1 (en) Video editing device, video editing method, and computer program
US11936701B2 (en) Media distribution system, communication system, distribution control apparatus, and distribution control method
US20240031653A1 (en) Information processing server, record creation system, display control method, and non-transitory recording medium
EP4294019A1 (fr) Terminal d'affichage, système de communication, procédé d'affichage et procédé de communication
JP2019050444A (ja) データ編集のためのインターフェイス装置、キャプチャ装置、画像処理装置、データ編集方法及びデータ編集プログラム
WO2024094115A1 (fr) Procédé de partage de contenu et dispositif associé
US20230308622A1 (en) Display terminal, displaying method, and recording medium
JP2024008632A (ja) 情報処理システム、表示方法、プログラム、記録情報作成システム
JP2024025003A (ja) 記録情報作成システム、情報処理システム、プログラム
JP2024033276A (ja) 通信システム、情報処理システム、動画作成方法、プログラム
JP2016001785A (ja) コンテンツの視聴者に情報を提供するための装置およびプログラム

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230130

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR