CN117608465A - Information processing apparatus, display method, storage medium, and computer apparatus - Google Patents

Info

Publication number
CN117608465A
Authority
CN
China
Prior art keywords
display
area
display area
information processing
conference
Prior art date
Legal status
Pending
Application number
CN202311047129.3A
Other languages
Chinese (zh)
Inventor
河崎佑一
Current Assignee
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Publication of CN117608465A

Abstract

The invention relates to an information processing apparatus, a display method, a storage medium, and a computer apparatus, and aims to provide a technique for determining the display area of text data based on the frequency of use of each area. The information processing apparatus provided by the invention can communicate with an information processing system, and includes a display control section that displays objects on a display, a communication section that receives, from the information processing system, text data converted from sound acquired by a device, a display area decision section that decides the display area in which the text data is displayed according to the frequency of use of each area on the display, and a text display section that displays the text data in the display area.

Description

Information processing apparatus, display method, storage medium, and computer apparatus
Technical Field
The invention relates to an information processing apparatus, a display method, a storage medium, and a computer apparatus.
Background
As is well known, an information processing apparatus such as an electronic blackboard is capable of displaying handwriting data drawn by a user with a dedicated electronic pen, a finger, or the like on a touch screen display. There is also a technique of converting speech of conference participants into text data by speech recognition and creating a conference record using the text data.
For example, patent document 1 (JP 2010-154089 A) discloses a technique for converting sound into text data and displaying the text data in real time. That technique converts input sound into text data by voice recognition and displays image data and text data in a shared area.
However, the conventional technique has a problem that the display area of the text data cannot be determined according to the frequency of use of each area. For example, if the display area is fixed for displaying text data, the input area of handwriting data or the like becomes narrower, and operability is lowered. Further, if the information processing apparatus simply superimposes and displays text data on the input area, operability is also lowered because handwriting data and the like are hidden by the text data.
Disclosure of Invention
The present invention has been made in view of the above-described problems, and an object thereof is to provide a technique for determining a display area of text data based on a frequency of use of each area.
The present invention provides an information processing apparatus communicable with an information processing system, including: a display control section for displaying an object on a display; a communication unit configured to receive text data converted from a sound acquired by a device from the information processing system; a display area determining unit configured to determine a display area for displaying the text data based on a frequency of use of each area on the display; and a text display section for displaying the text data on the display area.
The present invention provides a technique for determining a display area of text data based on the frequency of use of each area.
Drawings
Fig. 1 is a schematic diagram of a process in which an electronic blackboard displays an object together with text data converted from the voice of a user's utterance.
Fig. 2 is a schematic diagram of creating recorded information in which the screen of an application program executed in a teleconference is saved together with a panoramic image of the surroundings.
Fig. 3 is a schematic diagram of the configuration of the recorded information creating system.
Fig. 4 is a block diagram of the hardware configuration of the information processing system and the terminal device.
Fig. 5 is a hardware configuration block diagram of the conference apparatus.
Fig. 6 is a schematic diagram of an imaging range of the conference apparatus.
Fig. 7 is a schematic diagram of a panoramic image and the cutting out of a speaker image.
Fig. 8 is a block diagram of the hardware configuration of the electronic blackboard.
Fig. 9 is a functional block diagram of a terminal device, a conference device, and an information processing system of the recording information creation system.
Fig. 10 is a schematic diagram of the moving image recording information stored in the information storage unit.
Fig. 11 is a schematic diagram of conference information managed by the communication management section.
Fig. 12 is a schematic diagram of correspondence information stored in the correspondence information storage unit, in which the conference ID and the device identification information are associated.
Fig. 13 is a schematic diagram of the record information and the object information stored in the record information storage unit.
Fig. 14 is a schematic diagram of a structure of text data as a part of recording information.
Fig. 15 is a functional block diagram of an electronic blackboard.
Fig. 16 is a schematic diagram of device identification information and the like stored in the device information storage unit.
Fig. 17 is a schematic diagram of object information stored in the object information storage unit.
Fig. 18 is a timing chart of processing of the conference device and the electronic blackboard by the recorded information producing system.
Fig. 19 is a schematic diagram of a rectangular area and an update number.
Fig. 20 is a schematic diagram of a method of aggregating update numbers per window.
Fig. 21 is a schematic diagram of the total update number in the window and the update number set in the rectangular area.
Fig. 22 is a table for determining the size of the text.
Fig. 23 is a schematic view of a blank provided in the electronic blackboard.
Fig. 24 is a timing chart of processing of saving record information and object information in a conference.
Fig. 25 is a flowchart of the processing in which the electronic blackboard decides the display area.
Fig. 26 is a flowchart showing a process of updating the number of updates of the rectangular area by the area determination unit.
Fig. 27 is a flowchart of a process of the display area determining unit determining the display area based on the total update number.
Fig. 28 is a flowchart of a process in which the text display section displays text data in a display area.
Fig. 29 is a schematic diagram of points added to the update number, which vary according to the object type.
Fig. 30 is a flowchart showing a process of the display area determining unit for updating the number of updates of the rectangular area according to the type of the object.
Fig. 31 is a schematic view showing movement of a region.
Detailed Description
Hereinafter, the mode of carrying out the present invention will be described with reference to an information processing apparatus and a display method implemented by the information processing apparatus.
< display of text data >
Fig. 1 is a schematic diagram illustrating a process in which text data converted from a speech sound of a user is displayed together with an object by an electronic blackboard 2.
(1) The sound made by the user is transmitted to the information processing system 50 in real time through the conference device 60 and the terminal apparatus 10.
(2) The information processing system 50 converts the sound into text data while dividing the sound at silent sections, by the number of characters, or the like. The information processing system 50 transmits the text data to the electronic blackboard 2 corresponding to the conference device 60.
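For illustration only, the following is a minimal Python sketch of dividing sound at silent sections; the function name, thresholds, and the duration cap standing in for the character-count criterion are all assumptions, not the actual implementation.

```python
import numpy as np

def split_on_silence(samples: np.ndarray, rate: int,
                     frame_ms: int = 30, silence_rms: float = 500.0,
                     min_silence_ms: int = 300, max_segment_s: float = 15.0):
    """Split mono 16-bit PCM into utterance segments (start, end).

    A segment is closed when a run of quiet frames exceeds
    min_silence_ms, or when the segment grows past max_segment_s
    (standing in here for the character-count criterion)."""
    frame_len = rate * frame_ms // 1000
    segments, start, quiet_run = [], None, 0
    for pos in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[pos:pos + frame_len].astype(np.float64)
        rms = float(np.sqrt(np.mean(frame ** 2)))
        if rms >= silence_rms:
            if start is None:
                start = pos       # a new utterance begins
            quiet_run = 0
        elif start is not None:
            quiet_run += frame_ms
            if quiet_run >= min_silence_ms or (pos - start) / rate >= max_segment_s:
                segments.append((start, pos))
                start, quiet_run = None, 0
    if start is not None:
        segments.append((start, len(samples)))
    return segments
```

Each resulting segment would then be sent to voice recognition separately, so that text data arrives at the electronic blackboard 2 utterance by utterance.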
(3) The electronic blackboard 2 that receives the text data displays the text data 153 on the display screen mainly by the following two methods.
(i) The electronic blackboard 2 displays text data in the unused area 151 of the display 480. Since the object is not displayed in the unused area 151, the user can recognize the object and the text data in a state where the object is not hidden by the text data 153 and the text data 153 is not hidden by the object.
(ii) When there is no unused area 151, the electronic blackboard 2 determines an area whose frequency of use is low and sets that area as the display area 150 of the text data 153. The display area 150 is likely to contain an object that is not currently under discussion, so even if that object is hidden by the text data 153, the impact is small.
As described above, the recorded information creating system according to the present embodiment displays text data in the unused area 151, or in the display area 150 whose recent frequency of use is low, so the usability of the electronic blackboard 2 is unlikely to be reduced.
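As a minimal Python sketch of this selection rule (the rectangle representation and the counting scheme are assumptions; the embodiment's actual per-window aggregation is described later with reference to figs. 19 to 21):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int      # top-left corner, pixels
    y: int
    w: int
    h: int

def decide_display_area(update_counts: dict[Rect, int]) -> Rect:
    """Prefer an unused rectangle (update number 0); otherwise fall
    back to the rectangle with the lowest recent update number."""
    unused = [r for r, n in update_counts.items() if n == 0]
    if unused:
        return unused[0]
    return min(update_counts, key=lambda r: update_counts[r])
```

Passing the recent update number of every rectangular area on the display yields the rectangle in which the text data 153 should be displayed.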
< terms >
A conference refers to a gathering in which people discuss a matter. A conference may also be called a meeting, a gathering, an appraisal session, a briefing, a presentation, a negotiation, a discussion, a seminar, and the like.
A "stroke" is the series of operations in which the user presses an input device against the display screen, moves it continuously, and then lifts it away from the display screen. A stroke also includes tracking the user's movement without contact with the display. In that case, the electronic blackboard may start a stroke by a user gesture, by the user pressing a button with a hand or foot, or by other means such as a mouse or pointing device. The user may likewise end the stroke with the same or a different gesture, by releasing the button, or with the mouse or pointing device.
Stroke data is information displayed on the display based on the trajectory of coordinates input with the input device. The stroke data may be interpolated as needed. Handwriting data is data containing one or more pieces of stroke data. Handwriting input refers to the user's input of handwriting data. Handwriting input may be performed with a touch interface, with a tactile object such as a pen or stylus, or with the user's body. Handwriting input may also be performed by other kinds of input, such as gesture-based input, hand-tracking input, or other contactless input by the user. Although the embodiments of the present invention refer to handwriting input and handwriting input data, other ways of handwriting input may be used.
What is displayed on the display based on stroke data is called an object. In the present embodiment, an object means a display object or the like. An object obtained by handwriting recognition and conversion of stroke data may include, besides text, fixed characters such as "end", figures such as a seal, a circle, and a star, straight lines, and the like displayed as marks. Text refers to a character string (character codes) mainly containing one or more characters, and also includes numbers, symbols, and the like. Text may also be called a character string.
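For illustration, the relation between strokes, handwriting data, and objects could be modelled as follows (all field names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Stroke:
    # coordinate trajectory recorded from the input device
    points: list[tuple[float, float]]

@dataclass
class HandwritingObject:
    object_id: int
    kind: str                          # "handwriting", "text", "figure", ...
    strokes: list[Stroke] = field(default_factory=list)
    text: str = ""                     # filled in after handwriting recognition
```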
The device is a device having a function of recording sound in a conference or the like. The device transmits the voice data to the information processing system directly or indirectly. The device may also have a function of generating an image of the surroundings by means of a camera. In the present embodiment, the device is described under the name of the conference device 60.
The frequency of use of an area of the display means the frequency at which objects are input or displayed in that area. Preferably, the frequency of use is measured over a period close to the current time.
< example of a conference recording method for a teleconference >
A method of creating a conference recording from a panoramic image and an application screen will be briefly described with reference to fig. 2. Fig. 2 briefly illustrates the creation of recorded information in which the screen of an application program executed in a teleconference is saved together with a panoramic image of the surroundings. As shown in fig. 2, users at the illustrated local site 102 are holding a teleconference with another site 101 using the teleconference service system 90.
The recorded information creating system 100 of the present embodiment creates recorded information (a conference record) using a horizontal panoramic image (hereinafter referred to as a panoramic image) captured by the conference device 60, which has a microphone and a speaker, and a screen created by an application executed by the terminal device 10. Regarding sound, the recorded information creating system 100 combines the sound collected by the teleconference application 42 and the sound acquired by the conference device 60 into the recorded information. The procedure is briefly described below.
(1) The information recording application 41 and the teleconference application 42, described later, run in the terminal apparatus 10. An application program for displaying material or the like may also run. The information recording application 41 transmits the sound output from the terminal apparatus 10 (including the sound that the teleconference application 42 receives from another site) to the conference device 60 (a second device, an example of the device). The conference device 60 mixes (synthesizes) the sound it acquires itself with the sound of the teleconference application 42.
(2) The conference device 60 includes a microphone, cuts out a speaker from the panoramic image according to the direction of the acquired sound, and generates a speaker image. The conference device 60 transmits the panoramic image and the speaker image to the terminal apparatus 10.
(3) The information recording application 41 running on the terminal apparatus 10 can display the panoramic image 203 and the speaker images 204. The information recording application 41 combines any application screen selected by the user (for example, the screen 103 of the teleconference application 42) with the panoramic image 203 and the speaker images 204. For example, they are combined so that the panoramic image 203 and the speaker images 204 are placed on the left and the screen 103 of the application is placed on the right (hereinafter referred to as the combined image 105). Since the process of (3) is performed repeatedly, the combined images 105 form an animation (hereinafter referred to as the combined image animation). The information recording application 41 attaches the synthesized sound to the combined image animation to create a moving image with sound.
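A minimal sketch of one such frame composition using Pillow, assuming the layout described above (sizes and placement are simplified assumptions):

```python
from PIL import Image

def combine_frame(panorama: Image.Image, speakers: list[Image.Image],
                  app_screen: Image.Image) -> Image.Image:
    """One combined frame: panorama above the speaker images on the
    left, application screen on the right."""
    speakers_h = max((s.height for s in speakers), default=0)
    left_w = max(panorama.width, sum(s.width for s in speakers))
    canvas = Image.new(
        "RGB",
        (left_w + app_screen.width,
         max(panorama.height + speakers_h, app_screen.height)),
        "black")
    canvas.paste(panorama, (0, 0))
    x = 0
    for s in speakers:                      # speaker images in a row
        canvas.paste(s, (x, panorama.height))
        x += s.width
    canvas.paste(app_screen, (left_w, 0))   # application screen on the right
    return canvas
```

Repeating this for each frame and joining the results in time order yields the combined image animation.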
In the present embodiment, the panoramic image 203, the speaker image 204, and the screen 103 of the application are described as an example, but these may be stored separately by the information recording application 41 and set on the screen at the time of playback.
(4) The information recording application 41 accepts the editing job (the user cuts out unnecessary parts) and completes the combined image animation. The combined image animation forms part of the recorded information.
(5) The information recording application 41 transmits the generated combined image animation (with sound) to the hosting service system 70 for saving.
(6) The information recording application 41 extracts only the sound from the combined image animation (the sound from before combination may also be acquired) and transmits it to the information processing system 50. The information processing system 50 transmits the sound to the voice recognition service system 80, which converts it into text data. The text data also includes data indicating how many minutes after the start of recording each utterance was made.
In the case of text processing in real time, the conference device 60 directly transmits sound to the information processing system 50. The information processing system 50 transmits text data obtained by voice recognition to the information recording application 41 in real time.
(7) Information handling system 50 adds and saves the text data to a hosting service system 70 that maintains the combined image animation. The text data constitutes a part of the record information.
The information processing system 50 may perform charging processing for the user according to the services used. For example, the charge may be calculated based on the amount of text data, the file size of the combined image animation, the processing time, and the like.
In this way, the combined image animation displays a screen including a panoramic image around the user and a speaker image, and further displays an application program displayed in a teleconference, such as the teleconference application program 42. Participants or non-participants of the teleconference can reproduce scenes in the teleconference with a feeling of presence when viewing the combined image animation as a conference recording.
< System configuration >
Next, with reference to fig. 3, a system configuration of the recorded information producing system 100 will be described. Fig. 3 shows an example of a configuration of the recorded information creating system 100. Fig. 3 shows one of a plurality of sites (site 102) of the teleconference, and the terminal apparatus 10 of the site 102 communicates with the information processing system 50, the hosting service system 70, and the teleconference service system 90 via a network. The conference device 60 and the electronic blackboard 2 are also provided in the local site 102, and the terminal apparatus 10 is communicably connected to the conference device 60 via a USB cable or the like. The conference device 60, the electronic blackboard 2, and the information processing system 50 operate as a device management system.
At least the information recording application 41 and the teleconference application 42 are operated in the terminal apparatus 10. Teleconferencing application 42 may communicate with terminal apparatus 10 at other sites 101 via teleconferencing service system 90 over the network, and users at each site may be engaged in a conference at a remote site. The information recording application 41 generates recording information of the teleconference implemented by the teleconference application 42 using the functions of the information processing system 50 and the conference apparatus 60.
The present embodiment has been described by taking recorded information for creating a remote conference as an example, but the conference may not be a conference communicating with a remote site. That is, a meeting may be one in which only participants within one site are participating. In this case, the processing of the information recording application 41 is not changed except that the image captured by the conference device 60 and the collected sound are respectively saved without being synthesized.
The terminal apparatus 10 incorporates a camera with a normal angle of view (it may also be external), and captures a front image including the user 107 who operates the terminal apparatus 10. An image of normal angle of view is an image that is not a panoramic image; in the present embodiment, it is mainly a planar image, not a curved-surface image such as an all-celestial-sphere image. Thus, the user can conduct a conventional teleconference using the teleconference application 42 without being aware of the information recording application 41. The information recording application 41 and the conference device 60 have no influence on the teleconference application 42, apart from an increase in the processing load of the terminal apparatus 10. The teleconference application 42 may also send the panoramic image and the speaker images captured by the conference device 60 to the teleconference service system 90.
The information recording application 41 communicates with the conference device 60 to generate the recorded information. The information recording application 41 also synthesizes the sound acquired by the conference device 60 with the sound the teleconference application 42 receives from other sites. This synthesis is sometimes performed by the conference device 60. The conference device 60 is a conference apparatus including a panoramic image capturing device, a microphone, and a speaker. The camera provided in the terminal apparatus 10 can capture only a limited range in front of it, whereas the conference device 60 can capture the entire circumference around itself (it need not be the entire circumference). The conference device 60 can keep the multiple participants 106 shown in fig. 3 within its angle of view at all times.
Further, the conference device 60 performs processing such as cutting the speaker image out of the panoramic image. The conference device 60 is not limited to being placed on a desk, and may be disposed anywhere in the local site 102. Since the conference device 60 can capture full celestial sphere images, it may, for example, be mounted on the ceiling.
The information recording application 41 displays a list of the application programs running in the terminal device 10, combines images for the recorded information (creates the combined image animation), plays back the combined image animation, and accepts editing. The information recording application 41 also displays a list of teleconferences that have been held or will be held, and the like. The teleconference list is used for information related to the recorded information, so that the user can associate a teleconference with its recorded information.
The teleconference application 42 performs communication connection with other sites 101, transmission and reception of images and voices of other sites 101, display of images, output of voices, and the like.
The information recording application 41 and the teleconference application 42 may be Web applications or native applications. A Web application is an application in which a program on a Web server and a program on a Web browser cooperate to perform processing, and which does not need to be installed in the terminal apparatus 10. A native application is an application that is installed in the terminal apparatus 10 and used there. In the present embodiment, both are native applications.
The terminal device 10 may be a general-purpose information processing device having a communication function, such as PC (Personal Computer), a smart phone, or a tablet terminal. The terminal device 10 may be an electronic blackboard 2, a game machine, a personal digital assistant (Personal Digital Assistant, PDA), a wearable PC, a car navigation system, an industrial machine, a medical device, a network appliance, or the like. The terminal device 10 may be any device that runs the information recording application 41 and the teleconference application 42.
The electronic blackboard 2 displays data handwritten on the touch panel by an input means such as an electronic pen 490 or a finger on a display. The electronic blackboard 2 can communicate with the terminal device 10 and the like by wire or wireless, and reads a screen displayed by the terminal device 10 and displays the screen on a display. The electronic blackboard 2 can convert handwriting data into text data or share information displayed on a display with electronic blackboards 2 of other sites. The electronic blackboard 2 may be a system in which a projector projects an image on a simple whiteboard without a touch panel. The electronic blackboard 2 may be a tablet terminal, a notebook PC, a PDA, a game machine, or the like provided with a touch panel.
The electronic blackboard 2 can communicate with the information processing system 50. For example, the electronic blackboard 2 can receive information from the information processing system 50 by polling the information processing system 50 after being powered on.
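For example, such polling might look as follows (the endpoint path and response shape are hypothetical; only the polling pattern follows the description):

```python
import time
import requests

def poll_text_data(base_url: str, device_id: str, interval_s: float = 1.0):
    """Repeatedly ask the information processing system whether any
    text data is pending for this electronic blackboard."""
    while True:
        resp = requests.get(f"{base_url}/devices/{device_id}/pending",
                            timeout=5)
        resp.raise_for_status()
        for item in resp.json():   # e.g. one dict per text fragment
            yield item
        time.sleep(interval_s)
```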
The information processing system 50 is one or more information processing apparatuses provided on a network. The information processing system 50 has one or more server applications and basic services that cooperate with the information recording application 41 to perform processing. The server application manages teleconference lists, recording information of teleconference records, various settings, and registration paths. The basic service performs user authentication, subscription, charging processing, and the like.
All or part of the functions of the information processing system 50 may reside in a cloud environment or in an on-premises environment. The information processing system 50 may be constituted by a plurality of server apparatuses or by a single information processing apparatus. For example, the server application and the basic services may be provided by different information processing apparatuses, and the individual functions of the server application may be distributed over separate information processing apparatuses. The information processing system 50 may also be integrated with the hosting service system 70 and the voice recognition service system 80 described below.
The hosting service system 70 is storage means on the network and provides a hosting service that accepts the storage of files and the like. OneDrive (registered trademark), Google Workspace (registered trademark), Dropbox (registered trademark), and the like are widely known hosting services. The hosting service system 70 may also be an on-premises NAS (Network Attached Storage) or the like.
The voice recognition service system 80 provides a service of performing voice recognition on voice data and converting the voice data into text data. The speech recognition service system 80 may be a general commercial service or may be part of the functionality of the information handling system 50.
< hardware Structure >
With reference to fig. 4, the hardware configuration of the information processing system 50 and the terminal device 10 according to the present embodiment will be described.
Information processing system and terminal device
Fig. 4 is a schematic diagram illustrating the hardware configuration of the information processing system 50 and the terminal device 10 according to the present embodiment. As shown in fig. 4, the information processing system 50 and the terminal apparatus 10 are configured as computers, and include a CPU 501, a ROM 502, a RAM 503, an HD (Hard Disk) 504, an HDD (Hard Disk Drive) controller 505, a display 506, an external connection I/F (Interface) 508, a network I/F 509, a bus 510, a keyboard 511, a pointing device 512, an optical disk drive 514, and a media I/F 516.
The CPU 501 controls the overall operation of the information processing system 50 and the terminal apparatus 10. The ROM 502 stores programs such as the IPL used to drive the CPU 501. The RAM 503 is used as a work area of the CPU 501. The HD 504 stores various data such as programs. The HDD controller 505 controls reading and writing of various data on the HD 504 under the control of the CPU 501. The display 506 displays various information such as a cursor, menus, a window 130, text, or images. The external connection I/F 508 is an interface for connecting various external devices, for example a USB (Universal Serial Bus) memory or a printer. The network I/F 509 is an interface for data communication over a network. The bus 510 is an address bus, a data bus, or the like for electrically connecting the components such as the CPU 501 shown in fig. 4.
The keyboard 511 is an input means having a plurality of keys for inputting characters, numerical values, various instructions, or the like. The pointing device 512 is an input means for making selections and execution of various instructions, selection of a processing object, movement of a cursor, and the like. The optical disk drive 514 is an example of a device for controlling reading and writing of various data from and to the optical storage medium 513 as a removable recording medium. The optical storage medium 513 may be CD, DVD, blu-ray (registered trademark) or the like. The medium I/F516 is used to control reading and writing of data in the recording medium 515 such as a flash memory.
Conference apparatus
The hardware configuration of the conference device 60 is described with reference to fig. 5. Fig. 5 is a schematic diagram of the hardware configuration of the conference device 60, which can capture 360° images. The conference device 60 uses imaging elements to capture a 360° moving image of its surroundings at a predetermined height; the number of imaging elements may be one, or two or more. The conference device 60 need not be a dedicated device; a 360° moving-image capturing unit may be retrofitted to a PC, a digital camera, a smartphone, or the like so as to provide substantially the same functions.
As shown in fig. 5, the conference apparatus 60 includes an imaging unit 601, an image processing unit 604, an imaging control unit 605, microphones 608, a voice processing unit 609, a CPU (Central Processing Unit) 611, a ROM (Read Only Memory) 612, an SRAM (Static Random Access Memory) 613, a DRAM (Dynamic Random Access Memory) 614, an operation unit 615, a peripheral connection I/F 616, a communication unit 617, an antenna 617a, a voice sensor 618, and a Micro USB female connector 621.
The imaging unit 601 includes a wide-angle lens (a so-called fisheye lens) 602 having an angle of view of 360° for forming a hemispherical image, and an imaging element 603 (image sensor) provided for the wide-angle lens. The imaging element 603 includes an image sensor, such as a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor, that converts the optical image of the fisheye lens 602 into image information as an electric signal and outputs it, a timing generation circuit that generates horizontal and vertical synchronization signals and a pixel clock for the image sensor, a register group in which the various commands, parameters, and the like necessary for operating the imaging element are set, and so on.
An image pickup element 603 (image sensor) of the image pickup unit 601 is connected to an image processing unit 604 via a parallel I/F bus. On the other hand, the image pickup element 603 of the image pickup unit 601 and the image pickup control unit 605 are connected via a serial I/F bus (I2C bus or the like). The image processing unit 604, the imaging control unit 605, and the voice processing unit 609 are connected to the CPU611 through a bus 610. Further, a ROM612, an SRAM613, a DRAM614, an operation portion 615, a peripheral connection I/F616, a communication portion 617, a voice sensor 618, and the like are also connected to the bus 610.
The image processing unit 604 acquires image data output from the imaging element 603 via a parallel I/F bus, performs predetermined processing on the image data, and creates data of a panoramic image and a speaker image from a fisheye image. Further, image processing section 604 performs a combination process of the panoramic image and the speaker image, and outputs one moving image.
In general, the imaging control unit 605 acts as a master device, with the imaging element 603 as a slave device, and sets commands and the like in the register group of the imaging element 603 via the I2C bus. The necessary commands and the like are received from the CPU 611. The imaging control unit 605 likewise uses the I2C bus to acquire status data from the register group of the imaging element 603 and sends it to the CPU 611.
The imaging control unit 605 also instructs the imaging element 603 to output image data when the imaging start button of the operation section 615 is pressed or when an imaging start instruction is received from the PC. Some conference devices 60 provide a preview display function on a display (for example, the display of a PC or smartphone) or a function supporting moving-image display. In that case, image data is output from the imaging element 603 continuously at a prescribed frame rate (frames/min).
As described below, the imaging control unit 605 also functions as a synchronization control device, and synchronizes the output timing of the image data of the imaging element 603 in cooperation with the CPU 611. In the present embodiment, the conference device 60 is not provided with a display, but a display unit may be provided.
The microphones 608 convert sound into voice (signal) data. The voice processing unit 609 receives the voice data output from the microphones 608a, 608b, and 608c via an I/F bus, mixes the voice data, and performs predetermined processing. The voice processing unit 609 also determines the direction of the sound source (speaker) from the sound levels (volume) input from the microphones 608a, 608b, and 608c.
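One simple way to realize this determination (the microphone bearings and the level-weighted circular averaging are assumptions; the actual device may use another method):

```python
import numpy as np

MIC_BEARINGS_DEG = (0.0, 120.0, 240.0)  # assumed placement of mics 608a-608c

def estimate_source_bearing(frames: list[np.ndarray]) -> float:
    """Weight each microphone's bearing by its RMS level and average
    the directions on the unit circle."""
    levels = np.array([np.sqrt(np.mean(f.astype(np.float64) ** 2))
                       for f in frames])
    angles = np.deg2rad(MIC_BEARINGS_DEG)
    x = np.sum(levels * np.cos(angles))
    y = np.sum(levels * np.sin(angles))
    return float(np.rad2deg(np.arctan2(y, x)) % 360.0)
```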
The CPU611 executes necessary processing while controlling the overall action of the conference apparatus 60. The ROM612 stores various programs for operating the conference device 60. The SRAM613 and the DRAM614 are working memories, and store programs executed by the CPU611, data in processing, and the like. In particular a DRAM614, which holds image data during processing by the image processing unit 604 and data of the processed equidistant cylindrical projection image.
The operation unit 615 is a generic name of operation buttons such as an imaging start button 615 a. By operating the operation unit 615, the user inputs settings such as execution of power ON/OFF, execution of communication connection, various image capturing modes, image capturing conditions, and the like, in addition to starting image capturing and image capturing.
The peripheral connection I/F616 is an interface for connecting various external devices. The external device in this case is, for example, a PC or the like. The moving picture data or image data stored in the DRAM614 is transmitted to an external terminal through the peripheral connection I/F616 or recorded in an external medium.
The communication unit 617 may communicate with the cloud server via the internet by wireless communication technology such as Wi-Fi (registered trademark) via an antenna 617a provided in the conference device 60, and may transmit the stored moving image data and image data to the cloud server. The communication unit 617 may communicate with nearby devices using a short-range wireless communication technology such as BLE (Bluetooth Low Energy, registered trademark) and NFC.
The voice sensor 618 acquires 360° sound information in order to determine from which direction, over the 360° around the conference device 60 (in the horizontal plane), the louder sound is input. The voice processing unit 609 determines the direction of the strongest sound based on the input 360° sound parameters and outputs the 360° sound input direction.
Other sensors (azimuth/acceleration sensor, GPS, etc.) may be used to calculate azimuth, position, angle, acceleration, etc. for image compensation and position information addition.
The image processing unit 604 creates a panoramic image by the following method. The CPU 611 performs predetermined camera image processing such as Bayer conversion (RGB interpolation processing) on the RAW data input from the image sensor that captures spherical images, and generates a fisheye image (a curved image). It then performs dewarping (distortion correction) on the generated fisheye image (curved image) and generates a 360° panoramic image (a planar image) showing the surroundings of the conference device 60.
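As a rough illustration only: the sketch below remaps an upward-facing equidistant fisheye image directly to a horizontal panorama by nearest-neighbour lookup, whereas the embodiment describes per-segment perspective projection (see fig. 7); the focal model, field of view, and elevation range are all assumptions.

```python
import numpy as np

def dewarp_to_panorama(fisheye: np.ndarray, out_w: int = 1920,
                       out_h: int = 480, fov_deg: float = 220.0) -> np.ndarray:
    """Remap an upward-facing equidistant fisheye image to a 360-degree
    horizontal panorama via nearest-neighbour lookup."""
    h, w = fisheye.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    f = (w / 2.0) / np.deg2rad(fov_deg / 2.0)          # pixels per radian
    az = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    elev = np.linspace(np.deg2rad(60), 0.0, out_h)     # 60 deg..0 deg above horizon
    az_g, el_g = np.meshgrid(az, elev)
    theta = np.pi / 2.0 - el_g                         # angle from the zenith axis
    r = f * theta                                      # equidistant projection
    src_x = np.clip((cx + r * np.cos(az_g)).astype(int), 0, w - 1)
    src_y = np.clip((cy + r * np.sin(az_g)).astype(int), 0, h - 1)
    return fisheye[src_y, src_x]
```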
The CPU 611 creates a speaker image in the following manner. The CPU 611 creates the speaker image by cutting the speaker out of the 360° panoramic image (planar image) of the surroundings. Using the 360° sound input direction determined by the voice sensor 618 and the voice processing unit 609 as the speaker direction, the CPU 611 cuts the speaker image out of the panoramic image. Specifically, it cuts out a 30° range centred on the determined sound direction and performs face detection within it to identify the person. The CPU 611 further selects, from the cut-out speaker images, the speaker images of a specific number of people (for example, three) who spoke most recently.
The panoramic image and one or more speaker images may be transmitted to the information recording application 41, respectively, and the conference device 60 may create one image from these images and transmit the image to the information recording application 41. In the present embodiment, it is assumed that the panoramic image and one or more speaker images are transmitted from the conference device 60 to the information recording application 41, respectively.
Fig. 6 is a schematic diagram of the imaging range of the conference apparatus 60. As shown in fig. 6 (a), the imaging range of the conference apparatus 60 in the horizontal direction is 360°. As shown in fig. 6 (b), the conference device 60 takes the horizontal direction at its own height as 0° and takes a predetermined angle above and below it as the imaging range.
Fig. 7 is a schematic diagram of cutting a speaker image out of a panoramic image. As shown in fig. 7, the image captured by the conference device 60 forms a part 110 of a sphere and therefore has a three-dimensional shape. As shown in fig. 6 (b), the conference device 60 divides the angle of view into predetermined angles vertically and horizontally and performs perspective projection conversion on each division. Since a predetermined number of planar images is obtained by performing perspective projection conversion over the whole 360° horizontal range without gaps, the panoramic image 111 is obtained by joining the predetermined number of planar images side by side. The conference device 60 performs face detection within a predetermined range around the sound direction in the panoramic image, and cuts out 15° to the left and right of the face centre (30° in total), thereby generating a speaker image 112.
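A minimal sketch of the angular cut-out around the sound direction (face detection is omitted; the 30° width follows the description, while the column mapping is an assumption):

```python
import numpy as np

def cut_speaker_image(panorama: np.ndarray, sound_dir_deg: float,
                      width_deg: float = 30.0) -> np.ndarray:
    """Cut a width_deg slice of the 360-degree panorama centred on the
    estimated sound direction; face detection would then run inside it."""
    h, w = panorama.shape[:2]
    px_per_deg = w / 360.0
    half = int(width_deg / 2.0 * px_per_deg)
    centre = int(sound_dir_deg % 360.0 * px_per_deg)
    cols = [(centre + dx) % w for dx in range(-half, half)]  # wrap at 0/360 deg
    return panorama[:, cols]
```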
Electronic blackboard
Fig. 8 is a schematic diagram of the hardware configuration of the electronic blackboard 2. As shown in fig. 8, the electronic blackboard 2 includes a CPU (Central Processing Unit) 401, a ROM (Read Only Memory) 402, a RAM (Random Access Memory) 403, an SSD (Solid State Drive) 404, a network I/F 405, and a peripheral connection I/F (Interface) 406.
The CPU401 controls the operation of the electronic blackboard 2 as a whole. The ROM402 holds IPL (Initial Program Loader) and other programs for starting up the OS. The RAM403 is used as a work area of the CPU 401. The SSD404 stores various data such as programs for the electronic blackboard 2. The network I/F405 controls communication with a communication network. The peripheral connection I/F406 is an interface for connecting various external devices. The external devices in this case are, for example, USB (Universal Serial Bus) memory 430, external devices (microphone 440, speaker 450, camera 460).
The electronic blackboard 2 further includes a capture device 411, a GPU 412, a display controller 413, a contact sensor 414, a sensor controller 415, an electronic pen controller 416, a short-range communication circuit 419 with its antenna 419a, a power switch 422, and selection switches 423.
The capture device 411 captures the display information shown on the display of the external PC 470 as a still image or a moving image. The GPU (Graphics Processing Unit) 412 is a semiconductor chip dedicated to graphics processing. The display controller 413 controls and manages screen display so as to output the image output from the GPU 412 to the display 480 or the like. The contact sensor 414 detects contact of the electronic pen 490, the user's hand H, or the like with the display 480. The sensor controller 415 controls the processing of the contact sensor 414. The contact sensor 414 performs coordinate input and coordinate detection by an infrared-blocking method. In this method, two light emitting/receiving devices provided at the two upper ends of the display 480 emit a plurality of infrared rays parallel to the display 480; the rays are reflected by reflecting members provided around the display 480, and light receiving elements receive the light returning along the same optical path as the emitted light. The contact sensor 414 outputs to the sensor controller 415 the positions at which the infrared rays were blocked by an object (positions on the light receiving elements), and the sensor controller 415 determines the coordinate position that is the contact position of the object based on the two pieces of position information. The electronic pen controller 416 communicates with the electronic pen 490 by Bluetooth to determine whether the pen tip or the pen tail has touched the display 480. The short-range communication circuit 419 is a communication circuit such as NFC (Near Field Communication) or Bluetooth (registered trademark). The power switch 422 is a switch for turning the power of the electronic blackboard 2 ON and OFF. The selection switches 423 are a group of switches for adjusting, for example, the brightness and color of the display 480.
The electronic blackboard 2 also has a bus 410. The bus 410 is an address bus, a data bus, or the like for electrically connecting the respective components of the CPU401 or the like shown in fig. 8.
The contact sensor 414 is not limited to the infrared-blocking method; it may be a capacitive touch panel that determines the contact position by detecting a change in capacitance, a resistive touch panel that determines the contact position from the voltage change across two opposing resistive films, or an electromagnetic-induction touch panel that determines the contact position by detecting the electromagnetic induction generated when a contact object touches the display unit. The contact sensor 414 may use various other detection means. The electronic pen controller 416 may determine contact not only for the pen tip and pen tail of the electronic pen 490 but also for the part of the electronic pen 490 gripped by the user and other parts of the pen.
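For the infrared-blocking method described above, the contact position can be triangulated from the two blocked-beam positions; a sketch under the assumption that the two light emitting/receiving units sit at the top corners and report angles measured from the top edge of the display:

```python
import math

def touch_position(alpha_deg: float, beta_deg: float,
                   width: float) -> tuple[float, float]:
    """Triangulate the touch point from the two blocked-beam angles.

    The emitter/receiver units are assumed to sit at the top-left
    (0, 0) and top-right (width, 0) corners; each angle is measured
    from the top edge down towards the blocked beam, y grows downward."""
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    x = width * tb / (ta + tb)   # intersection of the two sight lines
    y = x * ta
    return x, y
```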
With respect to functions
Next, with reference to fig. 9, the functional configuration of the recorded information producing system 100 will be described. Fig. 9 is a functional block diagram of the terminal device 10, the conference apparatus 60, and the information processing system 50 in the recorded information producing system 100.
Terminal device
The information recording application 41 running in the terminal device 10 includes a communication unit 11, an operation receiving unit 12, a display control unit 13, an application screen acquisition unit 14, a voice acquisition unit 15, a device communication unit 16, a video recording control unit 17, a voice data processing unit 18, a video recording playback unit 19, an uploading unit 20, an editing processing unit 21, and a decoding unit 22. Each of these units of the terminal device 10 is a function or means realized by one of the components shown in fig. 4 operating according to commands from the CPU 501, which executes the information recording application 41 loaded from the HD 504 into the RAM 503. The terminal device 10 further includes a storage unit 1000 constituted by the HD 504 and the like shown in fig. 4. The storage unit 1000 contains an information storage unit 1001.
The communication unit 11 communicates various information with the information processing system 50 via a network. The communication unit 11 receives a list of teleconferences from the information processing system 50, or transmits a request for recognizing voice data to the information processing system 50, for example.
The display control unit 13 displays various screens as user interfaces in the information recording application 41 in accordance with the screen transition set in the information recording application 41. The operation receiving unit 12 receives various operations on the information recording application 41.
The application screen acquiring unit 14 acquires a desktop screen or a screen of an application selected by the user from OS (Operating System) or the like. In the case where the application selected by the user is the teleconference application 42, a screen (an image of each site, an image of material, or the like) generated by the teleconference application 42 is obtained.
The voice acquiring unit 15 acquires voice data received by the teleconference application 42 in the teleconference from the teleconference application 42. Note that the voice acquired by the voice acquisition unit 15 is only voice data received in the teleconference, and does not include the voice collected by the terminal apparatus 10. This is because the conference device 60 additionally collects sound.
The device communication unit 16 communicates with the conference device 60 using a USB cable or the like. The device communication unit 16 may communicate with the conference device 60 via a wireless LAN, bluetooth (registered trademark), or the like. The device communication unit 16 receives the panoramic image and the speaker image from the conference device 60, and transmits the voice data acquired by the voice acquisition unit 15 to the conference device 60. The device communication section 16 receives voice data synthesized by the conference device 60.
The video recording control unit 17 combines the panoramic image and the speaker image received by the device communication unit 16 with the application screen acquired by the application screen acquisition unit 14, and creates a combined image. The video recording control unit 17 generates a combined image moving image by connecting the repeatedly created combined images in time series, and creates a combined image moving image with sound by combining the voice data synthesized by the conference device 60 with the combined image moving image.
The voice data processing unit 18 requests the information processing system 50 to convert into text data the voice data that the video recording control unit 17 extracted from the combined image animation, or the synthesized voice data received from the conference device 60.
The video recording playback unit 19 plays back the combined image animation. The combined image animation is stored in the terminal device 10 before being uploaded to the information processing system 50.
The uploading unit 20 transmits the combined image animation to the information processing system 50 at the end of the teleconference.
The editing processing unit 21 performs editing (deletion of a part, stitching, etc.) of the combined image animation according to the operation of the user.
The decoding section 22 detects the two-dimensional code included in the panoramic image, and analyzes the two-dimensional code to obtain the device identification information.
Fig. 10 shows the moving image recording information stored in the information storage unit 1001. The moving image recording information includes items such as a conference ID, a video ID, an update time, a title, an uploaded flag, and a save destination (an illustrative sketch of this structure follows the item descriptions below). After the user logs in to the information processing system 50, the information recording application 41 downloads the conference information from the conference information storage unit 5001 of the information processing system 50. The conference ID and the like contained in the conference information are reflected in the moving image recording information. The moving image recording information in fig. 10 is the information held by the terminal apparatus 10 operated by a certain user.
The conference ID is identification information that identifies the teleconference held. The conference ID is a number at the time of teleconference reservation registration in the conference management system 9, or a number given by the information processing system 50 in accordance with a request from the information recording application 41.
The video ID is identification information for identifying the combined image animation recorded in the teleconference. The video ID is numbered by the conference device 60, but may instead be numbered by the information recording application 41 or the information processing system 50. If different video IDs are associated with the same conference ID, this indicates that video recording was suspended during the teleconference but restarted for some reason.
The update time is the time at which the combined image animation was updated (recording ended). If the combined image animation was edited, it is the date and time of the editing.
The title is the name of the meeting. The setting may be performed when registering a conference in the conference management system 9, or may be arbitrarily set by a user.
The upload indicates whether or not the combined image animation has been uploaded to the information processing system 50.
The save destination indicates the location (URL or file path) where the combined image animation, the text data, and the object information are saved in the hosting service system 70. The user can thus browse the uploaded combined image animation at will. The combined image animation and the text data are stored, for example, under the same URL with different file names.
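The items above could be represented, for illustration, as a record such as the following (field names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class VideoRecordInfo:
    conference_id: str     # identifies the teleconference
    video_id: str          # identifies one recording session
    updated_at: str        # time recording ended or was last edited
    title: str             # conference name
    uploaded: bool         # already sent to the information processing system?
    save_destination: str  # URL or file path in the hosting service system
```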
Conference apparatus
Returning to the description of fig. 9: the conference device 60 includes a terminal communication unit 61, a panoramic image generation unit 62, a speaker image generation unit 63, a sound collection unit 64, and a voice synthesis unit 65. Each of these units of the conference device 60 is a function or means realized by one of the components shown in fig. 5 operating according to commands from the CPU 611, which executes a program loaded from the ROM 612 into the DRAM 614.
The terminal communication unit 61 communicates with the terminal apparatus 10 using a USB cable or the like. The terminal communication unit 61 may be connected to the terminal device 10 not only by a wired cable but also by a wireless LAN, bluetooth (registered trademark), or the like.
The panoramic image generation unit 62 creates a panoramic image. The speaker image generation unit 63 creates a speaker image. These creation methods were described with reference to figs. 6 and 7.
The sound collection unit 64 converts the sound acquired by the microphones of the conference device 60 into voice data (digital data). In this way, the utterances of the user and the participants around the terminal device 10 are collected.
The voice synthesis unit 65 synthesizes the sound transmitted from the terminal device 10 and the sound collected by the sound collection unit 64. In this way, the sound from the other site 101 and the sound produced at the own site 102 are merged.
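A minimal sketch of this mixing for 16-bit PCM (simple summation with clipping; the actual device may apply gain control or echo suppression):

```python
import numpy as np

def mix_voices(remote: np.ndarray, local: np.ndarray) -> np.ndarray:
    """Sum the remote-site audio and the locally collected audio,
    clipping back into the 16-bit PCM range."""
    n = max(len(remote), len(local))
    out = np.zeros(n, dtype=np.int32)          # headroom for the sum
    out[:len(remote)] += remote.astype(np.int32)
    out[:len(local)] += local.astype(np.int32)
    return np.clip(out, -32768, 32767).astype(np.int16)
```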
Information processing system
The information processing system 50 includes a communication unit 51, an authentication unit 52, a screen generation unit 53, a communication management unit 54, a device management unit 55, and a text conversion unit 56. Each of these units of the information processing system 50 is a function or means realized by one of the components shown in fig. 4 operating according to commands from the CPU 501 following a program loaded from the HD 504 into the RAM 503. The information processing system 50 also includes a storage unit 5000 constituted by the HD 504 and the like shown in fig. 4. The storage unit 5000 includes a conference information storage unit 5001, a video information storage unit 5002, a correspondence setting information storage unit 5003, and an object information storage unit 5004. The object information storage unit 5004 will be described together with the electronic blackboard 2.
The communication unit 51 transmits and receives various information to and from the terminal device 10. The communication unit 51 transmits a list of teleconferences to the terminal apparatus 10, for example, and receives a request for recognizing voice data from the terminal apparatus 10.
The authentication unit 52 authenticates a user who operates the terminal device 10. The authentication unit 52 authenticates the user based on whether or not authentication information (user ID and password) included in the authentication request received by the communication unit 51 matches authentication information stored in advance, for example. The authentication information may be biometric authentication information such as a card number of an IC card, a face, a fingerprint, or the like. The authentication unit 52 may perform authentication by an external authentication system, OAUTH, or other authentication method.
The screen generation unit 53 supplies the information displayed by the information recording application 41. Since the screen layout is handled by the information recording application 41, the screen generation unit 53 supplies the content displayed by the information recording application 41 to the terminal device 10 in a form such as XML. When the terminal device 10 runs a Web application, the screen generation unit 53 generates the screen information displayed by the Web application. The screen information is written in HTML, XML, CSS (Cascading Style Sheets), JavaScript (registered trademark), or the like.
The communication management unit 54 acquires information on teleconferences from the conference management system 9 using each user's account or a system account assigned to the information processing system 50. It associates the conference information of reserved conferences with conference IDs and stores it in the conference information storage unit 5001. The communication management unit 54 acquires the conference information that users belonging to the tenant have authority to view. Since a conference ID is set for each conference, the teleconference and the recorded information can be associated through the conference ID.
When the device management unit 55 receives the device identification information of the electronic blackboard 2 and of the conference device 60 used in a conference, it associates the two and stores them in the correspondence setting information storage unit 5003. Accordingly, the conference ID, the device identification information of the electronic blackboard 2, and the device identification information of the conference device 60 correspond to one another. Since the conference ID also corresponds to the combined image animation, the handwriting data input on the electronic blackboard 2 also corresponds to the combined image animation. When the recording ends (the conference ends), the device management unit 55 deletes the correspondence from the correspondence setting information storage unit 5003.
The text conversion unit 56 uses the external voice recognition service system 80 to convert into text data the voice data whose conversion was requested by the terminal device 10. The text conversion unit 56 may also perform the conversion itself.
Fig. 11 shows an example of the conference information managed by the communication management unit 54 and stored in the conference information storage unit 5001. The communication management unit 54 can acquire, using the account, a list of the teleconferences for which users belonging to the tenant have viewing authority. In the present embodiment a teleconference is taken as the example, but the list of teleconferences may also include a conference held in only one conference room.
Conference information is managed by a conference ID, and associated with participants, titles (conference names), start times, end times, places, and the like. These are examples of conference information, and the conference information may also include other information.
The item of the participants lists the participants of the conference.
The item of the title indicates the conference content, such as the conference name or conference topic.
The item of the start time is the scheduled start date and time of the conference.
The item of the end time is the scheduled end date and time of the conference.
The item of the place is the place where the conference is held, such as a conference room, a branch office name, or a building.
The item of the electronic blackboard information is the identification information of the electronic blackboard 2 used in the conference.
The item of the conference device is the identification information of the conference device 60 used in the conference.
The item of viewing authority holds the user IDs registered as users having viewing authority, registered by the conference sponsor either when registering the conference information in advance or after the conference is held. For example, only the participants of each conference, the participants plus arbitrary user names, or arbitrary user names alone are registered in the conference information. When a user other than the registered users performs a search, the search unit 58 does not return search results concerning the recorded information and object information of that conference, even if they match the search.
As shown in fig. 10 and 11, the recorded combined image animation in the conference can be determined by the conference ID.
The video information stored in the video information storage unit 5002 may be the same as that of fig. 10. However, the information processing system 50 holds a list of the combined image animations recorded by all users belonging to the tenant.
Fig. 12 shows correspondence setting information in which the device identification information of the electronic blackboard 2 and the device identification information of the conference device 60 correspond to the conference ID stored in the correspondence setting information storage 5003. The correspondence setting information is held from the start of the transmission of the device identification information to the information processing system 50 by the information recording application 41 to the end of the recording.
< storage service system >
The storage service system 70 may be any service system that stores the recorded information and the object information. The recorded information storage unit 7001 stores the recorded information (combined image animation and text data) and the object information.
Fig. 13 shows the recorded information and object information stored in the recorded information storage unit 7001. As shown in fig. 13, the combined image animation, the text data, and the object information are stored as recorded information in association with a conference ID. The combined image animation contains the synthesized voice, and the text data is produced by voice recognition of the synthesized voice data. The object information is information on objects such as handwriting data input on the electronic blackboard 2, described later. Because the recorded information and the object information correspond to the conference ID, they also correspond to the conference information.
Fig. 14 is a schematic diagram of the structure of the text data that forms part of the recorded information. As shown in fig. 14, the text data has the items ID, time, recognition result string, voice data, point identification information, and speaker ID.
The ID is identification information numbered when the local point voice and the other point voices are divided according to a predetermined rule. The predetermined rule is held by at least one of the conference device 60 and the voice recognition service system 80; for example, division is performed when a silent state continues for a certain time, forced division is performed after a certain duration or word count even without silence, or division is performed for each sentence detected by morphological analysis.
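For illustration only, the division rule can be sketched as follows. The actual rule set inside the conference device 60 or the voice recognition service system 80 is not disclosed here, so the thresholds, the field names, and the omission of the word-count and morphological-analysis rules are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Fragment:
    start: float   # seconds from the start of recording (the "time" item)
    end: float
    text: str

def divide(fragments, silence_gap=1.5, max_duration=30.0):
    """Group voice fragments into ID-numbered segments.

    A new segment starts when the silence between fragments reaches
    silence_gap seconds, or when a segment would exceed max_duration
    seconds even without silence (the forced-division case).
    """
    segments, current = [], []
    for f in fragments:
        if current and (f.start - current[-1].end >= silence_gap
                        or f.end - current[0].start >= max_duration):
            segments.append(current)
            current = []
        current.append(f)
    if current:
        segments.append(current)
    return {i: seg for i, seg in enumerate(segments, start=1)}  # IDs as in fig. 14
```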
The time is the utterance time expressed as the elapsed time from the start of video recording. The absolute time at the start of recording is also saved, so the utterance time of the text in standard time can also be determined.
The recognition result string is the portion of the text data obtained by converting the divided synthesized voice by voice recognition. The synthesized voice is the voice data that is the conversion source of the recognition result string.
The voice data is the synthesized voice (already divided) obtained by synthesizing the local point voice and the other point voices after the point is determined.
The point identification information identifies the point from which the voice data was transmitted, determined from the sound pressure of the local point voice and the sound pressure of the other point voices. The point identification information is, for example, 1 for the local point and 2 for the other points.
The speaker ID is the user ID of the speaker of the recognition result string. Which participant made the utterance can be determined from the user ID. There are various ways to identify a speaker in a conference, and any method may be used: for example, registering the voiceprints of company members in advance, or having the conference device 60 detect the direction of the speaker and perform face recognition on the participant in that direction. In a venue where each speaker has a microphone, the speaker can be identified by which microphone picked up the sound.
In this way, because the text data (here, the recognition result string) is associated with the speaker ID, when the text data is searched using a participant name, the search unit 58 can look up the speaker ID and find the text data uttered by that participant.
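As a minimal sketch of that lookup, assuming text data records shaped like the items of fig. 14 and a user directory mapping user IDs to participant names (both shapes are assumptions, not structures taken from this document):

```python
def search_by_participant(records, participant_name, user_directory):
    """Return the text data uttered by the named participant via the speaker ID."""
    speaker_ids = {uid for uid, name in user_directory.items()
                   if name == participant_name}
    return [r for r in records if r["speaker_id"] in speaker_ids]
```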
< electronic blackboard >
Fig. 15 is a functional block diagram of the electronic blackboard 2. The electronic blackboard 2 includes a contact position detecting unit 31, a drawing data generating unit 32, a data recording unit 33, a display control unit 34, a code generating unit 35, a communication unit 36, an authentication unit 37, an operation receiving unit 38, a display area determination unit 39, and a text display unit 40. Each function of the electronic blackboard 2 is a function or means realized by one of the components shown in fig. 8 operating according to commands from the CPU 401 executing a program loaded from the SSD 404 into the RAM 403.
The contact position detecting unit 31 detects coordinates of a position where the electronic pen 490 contacts the contact sensor 414. The drawing data generation unit 32 obtains coordinates of the pen tip contact of the electronic pen 490 from the contact position detection unit 31. The drawing data generation unit 32 generates stroke data by interpolating the coordinate point sequence and connecting the coordinate point sequence.
The display control section 34 displays handwriting data, a menu for user operation, and the like on the display.
The data recording unit 33 stores, in the object information storage unit 3002, the handwriting data written on the electronic blackboard 2, graphics such as circles and triangles, stamps such as "done", images of PC screens, files, and the like. Handwriting data, graphics, images of PC screens, files, and the like are all handled as objects.
The communication unit 36 connects to Wi-Fi or a LAN and communicates with the information processing system 50. The communication unit 36 transmits object information to the information processing system 50, and receives object information stored in the information processing system 50 to display it on the display 480.
The code generation unit 35 encodes the device identification information of the electronic blackboard 2 stored in the device information storage unit 3001, together with information indicating that the device is usable in a conference, into a two-dimensional pattern and generates a two-dimensional code. The code generation unit 35 may also encode the device identification information and the usable-device information as a barcode. The device identification information may be a serial number, a UUID (Universally Unique Identifier), or the like, or may be set by the user.
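The encoding step might look like the following sketch, assuming the third-party qrcode package and a JSON payload; the actual payload format of the electronic blackboard 2 is not specified in this document.

```python
import json
import qrcode

def make_pairing_code(device_id, password=None):
    payload = {
        "device_id": device_id,        # serial number, UUID, or user-set value
        "usable_in_conference": True,  # the "usable in a conference" marker
    }
    if password is not None:
        payload["password"] = password  # optional authentication password
    return qrcode.make(json.dumps(payload))  # PIL image; .save("pairing.png")
```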
The authentication unit 37 authenticates the user of the electronic blackboard 2. The authentication method may be the same as the authentication section 52. The authentication unit 37 may request the authentication unit 52 to perform authentication.
The operation receiving unit 38 receives a menu, buttons, and the like displayed on the electronic blackboard 2 by a user operation.
The display area determining unit 39 determines the position and size of the display area 150 in which the text data converted from the voice is displayed on the display 480. The display area determination unit 39 sets the update number of the rectangular area to which the object is input to be larger as the object input time is later.
The text display unit 40 displays the text data converted from sound in the display area 150 determined by the display area determination unit 39. The text display unit 40 scrolls the text data from top to bottom or from bottom to top in the display area 150, and deletes text data after a predetermined time has elapsed.
The electronic blackboard 2 has a storage portion 3000 constituted by the SSD404 shown in fig. 8 and the like. The storage unit 3000 includes a device information storage unit 3001 and an object information storage unit 3002.
Fig. 16 shows device identification information and the like stored in the device information storage unit 3001.
The device identification information is identification information of the electronic blackboard 2.
The IP address is an IP address for connecting the other device to the electronic blackboard 2 via a network.
The password is used for authentication when the electronic blackboard 2 is connected to another device.
Fig. 17 is a schematic diagram of the object information stored in the object information storage unit 3002. The object information is information for managing the objects displayed on the electronic blackboard 2. The object information is transmitted to the information processing system 50 and used as a conference record. When the electronic blackboard 2 is provided at another point during the teleconference, the object information can be shared.
The item of the conference ID sets the identification information of the conference notified from the information processing system 50.
The item of the object ID sets identification information that identifies the object.
The item of the type sets the object type, such as handwriting data, a graphic, or an image. Handwriting is stroke data (sequences of coordinate points). A graphic is a geometric shape such as a triangle or quadrilateral. An image is image data such as JPEG, PNG, or TIFF acquired from a PC, the Internet, or the like. Whichever the object, its data body is stored in association with the object ID. Handwriting data may be converted into text by character recognition.
One screen of the electronic blackboard 2 is called a page. The item of the page sets its page number.
The item of coordinates sets the position of the object with reference to the designated origin of the electronic blackboard 2. The position of the object is, for example, the upper left vertex of the circumscribed rectangle of the object. The coordinates are expressed, for example, in pixel units of the display.
The item of the size sets the width and height of the circumscribed rectangle of the object.
The item of the note ID is the user ID of the user who input the object. The user logs in to the electronic blackboard 2 before starting to use it, and the user ID can be determined from the login. For example, when only one user at a time inputs on the electronic blackboard 2, the user ID of the last logged-in user corresponds to the object. When a plurality of users input on the electronic blackboard 2 simultaneously, the electronic pens and user IDs can be associated: by associating the ID of each electronic pen with a user ID in the order in which the users log in, the user ID of the user who input an object is determined from the electronic pen used for the input. Multiple user IDs may be registered in the item of the note ID, because one piece of handwriting data has multiple strokes, each of which may be handwritten by a different user. In the present embodiment, the note ID may be assigned by the information processing system 50 or by the electronic blackboard 2.
The item of the timestamp sets the time at which input of the object started. The timestamp may also be the time at which input of the object ended. For handwriting data, when strokes are divided according to the time between them, the timestamp is the input time of the first or last stroke. The timestamp may be an absolute time or an elapsed time since the start of the conference (or since the electronic blackboard was powered on).
< operation or processing >
Fig. 18 is a timing chart of the process by which the recorded information producing system 100 associates the conference device 60 with the electronic blackboard 2. It is assumed that a user participating in the conference uses the conference device 60 and the electronic blackboard 2 in the same conference.
S1: the electronic blackboard 2 in the conference room used in the conference communicates with the information processing system 50 set in advance after the power is turned on, and the device identification information of the host is specified, and the registration can be associated with the conference. The information processing system 50 also grasps the IP address of the electronic blackboard 2.
S2: the code generating unit 35 of the electronic blackboard 2 in the conference room used in the conference generates the device identification information of the host computer and the two-dimensional code indicating that the device identification information is encoded, and the two-dimensional code is displayed by the display control unit 34. The two-dimensional code may also include a password for authenticating another device by the electronic blackboard 2.
S3: the user holds the terminal apparatus 10 and the conference device 60, enters a conference room provided with the electronic blackboard 2, and connects the terminal apparatus 10 and the conference device 60 with a USB cable. The conference device 60 is started by power supply or power supply of the USB cable. Thereby, the conference device 60 becomes a standby state. The user starts the information recording application 41 of the terminal apparatus 10. The conference device 60 starts shooting and collecting sound by the information recording application 41 starting communication with the conference device 60. The panoramic image creation unit 62 of the conference device 60 creates a panoramic image (image data) including the shooting surroundings of the two-dimensional code.
S4: the terminal communication section 61 of the conference device 60 transmits the panoramic image to the terminal apparatus 10.
S5: the device communication unit 16 of the terminal apparatus 10 receives the panoramic image, and the decoding unit 22 detects the two-dimensional code displayed on the electronic blackboard 2 from the panoramic image. The decoding unit 22 decodes the two-dimensional code, and when it is determined that a device indicating that the device can be used in a conference is embedded, acquires device identification information of the electronic blackboard 2 from the two-dimensional code. The two-dimensional code may be analyzed by the conference device 60.
S6: the communication unit 11 of the information recording application 41 specifies the device identification information of the electronic blackboard 2, and transmits a request for registering a conference to the information processing system 50. The communication section 11 further transmits the device identification information of the conference device 60 to the information processing system 50. The information processing system 50 grasps the IP address of the terminal device 10.
S7: upon receiving a request for registering a conference (device identification information), the communication unit 51 of the information processing system 50 issues a conference ID to the communication management unit 54. When the information recording application 41 receives a selection of a meeting from a meeting list screen or the like, the communication unit 51 adds the meeting ID to the device identification information, so the communication management unit 54 does not issue the meeting ID.
S8: then, the device management unit 55 associates the device identification information of the electronic blackboard 2 and the conference device 60 with the conference ID, and stores the same in the association information storage unit 5003.
S9, S10: the communication unit 51 of the information processing system 50 notifies the terminal device 10 and the electronic blackboard 2 of the conference ID. The communication unit 11 of the terminal apparatus 10 receives and stores the conference ID. Similarly, the communication unit 36 of the electronic blackboard 2 stores the conference ID after receiving the conference ID. The terminal apparatus 10 receives at least one of the conference ID and the device identification information as a response to the login request for the conference. The electronic blackboard 2 and the information processing system 50 can communicate by a two-way communication system such as Web Socket which enables push-type communication from the information processing system 50 to the electronic blackboard 2.
Because the electronic blackboard 2 and the terminal apparatus 10 hold the same conference ID, the electronic blackboard 2 and the conference device 60 are associated with the conference. Thereafter, the terminal apparatus 10 adds at least one of the conference ID and the device identification information of the conference device 60 to the data it transmits, and the electronic blackboard 2 adds at least one of the conference ID and its own device identification information to the data it transmits. In the present embodiment the conference ID is added to communications, but even if only the device identification information of the electronic blackboard 2 or of the conference device 60 is added, the information processing system 50 can determine the conference ID from the correspondence setting information.
< determination of text display area position of electronic blackboard >
The method by which the electronic blackboard 2 determines the text data display area 150 is described with reference to figs. 19 to 21. Fig. 19 is a schematic diagram of the rectangular areas and update numbers; (a) shows the display 480 of the electronic blackboard 2 divided into rectangular areas 4 in a lattice. An area formed by connecting several rectangular areas 4 becomes the text display area 150. If the display area 150 is assumed to be large enough to hold about three text data entries, one rectangular area 4 may be sized to hold about two to five characters; the size of the rectangular area 4 is designed according to the number of pixels of the display 480 and the like. The user may also set the size of the rectangular area 4. Rectangular areas 4 smaller than the display area 150 are used for determining the display area 150 so that the size of the display area 150 can be varied.
The display area determination unit 39 detects the coordinates of the electronic pen 490 or finger on the display 480 from pen-down to pen-up, and sets the update number of each rectangular area 4 the coordinates pass through. The update number set is the current maximum update number among all rectangular areas 4 plus 1.
Fig. 19 (b) is a schematic diagram of update number setting. The user handwrites a stroke 120. As shown in fig. 19 (b), the stroke 120 passes through 12 rectangular areas 4, starting from the rectangular area 4 at coordinates (2, 4) and ending at the rectangular area 4 at coordinates (5, 4). The display area determination unit 39 manages an update number for each rectangular area 4. Each time the stroke 120 is detected in a rectangular area 4, 1 is added to the maximum update number among all rectangular areas 4, and the result is set in that rectangular area 4.
Fig. 19 (c) shows the update numbers set in the rectangular areas 4 by the stroke 120. The update number of coordinates (2, 4) is 1 because no update numbers had been set before. Following the order in which the stroke 120 was drawn, the update number increases by 1 per area, so the update number of coordinates (5, 4) is 12. By managing update numbers this way, there is no need to manage when each rectangular area 4 was used (no need to record the time a stroke was handwritten on it).
When the next stroke is detected, the update numbers of the rectangular areas 4 it passes through start from 13. By managing the update numbers in this way, it can be determined that a rectangular area 4 with a small update number has not been used recently, that is, its frequency of use is low.
When the user erases a stroke with the eraser function, the update number of each rectangular area 4 that held the stroke becomes 0. Because no stroke remains there, the area becomes an unused area.
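The bookkeeping of the preceding paragraphs can be sketched as follows, assuming the rectangular areas 4 form a grid addressed as (x, y); the class and method names are illustrative only.

```python
class UpdateGrid:
    def __init__(self, cols, rows):
        self.counts = [[0.0] * cols for _ in range(rows)]  # one number per area 4

    def max_count(self):
        return max(max(row) for row in self.counts)

    def record_stroke(self, cells):
        """Set each rectangular area a stroke passes through, in drawing order.

        Each area receives the current maximum update number plus 1, so
        later-used areas always carry larger numbers and no timestamps
        need to be managed.
        """
        for x, y in cells:
            self.counts[y][x] = self.max_count() + 1

    def erase(self, cells):
        """Erasing a stroke returns its areas to unused (update number 0)."""
        for x, y in cells:
            self.counts[y][x] = 0.0
```

Replaying the stroke 120 of fig. 19 (b) through record_stroke() yields the update numbers 1 to 12 of fig. 19 (c).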
Fig. 20 is a schematic diagram of how the update numbers are totaled per window 130. The display area determination unit 39 totals the update numbers within each window 130 and determines the window 130 with the smallest total as the display area 150. As shown in fig. 20, the display area determination unit 39 treats "first area number N (number of rectangular areas 4 in the height direction) × second area number M (number of rectangular areas 4 in the width direction)" rectangular areas 4 as one window 130. The window 130 is defined so that at least about three text data entries fit in it. In fig. 20, N = 2 and M = 3, but this is just one example. The display area determination unit 39 can gradually enlarge the window 130 and set a display area 150 holding more than three text data entries; the electronic blackboard 2 can thus increase the number of text data entries displayed in the display area 150 and improve visibility.
The display area determination unit 39 calculates the total of the update numbers of the rectangular areas 4 in the window 130 while shifting the window one rectangular area 4 at a time. As shown in fig. 20 (a), the display area determination unit 39 slides the window 130 one rectangular area to the right at a time from the upper-left corner of the display 480, calculating the total of the update numbers (total update number) of the rectangular areas 4 in the window 130. As shown in fig. 20 (b), when the window 130 reaches the right end of the display 480, the display area determination unit 39 lowers the window 130 by one rectangular area and resumes sliding from the left end of the display 480 (fig. 20 (c)). This is repeated until the window 130 reaches the lower-right corner of the display 480.
In practice, however, the display area determination unit 39 only needs to calculate the total update number for windows 130 that include a rectangular area 4 in which a stroke was detected.
Fig. 21 shows an example of the update numbers set in the rectangular areas 4 and the total update numbers of the windows 130. For example, the total update number of window 130a is 7, that of window 130b is 14, and that of window 130c is 0. The display area determination unit 39 determines the window 130 with the lowest frequency of use as the display area 150. In fig. 21 this is window 130c, but when there are multiple windows 130 with a total update number of 0, the display area determination unit 39 alternately increases N and M by 1 and recalculates the totals. In this way, the display area determination unit 39 can increase the size of the display area 150.
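Continuing the UpdateGrid sketch above, the window aggregation of figs. 20 and 21, including the alternate enlargement of N and M (detailed again in fig. 27 below), might look like this; the maximum sizes and the tie-breaking order are assumptions.

```python
def find_display_window(grid, n=2, m=3, n_max=6, m_max=9):
    """Return ((x, y), n, m): top-left area and size of the chosen window 130."""
    rows, cols = len(grid.counts), len(grid.counts[0])

    def totals(n, m):
        # Insertion order is top-to-bottom, left-to-right, so the first
        # zero-total window found is the highest, leftmost one.
        return {(x, y): sum(grid.counts[y + dy][x + dx]
                            for dy in range(n) for dx in range(m))
                for y in range(rows - n + 1) for x in range(cols - m + 1)}

    t = totals(n, m)
    zeros = [p for p, s in t.items() if s == 0]
    if not zeros:                          # no unused window at the minimum size:
        return min(t, key=t.get), n, m     # fall back to the least-used window
    best, grow_n = (zeros[0], n, m), True
    while n < n_max or m < m_max:          # enlarge N and M alternately while a
        if grow_n and n < n_max:           # zero-total window still exists
            n += 1
        elif m < m_max:
            m += 1
        grow_n = not grow_n
        zeros = [p for p, s in totals(n, m).items() if s == 0]
        if not zeros:
            break
        best = (zeros[0], n, m)
    return best
```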
The user may also designate the display area 150 themselves. For example, the user selects a menu for designating the display area 150 and designates the area with the electronic pen 490 or a finger. When the display area 150 is designated by the user, it takes priority. The user may move a display area 150 they set themselves, and may also move the display area 150 determined by the display area determination unit 39.
< determination of character size >
The text display unit 40 preferably determines the character size of the text data so that the text data fits completely within the display area 150. Fig. 22 is an example of a character size determination table. In fig. 22, different character sizes are set depending on whether the size of the display area 150 (number of vertical and horizontal pixels) is at or above a threshold (200 pixels or more) or below it (less than 200 pixels). The size of the display area 150 is determined by the size of one rectangular area 4 and the N and M described above.
The text display unit 40 may determine the character size not only from the size of the display area 150 but from both the size of the display area 150 and the number of characters of the text data. In that case, a character size corresponds to each combination of display area size and character count.
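A sketch of that decision follows; the 200-pixel threshold comes from the description of fig. 22 above, while the concrete point sizes and the character-count adjustment are assumed values.

```python
def decide_character_size(area_w, area_h, num_chars=None):
    size = 40 if min(area_w, area_h) >= 200 else 24   # assumed sizes per fig. 22
    if num_chars:
        # Optionally shrink so that num_chars characters fit on one line.
        size = min(size, max(12, area_w // num_chars))
    return size
```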
< display of text data using blank space >
Display of text data using blank space on the display 480 is described with reference to fig. 23. Fig. 23 is a schematic view of the blank 160 of the electronic blackboard 2. The object region 162 in which objects can be placed is limited on the electronic blackboard 2, and in some cases a blank 160 exists outside the object region 162. The blank is a portion of the display 480 on which nothing can be drawn and which remains free of graphics.
For example, in fig. 23, because the aspect ratio of the object region 162 (e.g., 16:9) differs from that of the display 480 (4:3), a blank 160 is created. When such a blank 160 exists and satisfies the minimum size of the display area 150 (holding about three text data entries), the display area determination unit 39 decides to display the text data 161 in the blank 160.
On the electronic blackboard 2, the display 480 is normally used maximally for the object region 162, so the blank 160 often does not exist. However, when a PC or the like provides the same functions as the electronic blackboard 2 through an application program, the user can set the object region 162, so a blank 160 may arise.
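Assuming the object region 162 is anchored at the upper left of the display 480 and the minimum display area 150 holds about three text data entries (both assumptions), the blank check might be sketched as:

```python
def find_blank(display_w, display_h, object_w, object_h,
               min_w=200, min_h=3 * 40):
    """Return the blank 160 beside or below the object region 162 as
    (x, y, width, height), or None if it cannot hold the minimum
    display area 150."""
    candidates = [
        (object_w, 0, display_w - object_w, display_h),   # strip to the right
        (0, object_h, display_w, display_h - object_h),   # strip below
    ]
    for x, y, w, h in candidates:
        if w >= min_w and h >= min_h:
            return x, y, w, h
    return None
```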
< processing steps >
The flow from the start of the conference to the display of the text data is described with a timing chart and flowcharts.
First, the flow from the start to the end of a conference is described with reference to fig. 24. Fig. 24 is a timing chart of the process of saving the recorded information and object information in a conference. As described for fig. 18, the correspondence between the electronic blackboard 2 and the conference device 60 is set in advance in the correspondence setting information storage unit 5003.
S21: the user operates the information recording application 41 of the terminal apparatus 10 to instruct the start of the conference. For example, the user may select a meeting from a list of meeting information. The user may additionally operate teleconferencing application 42 to initiate a teleconference with other points, or not.
S22: the operation receiving unit 12 of the terminal device 10 receives the conference start, and the communication unit 11 designates a conference ID and transmits the conference start to the information processing system 50.
S23: next, the user designates a meeting to the electronic blackboard 2 and instructs to participate in the meeting. After the process of fig. 18, the process of step S23 may be omitted.
S24: accordingly, the communication unit 36 of the electronic blackboard 2 designates the device identification information and the conference ID of the electronic blackboard 2, and transmits conference participation to the information processing system 50.
S25: in response to the start of the conference, the device communication section 16 of the information recording application 41 requests the conference device 60 to start recording.
S26: upon receiving the video recording start request, the terminal communication unit 61 of the conference device 60 starts the panoramic image creation unit 62, and the speaker image generation unit 63 starts the creation of the speaker image. The sound collecting section 64 collects surrounding sounds. The sound synthesis is performed when the teleconference is held, and the sound synthesis may be performed in any one of the conference device 60 and the terminal apparatus 10.
S27: the terminal communication unit 61 transmits the panoramic image and the speaker image and the collected voice data to the terminal device 10. The device communication unit 16 of the terminal apparatus 10 receives the panoramic image and the speaker image and the collected voice data. The video recording control unit 17 synthesizes the panoramic image, the speaker image, and the voice data of the sound collection, and further synthesizes the video of the teleconference application 42, thereby generating a synthesized image moving image. The display control unit 13 displays the combined image moving image.
S28: the communication unit 11 of the terminal apparatus 10 requests the information processing system 50 to recognize the voice data in real time. Real time means that the maximum delay time is guaranteed. Conference device 60 may send the voice data directly to information handling system 50.
S29: when the communication section 51 of the information processing system 50 receives the voice data, the text conversion section 56 converts the voice data into text data.
S30: the communication section 51 of the information processing system 50 transmits text data to the electronic blackboard 2 corresponding to the conference device 60 by the conference ID.
S31: the communication unit 36 of the electronic blackboard 2 receives the text data, and the display area determination unit 39 determines the display area 150. The text display unit 40 displays text data on the display area 150. The processing of this display will be described in detail with reference to fig. 25 to 27.
S32: the user inputs an object such as a stroke on the electronic blackboard 2. The drawing data generation unit 32 generates strokes, and the display control unit 34 displays the strokes on the display 480. The data recording unit 33 generates object information.
S33: the communication unit 36 of the electronic blackboard 2 transmits object information such as stroke data to the information processing system 50. The communication unit 36 may collectively transmit the target information after the conference is completed. The communication unit 51 of the information processing system 50 receives the object information and stores the object information in the object information storage unit 5004 in time series.
S34: after the conference is ended, the user operates the information recording application 41 of the terminal apparatus 10 to instruct the conference to end.
S35: the operation receiving unit 12 of the terminal device 10 receives the conference end, and the communication unit 11 designates a conference ID and transmits the conference end to the information processing system 50. In response to the conference end, the communication section 11 transmits the combined image moving image to the information processing system 50 together with the conference ID.
S36: the communication unit 51 of the information processing system 50 receives the combined image animation, and the communication unit 51 stores the record information (combined image animation, text data) and the object information in the hosting service system 70.
S37: in response to the conference end, the device communication section 16 of the information recording application 41 requests the conference device 60 to stop recording. Although the video recording control unit 17 stops the video recording, the information recording application 41 may continue to receive the panoramic image, the speaker image, and the voice data of the collection.
S38: the user indicates to the electronic blackboard 2 that the conference is over. Therefore, the electronic blackboard 2 no longer transmits the object information to the information processing system 50.
< determination of display area >
Fig. 25 is a flowchart illustrating an example of the process by which the electronic blackboard 2 determines the display area 150. The process of fig. 25 is executed, for example, every time the writing of a stroke is detected.
The contact position detecting unit 31 detects contact of the electronic pen 490 or the finger with the display 480 (S101).
When the contact is detected, the display area determining unit 39 updates the update number of the rectangular area 4 through which the stroke passes, and calculates the update number total for each window 130 including the rectangular area 4 (S102).
The display area determining unit 39 determines the display area 150 based on the update number total (S103). Specific processing in steps S102 and S103 will be described with reference to fig. 26 and 27.
Fig. 26 is a flowchart showing a process of the display area determining unit 39 updating the update number of the rectangular area 4. The processing of fig. 26 is carried out for each rectangular region 4.
When the object input is detected in the rectangular area 4 (yes in S111), the display area determining unit 39 determines the current maximum update number among the update numbers set in all the rectangular areas (S112).
Then, the display area determining unit 39 updates the update number of the rectangular area 4 to which the object was input to the maximum update number + 1 (S113). The display area determining unit 39 performs the above processing for each rectangular area 4.
Fig. 27 is a flowchart of the process of determining the display area 150 by the display area determining unit 39 based on the update number total.
As shown in fig. 26, the display area determining unit 39 updates the update numbers of the rectangular areas 4 in which the stroke was detected (S201). That is, for each such rectangular area 4, it repeats the process of adding 1 to the previous maximum update number and setting the result in that rectangular area 4.
Next, the display area determining unit 39 sets a preset minimum window size N, M (S202).
The display area determining unit 39 obtains the total update number for each window 130 of the set window size (S203). The totals only need to be calculated for windows 130 that contain a rectangular area 4 whose update number was updated.
Next, the display area determining unit 39 determines whether one or more windows 130 with a total update number of 0 exist (S204).
If the determination in step S204 is yes, the window size can be increased, so the display area determining unit 39 increases N or M by 1 (S205) and the process returns to step S203. If N was increased first, the display area determining unit 39 increases M by 1 in the next step S205, thereby alternately increasing N and M. To keep the display area 150 from growing unnecessarily large, maximum values may be set for N and M.
If the determination in step S204 is no, the display area determining unit 39 determines whether N and M still have their minimum values (S206).
When N and M are the minimum values (yes in S206), a window 130 with a total update number of 0 was never found in step S204, so the display area determining unit 39 determines the window 130 with the smallest total update number as the display area 150 (S207). When several windows 130 share the smallest total, the display area determining unit 39 chooses, for example, the highest window 130, and among those the window 130 at the right end or the left end, as the display area 150.
If N and M are not at the minimum values (no in S206), a window 130 with a total update number of 0 was found in step S204 at least once, so the display area determining unit 39 determines as the display area a window 130 with a total update number of 0 at the last N and M for which such a window was found (S208). When there are several windows 130 with a total update number of 0, the display area determining unit 39 chooses, for example, the highest window 130, and among those the window 130 at the right end or the left end, as the display area 150.
Fig. 28 is a flowchart of a process in which the text display unit 40 displays text data in the display area 150.
The communication unit 36 of the electronic blackboard 2 receives text data from the information processing system 50 at any time (S301).
The text display unit 40 obtains the position of the display area 150 determined by the display area determination unit 39 (S302).
The text display unit 40 refers to the character size determination table of fig. 22 and determines the character size of the text data based on the size of the display area 150 (S303).
The text display unit 40 displays the text data in the display area 150 (S304). The text display unit 40 arranges the text data in order from top to bottom or from bottom to top in the display area 150. When there is no space left in the display area 150, the text display unit 40 scrolls out the oldest text data being displayed, scrolls the remaining text data, and displays the new text data in the display area 150. The text display unit 40 need not display the text data in dialog boxes as in fig. 1, and may display the text data entries without gaps.
When text data for which a specified time has elapsed since it was displayed exists in the display area 150 (yes in S305), the text display unit 40 deletes that text data (S306). This prevents old text data from remaining.
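Steps S304 to S306 can be sketched as the following panel; the capacity of three entries follows the text above, while the expiry time is an assumed value.

```python
import time

class TextPanel:
    def __init__(self, capacity=3, expiry_s=30.0):
        self.capacity = capacity   # text data entries fitting in display area 150
        self.expiry_s = expiry_s   # delete an entry this long after display
        self.entries = []          # (shown_at, text), oldest first

    def show(self, text):
        if len(self.entries) >= self.capacity:
            self.entries.pop(0)    # scroll the oldest entry out (S304)
        self.entries.append((time.monotonic(), text))

    def purge(self):
        """Delete entries whose display time has expired (S305-S306)."""
        now = time.monotonic()
        self.entries = [(t, s) for t, s in self.entries
                        if now - t < self.expiry_s]
```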
< weighting of update number >
In the description above, the update number set in a rectangular area 4 where a stroke is detected is the previous maximum update number plus 1, but the points added to the update number may be changed according to the type of the object.
Fig. 29 shows an example of update-number points that change according to the object type. A stamp 165 is displayed on the display 480 in fig. 29. A stamp is a graphic or icon, such as a fixed character string or symbol, that can be displayed by a simple operation. For example, characters or marks often used in a conference, such as "done" or "approved", may be made into stamps 165. Because a stamp 165 does not convey conference content, the need for it to remain visible is low. Accordingly, when a stamp 165 is input, the display area determination unit 39 sets the added points lower than for a stroke; for example, the points added to the update number of a rectangular area 4 where a stamp 165 is input are 0.5. This makes it easier for text data to be displayed in the area where the stamp 165 is shown.
The electronic blackboard 2 can display a screen from the PC 470 as an image 166. The image 166 may be a still image or a moving image; a still image is described here. If text data is to be superimposed on the image 166, an area containing little important information is preferable. For example, a region whose spatial frequency is below a threshold is known to be a flat image with a small amount of information. The spatial frequency is the number of repetitions per unit length of a spatially periodic structure such as stripes; it can be obtained by a DCT (Discrete Cosine Transform) or Fourier transform of the two-dimensional image.
Accordingly, when an image 166 is input, the display area determination unit 39 calculates the spatial frequency of each rectangular area 4 and determines whether it is below the threshold. When the spatial frequency is below the threshold, the display area determination unit 39 sets the added points lower than when it is at or above the threshold: for example, 0.5 points below the threshold and 2.0 points at or above it. That is, for an image 166, the points are smaller than for stroke data in rectangular areas 4 where the spatial frequency of the image is below the threshold, and larger than for stroke data where it is at or above the threshold.
In fig. 29, the update numbers stay small in areas with little information, such as the sky, so text data is easily displayed in such areas. Although the image 166 occupies a relatively large area, text data can readily be superimposed and displayed on it.
When the image 166 is a moving image, the display area determination unit 39 changes the points based on changes in the image. The display area determination unit 39 obtains the difference between frames; if the difference is below a threshold it adds a small number of points (the frames arrive continuously, so 0 or a value close to 0 may be used), and if the difference is at or above the threshold it adds a larger number of points (such as 1).
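The weighting rules of this section might be sketched as follows. The point values 0.5, 2.0, 1.0, and 0.0 follow the examples in the text; the FFT-based flatness measure is a crude assumed stand-in for the DCT or Fourier transform mentioned above, and all thresholds are assumptions.

```python
import numpy as np

def spatial_flatness(patch, cutoff=0.25):
    """Fraction of spectral energy near zero frequency (high = flat image)."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(patch))) ** 2
    h, w = spec.shape
    ch, cw = int(h * cutoff), int(w * cutoff)
    low = spec[h // 2 - ch:h // 2 + ch, w // 2 - cw:w // 2 + cw].sum()
    return low / spec.sum()

def addition_points(kind, patch=None, prev_patch=None):
    """Points added to a rectangular area's update number per object type."""
    if kind == "stroke":
        return 1.0
    if kind == "stamp":
        return 0.5                 # stamps rarely convey conference content
    if kind == "image":            # still image: weight by flatness
        return 0.5 if spatial_flatness(patch) > 0.9 else 2.0
    if kind == "video":            # moving image: weight by frame difference
        diff = np.abs(patch.astype(float) - prev_patch.astype(float)).mean()
        return 0.0 if diff < 1.0 else 1.0
    return 1.0
```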
Fig. 30 is a flowchart showing a process of the display area determining unit 39 for updating the update number of the rectangular area 4 according to the type of the object. The processing of fig. 30 is performed for each rectangular region 4.
When the object input is detected in the rectangular area 4 (yes in S121), the display area determining unit 39 determines the current maximum update number among the update numbers set for all the rectangular areas (S122).
The display area determining unit 39 determines the input object type (S123).
Then, the display area determining unit 39 updates the update number of the rectangular area 4 to which the object was input to the maximum update number plus the points corresponding to the object type (S124). The display area determining unit 39 performs the above processing for each rectangular area 4.
When objects are input to many rectangular areas 4 at substantially the same time, as with the image 166, the display area determining unit 39 may process the rectangular areas 4 without regard to input order. In this case, the display area determining unit 39 uses the same maximum update number for all rectangular areas 4 overlapping the image 166, for example the maximum among the update numbers set in all rectangular areas 4 before the image 166 was input.
< movement of display area >
Movement of the display area 150 is explained with reference to fig. 31. Fig. 31 is a schematic diagram of movement of the display area 150. In the display area 150 determined by the display area determination unit 39, text data is displayed continuously, and the user rarely handwrites strokes there. However, because the display area 150 was originally just an area with low frequency of use, the user may want to use it again. For example, the user erases an object 171 with the eraser function; the update numbers of the rectangular areas 4 overlapping the object 171 then become 0.
Next, as shown in fig. 31 (b), when the user handwrites handwriting data 172 in the rectangular areas 4 that overlapped the object 171, their update numbers become higher than those of other areas. Because the display area determination unit 39 determines the display area 150 every time a stroke is detected, a new display area 150 can be set as soon as the handwriting data 172 is detected. Because text data is displayed in front of the handwriting data 172, the user sometimes cannot see the complete handwriting data 172; the text display unit 40 may therefore make the text data translucent while the user handwrites in the display area 150.
In fig. 31 (c), the display area 150 has moved to the upper left of the display 480. In this way, the user can handwrite even in the display area 150 after erasing strokes there, and can move the display area 150 by handwriting.
In addition to handwriting, the user can move the display area 150 to an arbitrary position by long-pressing the display area 150 or a menu operation.
When the display area determination unit 39 moves the display area 150, it may delete the original display area 150 and then display the display area 150 at the new location. Alternatively, it may display the display area 150 at the new location while leaving the original, and delete the original display area 150 after a predetermined time has elapsed.
< Main Effect >
As described above, the electronic blackboard 2 of the present embodiment displays text data in an unused area, or in a display area 150 that has been used little recently, so the usability of the electronic blackboard 2 is unlikely to be reduced.
< other applicable examples >
While the best mode for carrying out the invention has been described above with reference to the embodiments, the invention is not limited to these embodiments, and various changes and substitutions may be made therein without departing from the spirit of the invention.
For example, in the present embodiment the electronic blackboard 2 determines the display area 150 and displays text data, but a terminal device 10 given the same functions as the electronic blackboard 2 by executing an application program may likewise determine the display area 150 and display text data. The terminal device 10 is normally used as a general-purpose information processing apparatus, but when the user executes the application program, it accepts handwriting like the dedicated electronic blackboard 2. The terminal device 10 may or may not have a touch panel; the user can input characters with a keyboard and strokes with a mouse.
In the present embodiment the electronic blackboard 2 alone determines the display area 150 and displays text data, but the electronic blackboard 2 and the information processing system 50 may instead execute a Web application. The electronic blackboard 2 runs a Web browser, and by running the Web application delivered by the information processing system 50, the functions of the electronic blackboard 2 are realized; the stroke data handwritten on the electronic blackboard 2 is transmitted to the information processing system 50. In this case, the display area 150 may be determined either by the information processing system 50 or by the electronic blackboard 2.
The conversion of the sound acquired by the conference device 60 into text data may be performed by the electronic blackboard 2.
When the electronic blackboard 2 communicates with other points to share displayed objects, each electronic blackboard 2 determines its own display area 150. Because the displayed objects are the same, the electronic blackboard 2 at each point determines the display area 150 at the same position. However, when the size of the rectangular area 4 or the values of N and M are set arbitrarily by users at other points, the position and size of the display area 150 may differ. The other electronic blackboards 2 may instead share the position of the display area 150 determined by one electronic blackboard 2.
The terminal device 10 and the conference device 60 may also be integrated, or the conference device 60 may be externally attached to the terminal device 10. The conference device 60 may also be a combination of a dome camera, a microphone, and a speaker connected by cables.
The conference device 60 may also be provided at the other site 101, where it separately creates a combined image animation and text data. A plurality of conference devices 60 may be present at one site; in that case, each conference device 60 produces its own recorded information.
The layout of the panoramic image 203, the speaker image 204, and the application screen in the combined image animation used in the present embodiment is merely an example. The panoramic image 203 may be below and the speaker image 204 above, the user may change the layout, and the display and non-display of the panoramic image 203 and the speaker image 204 may each be switched during playback.
The configuration examples of figs. 9, 15, and the like are divided according to main functions to facilitate understanding of the processing of the terminal device 10, the conference device 60, and the information processing system 50. The invention is not limited by how the processing units are divided or named. The processing of the terminal device 10, the conference device 60, and the information processing system 50 may be divided into more processing units according to the processing content, or divided so that one processing unit contains more processing.
The group of devices described in the embodiments represents only one of multiple computing environments for implementing the embodiments disclosed herein. In one implementation, the information processing system 50 includes multiple computing devices, such as a server cluster. The multiple computing devices communicate with each other over a communication link of any type, including a network or shared memory, to perform the processing disclosed herein.
Further, the information processing system 50 may share the disclosed processing steps in various combinations, as in figs. 18 and 24. For example, processing performed by a given unit may be performed by a plurality of information processing apparatuses in the information processing system 50. The information processing system 50 may be consolidated in one server apparatus or divided across a plurality of apparatuses.
The functions of the above embodiments may be implemented by one or more processing circuits. A "processing circuit" here includes a processor programmed by software to perform functions, like a processor implemented as an electronic circuit, as well as devices designed to perform the functions described above, such as an ASIC (Application Specific Integrated Circuit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), and existing circuit modules.
< additional notes >
[1] An information processing apparatus communicable with an information processing system, having: a display control section for displaying an object on a display; a communication unit configured to receive text data converted from a sound acquired by a device from the information processing system; a display area determining unit configured to determine a display area for displaying the text data based on a frequency of use of each area on the display; and a text display section for displaying the text data on the display area.
[2] The information processing apparatus according to claim 1, wherein the display area determination section determines the area in which the object has been displayed as the display area.
[3] The information processing apparatus according to claim 1 or 2, wherein, when an object is input to an area, the display area determining unit manages the use frequency of the area by an update number that is updated; the update number of the area to which the object is input is set larger as the input time of the object is later, and the use frequency is determined based on the update number.
[4] The information processing apparatus according to claim 3, wherein the display area determining unit performs a summation process on the window including the area to which the object is input, obtains a summation of update numbers of the areas included in a window having a first area number in a height direction and a second area number in a width direction, and determines the window having the smallest summation of the update numbers as the display area.
[5] The information processing apparatus according to claim 4, wherein the display area determining section updates the update number of the area to which the object is input by adding a point to the update number of the area having the largest update number, and changes the point according to the kind of the object.
[6] The information processing apparatus according to claim 5, wherein when the kind of the object is stroke data, the point is larger than when the kind of the object is a stamp.
[7] The information processing apparatus according to claim 5 or 6, wherein when the kind of the object is an image, the number of points is smaller than the number of points of the stroke data in a region where a spatial frequency of the image is smaller than a threshold value, and the number of points is larger than the number of points of the stroke data in a region where the spatial frequency of the image is equal to or greater than the threshold value.
[8] The information processing apparatus according to any one of claims 4 to 7, wherein, when a window having a total update number of 0 is present, the display area determining unit increases the first area number or the second area number, performs the summation process on windows including the area to which the object is input, obtains the total of the update numbers of the areas included in the window, and determines the window with the smallest total of the update numbers as the display area.
[9] The information processing apparatus according to claim 8, wherein the text display section changes the size of the text data in accordance with the size of the display area determined by the first area number or the second area number.
[10] The information processing apparatus according to any one of claims 1 to 9, wherein the display area determination unit determines, when a blank that does not display the object exists on the display, the blank as the display area.
Description of the reference numerals
10. Terminal
50. Information processing system
60. Conference apparatus
70. Storage service system
100. Recorded information producing system

Claims (13)

1. An information processing apparatus communicable with an information processing system, having:
a display control section for displaying an object on a display;
a communication unit configured to receive text data converted from a sound acquired by a device from the information processing system;
a display area determining unit configured to determine a display area for displaying the text data based on a frequency of use of each area on the display; and
a text display unit configured to display the text data in the display area.
2. The information processing apparatus according to claim 1, wherein the display area determining unit determines an area in which the object has already been displayed as the display area.
3. The information processing apparatus according to claim 2, wherein, when an object is input to an area, the display area determining unit manages the use frequency of the area by means of an update number that is renewed at each input; the later an object is input, the larger the update number assigned to the area to which it is input, and the use frequency is determined based on the update number.
4. The information processing apparatus according to claim 3, wherein the display area determining unit performs a summation process on windows that include areas to which objects have been input, obtains the sum of the update numbers of the areas included in each window having a first area number in the height direction and a second area number in the width direction, and determines the window with the smallest sum of update numbers as the display area.
5. The information processing apparatus according to claim 4, wherein the display area determining unit updates the update number of the area to which the object is input by adding points to the largest update number among the areas, and the points change according to the kind of the object.
6. The information processing apparatus according to claim 5, wherein the points are larger when the kind of the object is stroke data than when the kind of the object is a stamp.
7. The information processing apparatus according to claim 5 or 6, wherein, when the kind of the object is an image, the points are smaller than the points for stroke data in a region where the spatial frequency of the image is below a threshold value, and larger than the points for stroke data in a region where the spatial frequency is equal to or greater than the threshold value.
8. The information processing apparatus according to claim 4, wherein, when a window whose sum of update numbers is 0 exists, the display area determining unit increases the first area number or the second area number, performs the summation process again on windows that include areas to which objects have been input, obtains the sum of the update numbers of the areas included in each window, and determines the window with the smallest sum of update numbers as the display area.
9. The information processing apparatus according to claim 8, wherein the text display unit changes the size of the text data in accordance with the size of the display area determined by the first area number or the second area number.
10. The information processing apparatus according to claim 1, wherein, when a blank area in which no object is displayed exists on the display, the display area determining unit determines the blank area as the display area.
11. A display method executed by an information processing apparatus communicable with an information processing system, the display method comprising:
a display control step of displaying an object on a display;
a communication step of receiving, from the information processing system, text data converted from sound acquired by a device;
a display area determining step of determining a display area for displaying the text data based on the frequency of use of each area on the display; and
a text display step of displaying the text data in the display area.
12. A computer-readable storage medium storing a program that causes an information processing apparatus communicable with an information processing system to function as:
a display control unit configured to display an object on a display;
a communication unit configured to receive, from the information processing system, text data converted from sound acquired by a device;
a display area determining unit configured to determine a display area for displaying the text data based on a frequency of use of each area on the display; and
a text display unit configured to display the text data in the display area.
13. A computer apparatus comprising a memory and a processor, wherein the memory stores a program that, when executed by the processor, causes the computer apparatus to function as:
a display control unit configured to display an object on a display;
a communication unit configured to receive, from the information processing system, text data converted from sound acquired by a device;
a display area determining unit configured to determine a display area for displaying the text data based on a frequency of use of each area on the display; and
a text display unit configured to display the text data in the display area.
CN202311047129.3A 2022-08-22 2023-08-18 Information processing apparatus, display method, storage medium, and computer apparatus Pending CN117608465A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022132011A JP2024029642A (en) 2022-08-22 2022-08-22 Information processing device, display method, program
JP2022-132011 2022-08-22

Publications (1)

Publication Number Publication Date
CN117608465A (en)

Family

ID=89948469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311047129.3A Pending CN117608465A (en) 2022-08-22 2023-08-18 Information processing apparatus, display method, storage medium, and computer apparatus

Country Status (2)

Country Link
JP (1) JP2024029642A (en)
CN (1) CN117608465A (en)

Also Published As

Publication number Publication date
JP2024029642A (en) 2024-03-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination