WO2022239152A1 - Information presentation device, information presentation method, and program - Google Patents

Information presentation device, information presentation method, and program

Info

Publication number
WO2022239152A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
display
information presentation
presentation device
positional relationship
Prior art date
Application number
PCT/JP2021/018067
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
宇翔 草深
Original Assignee
日本電信電話株式会社
Priority date
Filing date
Publication date
Application filed by 日本電信電話株式会社
Priority to JP2023520656A priority Critical patent/JPWO2022239152A1/ja
Priority to PCT/JP2021/018067 priority patent/WO2022239152A1/ja
Publication of WO2022239152A1 publication Critical patent/WO2022239152A1/ja

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Definitions

  • The present invention relates to an information presentation device, an information presentation method, and a program.
  • In a face-to-face service situation, a respondent (hereinafter, for example, a customer service staff member) and a conversation partner (hereinafter, the counterpart of the respondent) communicate by exchanging information to be conveyed, such as characters, character information expressing the intention of the conversation partner, and character information that allows the conversation partner to understand the respondent's responses and answers.
  • Such character information is displayed on the same screen of a single display at the same time.
  • For example, a cross-language dialogue device, which is an example of an information presentation device, translates the utterances of a respondent and a conversation partner who use different languages into character information that enables communication between them, and displays it on the same screen of a single display at the same time.
  • In some devices, processing is performed to make the character information to be conveyed to the conversation partner easier to see by turning the display direction of only the portion of the single display screen related to the conversation partner.
  • Patent Literature 1 describes a cross-language dialogue device in which each of a plurality of terminal units enables communication between users who use different languages (hereinafter referred to as conversation participants, collectively referring to a respondent and a conversation partner), and Patent Literature 2 describes a dialogue processing method that enables communication between users who use different languages.
  • Patent Literature 3 describes a display device that, when a person communicates with a conversation partner, turns the information displayed on the information display device upside down so that the information the person wants to convey is presented to the partner in an easy-to-see direction.
  • Patent Literature 1: JP 2004-355226 A; Patent Literature 2: JP 2005-322145 A; Patent Literature 3: JP 2001-195049 A
  • Patent Literatures 1 and 2 propose techniques for smooth response, such as translation, but they do not consider the position of the user or a method of dynamically controlling the layout based on the user's position. Patent Literature 3 turns the display of information upside down to make it easier for the user to see, but dynamic layout control based on the user's position is likewise not considered.
  • An object of the present invention, made in view of such circumstances, is to provide an information presentation device, an information presentation method, and a program that can dynamically change the screen layout according to the users' positional relationship.
  • An information presentation device according to an embodiment presents information to a plurality of users via a display, and includes a positional relationship estimation unit that acquires the users' position information and estimates the users' positional relationship with respect to the display, and a screen layout determination unit that dynamically determines a screen layout according to the positional relationship and outputs display information based on the screen layout to the display.
  • An information presentation method according to an embodiment is performed in an information presentation device that presents information to a plurality of users via a display, and includes: acquiring the users' position information and estimating the users' positional relationship with respect to the display; dynamically determining a screen layout according to the positional relationship; and outputting display information based on the screen layout to the display.
  • A program according to an embodiment causes a computer to function as the above information presentation device.
  • According to the present invention, information can be presented in a screen layout suited to the characteristics of the conversation, without complicated operations.
  • FIG. 1 is a block diagram showing a configuration example of an information presentation device according to a first embodiment.
  • FIGS. 2A and 2B are schematic diagrams showing screen layouts based on an estimated positional relationship.
  • FIG. 3 is a diagram illustrating dynamic changes in screen layout according to positional relationships during dialogue.
  • FIG. 4 is a flowchart showing an example of an information presentation method executed by the information presentation device according to the first embodiment.
  • FIG. 5 is a block diagram showing a configuration example of an information presentation device according to a second embodiment.
  • FIG. 6 is a flowchart showing an example of an information presentation method executed by the information presentation device according to the second embodiment.
  • FIG. 7 is a block diagram showing a configuration example of an information presentation device according to a third embodiment.
  • FIG. 8 is a flowchart showing an example of an information presentation method executed by the information presentation device according to the third embodiment.
  • FIGS. 9A and 9B are block diagrams showing configuration examples of an information presentation device according to a fourth embodiment.
  • FIGS. 10A and 10B are schematic diagrams illustrating a display that can be viewed from any position through 360 degrees.
  • FIG. 11 is a block diagram showing a configuration example of an information presentation device according to a fifth embodiment.
  • FIG. 12 is a schematic diagram of presenting information by cooperation of a plurality of devices.
  • FIG. 13 is a flowchart showing an example of an information presentation method executed by the information presentation device according to the fifth embodiment.
  • FIG. 14 is a block diagram showing a schematic configuration of a computer functioning as an information presentation device.
  • FIG. 1 is a block diagram showing a configuration example of an information presentation device 1 according to the first embodiment.
  • The information presentation device 1 shown in FIG. 1 includes a positional relationship estimation unit 11, a screen layout determination unit 12, and a screen display unit 13. The screen display unit 13 may be provided outside the information presentation device 1.
  • The positional relationship estimation unit 11 receives and acquires the users' position information and estimates the users' positional relationship with respect to the screen display unit 13 (hereinafter simply referred to as the "positional relationship"). The positional relationship estimation unit 11 then outputs positional relationship information indicating the estimated positional relationship to the screen layout determination unit 12.
  • The users' position information is acquired using, for example, a microphone, an environmental camera, Bluetooth (registered trademark), or UWB (Ultra Wide Band), operating in cooperation with the information presentation device 1, and is transmitted to the positional relationship estimation unit 11.
  • The screen layout determination unit 12 dynamically determines the screen layout of the information to be presented according to the positional relationship estimated by the positional relationship estimation unit 11.
  • The screen layout determination unit 12 then generates display information based on the determined screen layout and outputs the display information to the screen display unit 13.
  • The screen display unit 13 is a display that displays the display information input from the screen layout determination unit 12.
  • The screen display unit 13 is, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display.
  • The display may be a transmissive display.
  • In the examples of FIGS. 2A and 2B, the information presentation device 1 estimates the positional relationship among three users U1, U2, and U3, determines a screen layout based on the estimated positional relationship, and outputs display information based on the determined screen layout to the transmissive display D (screen display unit 13).
  • In FIG. 2A, user U3 is having a conversation with users U1 and U2, who stand side by side, with the transmissive display D between them.
  • In this case, normal characters are presented on the screen used by user U3, and reversed characters are presented on the screens used by users U1 and U2.
  • In FIG. 2B, users U2 and U3, who stand side by side, are having a conversation with user U1 with the transmissive display D between them.
  • In this case, normal characters are presented on the screens used by users U2 and U3, and reversed characters are presented on the screen used by user U1.
  • FIG. 3 is a diagram illustrating the dynamic change of the screen layout according to the positional relationship during dialogue.
  • The positional relationship includes mode A, in which the respondent and the conversation partner face each other across a single transmissive display D and view the screen, and mode B, in which the conversation partner views the same screen of the single transmissive display D from a side-by-side position.
  • In mode A, shown on the left side of FIG. 3, a screen layout is presented in which normal characters appear on the respondent's screen and reversed characters appear on the conversation partner's screen.
  • In mode B, shown on the right side of FIG. 3, a screen layout is presented in which normal characters appear on the screens of both the respondent and the conversation partner.
  • The screen layout is dynamically changed according to the positional relationship.
  • In mode B, the viewing directions of the respondent and the conversation partner are the same, so it is easy to converse while referring to an object on the screen.
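The two modes can be sketched as a small decision rule. The following is an illustrative sketch only; the function names, the "front"/"back" side encoding, and the two-participant restriction are assumptions, not the patent's implementation:

```python
def classify_mode(respondent_side, partner_side):
    """Mode A: the participants face each other across the transmissive
    display; mode B: both view the same screen from the same side."""
    return "A" if respondent_side != partner_side else "B"

def layout_for(mode):
    # In mode A the conversation partner's portion is rendered reversed so
    # that it reads correctly through the transmissive display; in mode B
    # both portions use normal characters.
    if mode == "A":
        return {"respondent": "normal", "partner": "reversed"}
    return {"respondent": "normal", "partner": "normal"}

print(layout_for(classify_mode("front", "back")))   # mode A
print(layout_for(classify_mode("front", "front")))  # mode B
```

A real implementation would derive the side labels from the estimated positions rather than receive them directly.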
  • FIG. 4 is a flowchart showing an example of an information presentation method executed by the information presentation device 1.
  • In step S101, upon receiving the users' position information, the positional relationship estimation unit 11 estimates the positional relationship.
  • In step S102, the screen layout determination unit 12 determines the screen layout of the information to be presented based on the positional relationship estimated by the positional relationship estimation unit 11.
  • In step S103, the screen display unit 13 displays information based on the screen layout determined by the screen layout determination unit 12.
  • In step S104, if it is determined that the conversation is ongoing, steps S101 to S103 are repeatedly executed, and when, as shown in FIG. 3, mode A changes to mode B or mode B changes to mode A, the screen layout for displaying information is changed accordingly. When it is determined that the conversation has ended, the information display ends.
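Steps S101 to S104 amount to a sense-decide-display loop. The sketch below is a minimal illustration under the assumption that position information arrives as a per-user side label ("front" or "back"); all names are hypothetical:

```python
def estimate_relationship(positions):
    """S101: estimate which side of the display each user occupies.
    `positions` is assumed to map a user name to "front" or "back"."""
    return dict(positions)

def determine_layout(relationship):
    """S102: normal characters for front-side users, mirror-reversed
    characters for users viewing the transmissive display from behind."""
    return {user: ("normal" if side == "front" else "reversed")
            for user, side in relationship.items()}

def presentation_loop(snapshots):
    """S101-S104: repeat for each position snapshot while the conversation
    is ongoing (here, until the snapshots run out), so a change such as
    mode A to mode B is reflected in the next iteration's layout."""
    layouts = []
    for positions in snapshots:
        layout = determine_layout(estimate_relationship(positions))
        layouts.append(layout)  # S103: output to the display
    return layouts

# The conversation partner U2 moves from the far side (mode A) to the
# respondent U1's side (mode B) mid-conversation.
print(presentation_loop([{"U1": "front", "U2": "back"},
                         {"U1": "front", "U2": "front"}]))
```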
  • According to the information presentation device 1, the screen layout can be dynamically changed according to the positional relationship, so information can be used with a screen layout suited to the characteristics of the conversation.
  • FIG. 5 is a block diagram showing a configuration example of the information presentation device 2 according to the second embodiment.
  • The information presentation device 2 shown in FIG. 5 includes a positional relationship estimation unit 11, a screen layout determination unit 12, a screen display unit 13, a voice acquisition unit 14, a speaker/position estimation unit 15, and a positional relationship DB 16.
  • The information presentation device 2 differs from the information presentation device 1 according to the first embodiment in that it further includes the voice acquisition unit 14, the speaker/position estimation unit 15, and the positional relationship DB 16.
  • The same reference numerals as in the first embodiment are assigned to the same configurations as in the first embodiment, and their description is omitted as appropriate.
  • The voice acquisition unit 14 is, for example, a microphone that acquires the users' uttered voice.
  • The voice acquisition unit 14 outputs voice information indicating the acquired voice to the speaker/position estimation unit 15.
  • The positional relationship DB 16 stores the respondent's use position in association with the position of the voice acquisition unit 14.
  • The speaker/position estimation unit 15 estimates the position of the speaker (the user who spoke at a given point in time) based on the voice acquired by the voice acquisition unit 14. For example, if the voice acquisition unit 14 is a directional microphone or a microphone array, the speaker/position estimation unit 15 can estimate the speaker's position based on the loudness of the voice. The speaker/position estimation unit 15 then outputs position information indicating the estimated speech position to the positional relationship estimation unit 11 and updates the speaker position information stored in the positional relationship DB 16.
  • Triggered by the end of the processing in the speaker/position estimation unit 15, the positional relationship estimation unit 11 refers to the positional relationship DB 16 and estimates the positional relationship between the speakers participating in the conversation. For example, if there are two speakers A and B, it may estimate that they are standing on the same side of the display, with speaker A on the right and speaker B on the left; as another example, it may estimate that speakers A and B are facing the display from different sides. The positional relationship estimation unit 11 then outputs positional relationship information indicating the estimated positional relationship to the screen layout determination unit 12.
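The loudness-based estimate mentioned above can be sketched as a nearest-microphone heuristic: the speech position is taken to be the position of the loudest channel. The RMS measure, channel names, and coordinates below are assumptions for illustration:

```python
import math

def rms(samples):
    """Root-mean-square level of one microphone channel."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def estimate_speech_position(channel_rms, mic_positions):
    """Return the position of the loudest microphone channel as the
    estimated speech position (nearest-microphone heuristic)."""
    loudest = max(channel_rms, key=channel_rms.get)
    return mic_positions[loudest]

# Two microphones at either end of a service counter (coordinates assumed).
mics = {"left": (-0.5, 0.0), "right": (0.5, 0.0)}
levels = {"left": rms([0.1, -0.1, 0.2]), "right": rms([0.5, -0.4, 0.6])}
print(estimate_speech_position(levels, mics))  # the right-hand speaker
```

A microphone array would normally use inter-channel time differences for finer angular resolution; loudness comparison is the simplest variant consistent with the text.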
  • FIG. 6 is a flowchart showing an example of an information presentation method executed by the information presentation device 2 according to the second embodiment.
  • Before the information presentation device 2 executes the information presentation method, as a preparation, the respondent's use position is associated with the position of the microphone and stored in the positional relationship DB 16.
  • In step S201, the voice acquisition unit 14 of the information presentation device 2 acquires the uttered voice.
  • In step S202, the speaker/position estimation unit 15 estimates the speaker and the speaker's position based on the acquired voice data, and updates the speaker and position information stored in the positional relationship DB 16.
  • In step S203, the positional relationship estimation unit 11 estimates the positional relationship of the speakers participating in the conversation based on the information stored in the positional relationship DB 16.
  • In step S204, the screen layout determination unit 12 determines the screen layout of the information to be presented based on the positional relationship estimated by the positional relationship estimation unit 11.
  • In step S205, the screen display unit 13 displays information based on the screen layout determined by the screen layout determination unit 12.
  • In step S206, if it is determined that the conversation is ongoing, steps S201 to S205 are repeatedly executed, and when, as shown in FIG. 3, mode A changes to mode B or mode B changes to mode A, the screen layout for displaying information is changed accordingly. When it is determined that the conversation has ended, the information display ends.
  • According to the information presentation device 2, the users' position information can be acquired by a simple method.
  • FIG. 7 is a block diagram showing a configuration example of the information presentation device 3 according to the third embodiment.
  • The information presentation device 3 shown in FIG. 7 includes a positional relationship estimation unit 11, a screen layout determination unit 12, a screen display unit 13, a speaker/position estimation unit 15, a positional relationship DB 16, an image acquisition unit 17, and a face authentication DB 18.
  • The information presentation device 3 differs from the information presentation device 2 according to the second embodiment in that it includes an image acquisition unit 17 instead of the voice acquisition unit 14 and further includes a face authentication DB 18.
  • The same reference numerals as in the second embodiment are assigned to the same configurations as in the second embodiment, and their description is omitted as appropriate.
  • The image acquisition unit 17 is, for example, a camera that acquires image information of the users.
  • The image acquisition unit 17 outputs the acquired image information, which shows the speaker's face and is used for face recognition and face authentication, to the speaker/position estimation unit 15.
  • Here, face authentication refers to an authentication method that identifies an individual based on the positions of facial feature points such as the eyes, nose, and mouth, and on the position and size of the face region.
  • Face recognition, in contrast, detects people but does not identify individuals. Therefore, if face authentication is performed using the information registered in the face authentication DB 18, the respondent can be distinguished from other users such as customers.
  • The face authentication DB 18 pre-registers the face authentication information required to identify the respondent as a specific individual by face authentication.
  • The speaker/position estimation unit 15 executes face recognition based on the image information acquired by the image acquisition unit 17 to estimate the users' position information. The speaker/position estimation unit 15 then outputs the estimated position information to the positional relationship estimation unit 11 and updates the users' position information stored in the positional relationship DB 16.
  • In addition, the speaker/position estimation unit 15 performs face authentication based on the face authentication information registered in the face authentication DB 18 and the image information acquired by the image acquisition unit 17, thereby distinguishing the respondent from the other users.
  • The respondent is, for example, a customer service staff member, and the other user is, for example, the respondent's conversation partner.
  • When the processing in the speaker/position estimation unit 15 ends, the positional relationship estimation unit 11 refers to the positional relationship DB 16 to estimate the positional relationship and outputs the estimated positional relationship to the screen layout determination unit 12.
  • In the second embodiment, since face authentication is not used, the respondent's use position and the microphone position are associated and stored in the positional relationship DB 16 in advance as a preparation. In the third embodiment, since face authentication is used, there is no need to predetermine the respondent's position.
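The distinction drawn here (face authentication identifies the registered respondent; face recognition only finds people) can be sketched as a nearest-embedding lookup against the registration DB. This is an illustrative sketch only; the embedding format, distance threshold, and all names are assumptions:

```python
def euclidean(a, b):
    """Euclidean distance between two face-embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def label_faces(detected, registered, threshold=0.6):
    """Label each detected face: a face whose embedding matches a registered
    embedding (face authentication) gets the respondent's name; the rest
    remain anonymous "user" entries (face recognition only). `detected` maps
    a face id to (embedding, position); `registered` maps a name to an
    embedding."""
    labels = {}
    for face_id, (embedding, position) in detected.items():
        match = next((name for name, ref in registered.items()
                      if euclidean(embedding, ref) < threshold), None)
        labels[face_id] = (match or "user", position)
    return labels

registered = {"respondent": (0.0, 1.0)}       # pre-registered in the DB
detected = {"f1": ((0.05, 0.95), (0, 1)),      # close to the registration
            "f2": ((0.9, 0.1), (2, 1))}        # an unregistered customer
print(label_faces(detected, registered))
```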
  • Note that the information presentation device 3 may further include the above-described voice acquisition unit 14 in addition to the image acquisition unit 17.
  • In that case, the voice acquisition unit 14 outputs voice information indicating the acquired voice to the speaker/position estimation unit 15.
  • The speaker/position estimation unit 15 can then also estimate the utterance position based on the voice acquired by the voice acquisition unit 14.
  • FIG. 8 is a flowchart showing an example of an information presentation method executed by the information presentation device 3 according to the third embodiment.
  • Before the information presentation device 3 executes the information presentation method, as a preparation, the respondent's information necessary for identifying the respondent as a specific individual by face authentication is registered in the face authentication DB 18.
  • In step S301, the image acquisition unit 17 of the information presentation device 3 acquires image information of the speaker's face for performing face authentication and face recognition.
  • In step S302, the speaker/position estimation unit 15 performs face authentication and face recognition to estimate the speaker and the speaker's position, and updates the speaker's position information stored in the positional relationship DB 16.
  • In step S303, the positional relationship estimation unit 11 estimates the positional relationship of the speakers participating in the conversation based on the information stored in the positional relationship DB 16.
  • In step S304, the screen layout determination unit 12 determines the screen layout of the information to be presented based on the positional relationship estimated by the positional relationship estimation unit 11.
  • In step S305, the screen display unit 13 displays information based on the screen layout determined by the screen layout determination unit 12.
  • In step S306, if it is determined that the conversation is ongoing, steps S301 to S305 are repeatedly executed, and when, as shown in FIG. 3, mode A changes to mode B or mode B changes to mode A, the screen layout for displaying information is changed accordingly. When it is determined that the conversation has ended, the information display ends.
  • According to the information presentation device 3, the users' position information can be acquired by a simple method. In addition, since the respondent can be distinguished from other users, different information can be presented to the respondent and to the other users. Furthermore, when the voice acquisition unit 14 is combined with the image acquisition unit 17, the users' position information can be acquired with higher accuracy.
  • FIGS. 9A and 9B are block diagrams showing configuration examples of an information presentation device 4 according to the fourth embodiment.
  • The information presentation device 4 according to the fourth embodiment has a display that can be viewed not only from the front but from any position through 360 degrees. For this reason, the screen display unit of the information presentation device 4 is denoted by reference numeral 13' instead of 13.
  • FIG. 9A differs from FIG. 5, the block diagram of the information presentation device 2 according to the second embodiment, in that it has the screen display unit 13'.
  • FIG. 9B differs from FIG. 7, the block diagram of the information presentation device 3 according to the third embodiment, in that it has the screen display unit 13'.
  • As shown in FIG. 10A, when a display D that can be viewed from any position through 360 degrees is used, information is output in normal characters when users U1, U2, and U3 view the screen from their respective positions.
  • FIG. 10B is a top view of FIG. 10A.
  • According to the information presentation device 4, the degree of freedom in viewing directions can be increased, and, for example, an aerial image can be displayed.
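For such a display, the portion presented to each user can be rotated to face that user's position. The following is a minimal geometric sketch; the coordinate convention (display at the origin, angles counterclockwise from the +x axis) and all names are assumptions:

```python
import math

def text_rotation_deg(user_xy, display_xy=(0.0, 0.0)):
    """Rotation, in degrees counterclockwise from the +x axis, that turns a
    user's portion of the 360-degree display toward that user's position."""
    dx = user_xy[0] - display_xy[0]
    dy = user_xy[1] - display_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Three users around the display, as in FIG. 10B (positions assumed).
for user, pos in {"U1": (1.0, 0.0), "U2": (0.0, 1.0),
                  "U3": (-1.0, 0.0)}.items():
    print(user, text_rotation_deg(pos))
```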
  • FIG. 11 is a block diagram showing a configuration example of the information presentation device 5 according to the fifth embodiment.
  • The information presentation device 5 shown in FIG. 11 includes a positional relationship estimation unit 11, a screen layout determination unit 12, a screen display unit 13a, a voice acquisition unit 14, a speaker/position estimation unit 15, a positional relationship DB 16, a screen output distribution unit 19, and a screen position DB 20.
  • The information presentation device 5 differs from the information presentation device 2 according to the second embodiment in that it further includes the screen output distribution unit 19 and the screen position DB 20.
  • The same reference numerals as in the second embodiment are assigned to the same configurations as in the second embodiment, and their description is omitted as appropriate.
  • The information presentation device 5 is further connected to one or more screen display units 13b to 13n.
  • The screen position DB 20 stores position information for each of the screen display units 13a to 13n.
  • The screen layout determination unit 12 determines the screen layout of each of the screen display units 13a to 13n based on the positional relationship estimated by the positional relationship estimation unit 11 and the position information stored in the screen position DB 20. The screen layout determination unit 12 then outputs display information based on the screen layouts of the screen display units 13a to 13n to the screen output distribution unit 19.
  • In the illustrated example, based on the positional relationship of users U1, U2, and U3 and the positions of the transmissive displays D1 and D2, the screen layout determination unit 12 generates display information to be presented to user U1 for the transmissive display D1 and display information to be presented to users U2 and U3 for the transmissive display D2.
  • Normal characters are presented on the screen used by user U2, and reversed characters are presented on the screens used by users U1 and U3.
  • The screen output distribution unit 19 distributes and transfers the display information input from the screen layout determination unit 12 to the screen display units 13a to 13n.
  • The information presentation device 5 may also present information through the cooperation of a plurality of devices. That is, the information presentation device 5 may cooperate with a device that is owned by the conversation partner and provided outside the information presentation device 5, and present information on a display provided in that device.
  • In this case, the screen display units 13b to 13n in FIG. 11 correspond to the displays of such devices, which include personal digital assistants such as smartphones and tablets.
  • The screen output distribution unit 19 transfers the display information to such a device by short-range wireless communication such as Bluetooth or UWB.
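The distribution step can be sketched as a nearest-screen lookup against the screen position DB: each user's display information is routed to the registered screen closest to that user. This is an illustrative sketch with assumed coordinates and names, not the patent's implementation:

```python
def distribute(display_info, screen_positions, user_positions):
    """Route each user's display information to the nearest registered
    screen. `screen_positions` plays the role of the screen position DB."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    out = {screen: [] for screen in screen_positions}
    for user, info in display_info.items():
        nearest = min(screen_positions,
                      key=lambda s: dist2(screen_positions[s],
                                          user_positions[user]))
        out[nearest].append(info)
    return out

# Two transmissive displays and three users, as in the example above
# (coordinates assumed).
screens = {"D1": (0.0, 0.0), "D2": (2.0, 0.0)}
users = {"U1": (0.0, 1.0), "U2": (2.0, 1.0), "U3": (2.5, 0.5)}
info = {"U1": "text for U1", "U2": "text for U2", "U3": "text for U3"}
print(distribute(info, screens, users))
```

With these assumed positions, U1's information goes to D1 while U2's and U3's go to D2, matching the two-display example described in the text.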
  • Note that the information presentation device 5 may further include the above-described image acquisition unit 17 and face authentication DB 18 in addition to the voice acquisition unit 14.
  • In that case, the image acquisition unit 17 outputs the acquired image information, which shows the speaker's face and is used for face recognition and face authentication, to the speaker/position estimation unit 15.
  • The speaker/position estimation unit 15 can then also perform face authentication and face recognition based on the image acquired by the image acquisition unit 17 to estimate the users' position information.
  • FIG. 13 is a flowchart showing an example of an information presentation method executed by the information presentation device 5 according to the fifth embodiment.
  • In the fifth embodiment, before the information presentation method is executed, the position of each screen is stored in the screen position DB 20 as a preparation.
  • In step S401, the voice acquisition unit 14 of the information presentation device 5 acquires the uttered voice.
  • In step S402, the speaker/position estimation unit 15 estimates the speaker and the speaker's position, and updates the speaker and position information stored in the positional relationship DB 16.
  • In step S403, the positional relationship estimation unit 11 estimates the positional relationship of the speakers participating in the conversation based on the information stored in the positional relationship DB 16.
  • In step S404, the screen layout determination unit 12 determines the screen layout of the information to be presented based on the positional relationship estimated by the positional relationship estimation unit 11 and the position information of each screen stored in the screen position DB 20.
  • In step S405, based on the screen layout passed from the screen layout determination unit 12, the screen output distribution unit 19 requests the screen display units 13a to 13n to display the content.
  • In step S406, the screen display units 13a to 13n output information to their screens based on the determined screen layout.
  • In step S407, if it is determined that the conversation is ongoing, steps S401 to S406 are repeatedly executed, and when, as shown in FIG. 3, mode A changes to mode B or mode B changes to mode A, the screen layout for displaying information is changed accordingly. When it is determined that the conversation has ended, the information display ends.
  • The positional relationship estimation unit 11, the screen layout determination unit 12, the speaker/position estimation unit 15, and the screen output distribution unit 19 in the information presentation devices 1, 2, 3, 4, and 5 described above form part of a control arithmetic circuit (controller). The control arithmetic circuit may be configured by dedicated hardware such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array), may be configured by a processor, or may include both.
  • FIG. 14 is a block diagram showing a schematic configuration of a computer functioning as the information presentation devices 1, 2, 3, 4 and 5.
  • The computer 100 may be a general-purpose computer, a dedicated computer, a workstation, a PC (Personal Computer), an electronic notepad, or the like.
  • Program instructions may be program code, code segments, etc. for performing the required tasks.
  • The computer 100 includes a processor 110; a ROM (Read Only Memory) 120, a RAM (Random Access Memory) 130, and a storage 140 as storage units; an input unit 150; an output unit 160; and a communication interface (I/F) 170.
  • Each component is communicatively connected to the others via a bus 180.
  • The voice acquisition unit 14 in the information presentation devices 2, 4, and 5 and the image acquisition unit 17 in the information presentation devices 3 and 4 may be constructed as the input unit 150.
  • The screen display unit 13 may be constructed as the output unit 160.
  • The ROM 120 stores various programs and various data.
  • The RAM 130 temporarily stores programs and data as a work area.
  • The storage 140 is configured by an HDD (Hard Disk Drive) or an SSD (Solid State Drive) and stores various programs, including an operating system, and various data.
  • The ROM 120 or the storage 140 stores the program according to the present invention.
  • The positional relationship DB 16, the face authentication DB 18, and the screen position DB 20 according to the embodiment of the present invention may be constructed as the storage 140.
  • The processor 110 is specifically a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), a SoC (System on a Chip), or the like, and may be configured by a plurality of processors. The processor 110 reads a program from the ROM 120 or the storage 140 and executes the program using the RAM 130 as a work area, thereby controlling each of the above components and performing various arithmetic processing. Note that at least part of these processing contents may be realized by hardware.
  • The program may be recorded on a recording medium readable by the computer 100.
  • The program can be installed in the computer 100 by using such a recording medium.
  • The recording medium on which the program is recorded may be a non-transitory recording medium.
  • The non-transitory recording medium is not particularly limited, but may be, for example, a CD-ROM, a DVD-ROM, a USB (Universal Serial Bus) memory, or the like.
  • Alternatively, this program may be downloaded from an external device via a network.
  • The positional relationship DB 16, the face authentication DB 18, and the screen position DB 20 may be constructed in such a recording medium.
  • (Appendix 1) An information presentation device that presents information to a plurality of users via a display, comprising a control unit that acquires position information of the users, estimates the positional relationship of the users with respect to the display, dynamically determines a screen layout according to the positional relationship, and outputs display information based on the screen layout to the display.
  • (Appendix 2) The information presentation device according to Appendix 1, wherein the control unit acquires the users' uttered voice and estimates the position information of the users based on the uttered voice.
  • (Appendix 3) The information presentation device according to Appendix 1, wherein the control unit acquires image information of the users and performs face recognition based on the image information to update the position information of the users.
  • (Appendix 4) The information presentation device according to Appendix 3, further comprising a face authentication DB that registers face authentication information for identifying an attendant as a specific individual by face authentication, wherein the control unit performs face authentication based on the face authentication information and the image information, and estimates the positions of the attendant and the other users separately.
  • (Appendix 5) The information presentation device according to any one of Appendices 1 to 4, wherein the control unit outputs the display information to a display that can be viewed from any position through 360 degrees.
  • (Appendix 6) The information presentation device according to any one of Appendices 1 to 5, further comprising a screen position DB that stores position information of a plurality of displays that display the display information, wherein the control unit distributes and transfers the display information to the plurality of displays and determines a screen layout for each of the plurality of displays based on the positional relationship and the position information of the displays.
  • (Appendix 7) An information presentation method in an information presentation device that presents information to a plurality of users via a display, the method including, by the information presentation device: acquiring position information of the users and estimating the positional relationship of the users with respect to the display; and dynamically determining a screen layout according to the positional relationship and outputting display information based on the screen layout to the display.
  • (Appendix 8) A non-transitory storage medium storing a computer-executable program, the program causing the computer to function as the information presentation device according to any one of Appendices 1 to 6.
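The method of Appendix 7 (estimating the users' positional relationship and dynamically deriving a screen layout from it) can be illustrated with a toy single-display sketch. The Pane class, the layout_for function, and the bearing-based ordering rule are all assumptions for illustration, not part of the claims.

```python
from dataclasses import dataclass

@dataclass
class Pane:
    user: str
    x_fraction: float  # horizontal position of the pane on the display, 0..1

def layout_for(user_angles):
    """user_angles: mapping user -> bearing (degrees) seen from the display.
    Users further to the left (smaller bearing) get panes further left, so
    each user's information appears on the side of the display nearest them."""
    ordered = sorted(user_angles, key=user_angles.get)
    n = len(ordered)
    return [Pane(u, (i + 0.5) / n) for i, u in enumerate(ordered)]

panes = layout_for({"A": -30.0, "B": 10.0, "C": 45.0})
print([(p.user, round(p.x_fraction, 2)) for p in panes])
# → [('A', 0.17), ('B', 0.5), ('C', 0.83)]
```

Re-running layout_for whenever the estimated bearings change gives the dynamic layout update of the claimed method.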
  • 11 positional relationship estimation unit
  • 12 screen layout determination unit
  • 13, 13′, 13a to 13n screen display unit (display)
  • 14 voice acquisition unit
  • 15 speaker/position estimation unit
  • 16 positional relationship DB
  • 17 image acquisition unit
  • 18 face authentication DB
  • 19 screen output distribution unit
  • 20 screen position DB
  • 100 computer
  • 110 processor
  • 120 ROM
  • 130 RAM
  • 140 storage
  • 150 input unit
  • 160 output unit
  • 170 communication interface (I/F)
  • 180 bus

PCT/JP2021/018067 2021-05-12 2021-05-12 Information presentation device, information presentation method, and program WO2022239152A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023520656A JPWO2022239152A1 (zh) 2021-05-12 2021-05-12
PCT/JP2021/018067 WO2022239152A1 (ja) 2021-05-12 2021-05-12 Information presentation device, information presentation method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/018067 WO2022239152A1 (ja) 2021-05-12 2021-05-12 Information presentation device, information presentation method, and program

Publications (1)

Publication Number Publication Date
WO2022239152A1 true WO2022239152A1 (ja) 2022-11-17

Family

ID=84028032

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/018067 WO2022239152A1 (ja) 2021-05-12 2021-05-12 Information presentation device, information presentation method, and program

Country Status (2)

Country Link
JP (1) JPWO2022239152A1 (zh)
WO (1) WO2022239152A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024176898A1 (ja) * 2023-02-24 2024-08-29 Kyocera Corporation Program, display control device, and control method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015049931A1 (ja) * 2013-10-04 2015-04-09 Sony Corporation Information processing device, information processing method, and program
WO2015098188A1 (ja) * 2013-12-27 2015-07-02 Sony Corporation Display control device, display control method, and program



Also Published As

Publication number Publication date
JPWO2022239152A1 (zh) 2022-11-17

Similar Documents

Publication Publication Date Title
CN111634188B (zh) Method and apparatus for screen projection
WO2009104564A1 (ja) Conversation server in virtual space, method for conversation, and computer program
WO2021083125A1 (zh) Call control method and related product
US20140081634A1 (en) Leveraging head mounted displays to enable person-to-person interactions
US20110316880A1 (en) Method and apparatus providing for adaptation of an augmentative content for output at a location based on a contextual characteristic
KR102193029B1 (ko) 디스플레이 장치 및 그의 화상 통화 수행 방법
US20150237300A1 (en) On Demand Experience Sharing for Wearable Computing Devices
WO2018061173A1 (ja) TV conference system, TV conference method, and program
WO2022239152A1 (ja) Information presentation device, information presentation method, and program
US20140232655A1 (en) System for transferring the operation of a device to an external apparatus
JP2020136921A (ja) Video call system and computer program
JP2019057047A (ja) Display control system, display control method, and program
US20190026265A1 (en) Information processing apparatus and information processing method
US20160364383A1 (en) Multi-channel cross-modality system for providing language interpretation/translation services
KR102511720B1 (ko) Apparatus and method for visually displaying a speaker's voice in 360-degree video
US20220270331A1 (en) XR Preferred Movement Along Planes
CN107343101A (zh) Method, apparatus, device, and storage medium for implementing directional recording
Iwata et al. Towards a mobility enhanced user interface design for multi-task environments: An experimental study on cognitive workload measurement
JP2005222316A (ja) Conversation support device, conference support system, reception service support system, and program
JP2013205995A (ja) Server, electronic device, server control method, and server control program
JP7198952B1 (ja) Insurance consultation system, agent terminal, and insurance consultation program
WO2024194946A1 (ja) Information processing device, information processing system, information processing method, and information processing program
JP7414053B2 (ja) Information processing device, program, and information communication method
EP3951724A1 (en) Information processing apparatus, information processing method, and recording medium
JP2018173910A (ja) Speech translation system and speech translation program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21941886

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023520656

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21941886

Country of ref document: EP

Kind code of ref document: A1