JP2017103801A - Communication terminal, communication terminal control method, and communication terminal control program - Google Patents

Communication terminal, communication terminal control method, and communication terminal control program

Info

Publication number
JP2017103801A
Authority
JP
Japan
Prior art keywords
communication terminal
user
communication
terminal
evaluation information
Prior art date
Legal status
Pending
Application number
JP2017007512A
Other languages
Japanese (ja)
Inventor
さゆり 柚木▲崎▼
Sayuri Yuzukizaki
Original Assignee
株式会社Jvcケンウッド
Jvc Kenwood Corp
Priority date
Filing date
Publication date
Application filed by 株式会社Jvcケンウッド (Jvc Kenwood Corp)
Priority to JP2017007512A
Publication of JP2017103801A
Application status is Pending

Abstract

PROBLEM TO BE SOLVED: To provide a communication terminal, a communication terminal output control method, and a communication terminal output control program that make it easy to hold simultaneous calls with a plurality of communication partners using a communication system. SOLUTION: In a control unit 15 of a communication terminal 10X that communicates with other communication terminals including at least a first communication terminal and a second communication terminal, an acquisition unit obtains information indicating the degree of attention that the user of the first communication terminal pays to display information on the user of the second communication terminal displayed on a display unit 12 of the first communication terminal. An output control unit adjusts the output related to the other communication terminals, which is output from the communication terminal to its own user, based on at least the information acquired by the acquisition unit. SELECTED DRAWING: Figure 2

Description

The present invention relates to a communication terminal capable of holding simultaneous calls with a plurality of parties, a communication terminal control method, and a communication terminal control program.

Conventionally, conference systems and video chat systems exist as means for communicating with multiple remote parties at the same time. In these systems, a virtual space constructed on a communication network is shared by a plurality of communication terminals, and image information corresponding to the user of each communication terminal is displayed on every communication terminal while mutual communication is enabled.

By displaying image information corresponding to the plurality of call partners on each communication terminal in this way, a realistic communication space can be provided.

Japanese Patent Laid-Open No. 11-234640

When talking with a plurality of parties using these systems, if several people speak at the same time, it becomes difficult to hear the content of the conversation.

To address this, there is a technique that raises the volume of the specific partner that the user of the communication terminal is gazing at, making it easier to hear the speech of the partner with a high degree of gaze.

However, with this technique only the voice of the partner the user is gazing at is raised. When that partner is talking with someone else, the voice of the person talking with the gazed-at partner remains hard to hear, so there was a problem that the conversation could not be heard naturally.

In addition, to hear both sides of such a conversation, the user of the communication terminal had to alternately gaze at the gazed-at partner and that partner's conversation partner as the conversation progressed, which was also a problem.

Therefore, an object of the present invention is to provide a communication terminal, a communication terminal output control method, and a communication terminal output control program that facilitate simultaneous calls with a plurality of parties using a communication system.

In order to achieve the above object, a first communication terminal of the present invention is a communication terminal that communicates with other communication terminals including at least a second communication terminal and a third communication terminal, and the first communication terminal is configured to adjust a display state of display information related to the third communication terminal based on a result of detecting a degree of interest of the user of the second communication terminal in the third communication terminal.

According to the communication terminal control method of the present invention, the first communication terminal communicates with other communication terminals including at least the second communication terminal and the third communication terminal, and the first communication terminal adjusts the display state of the display information related to the third communication terminal based on a result of detecting the degree of interest of the user of the second communication terminal in the third communication terminal.

The communication terminal control program of the present invention causes a computer of the first communication terminal, which communicates with other communication terminals including at least the second communication terminal and the third communication terminal, to execute a step of adjusting the display state of the display information related to the third communication terminal based on a result of detecting the degree of interest of the user of the second communication terminal in the third communication terminal.

According to the communication terminal, the communication terminal output control method, and the communication terminal output control program of the present invention, simultaneous calls with a plurality of parties using the communication system can be facilitated.

FIG. 1 is an overall view showing the configuration of a communication system using communication terminals according to the first and second embodiments of the present invention.
FIG. 2 is a block diagram showing the configuration of the communication terminal according to the first and second embodiments of the present invention.
FIG. 3 is a block diagram showing the configuration of the service management server connected to the communication terminals according to the first and second embodiments of the present invention.
FIG. 4 is a flowchart showing the operation of the communication terminal according to the first and second embodiments of the present invention.
FIG. 5 is a screen display diagram showing a state in which image information corresponding to a plurality of call partners is displayed on the display unit of the communication terminal according to the first and second embodiments of the present invention.
FIG. 6 is a graph showing the gaze point detection periods and the generation timing of gaze evaluation information in the voice control unit of the communication terminal according to the first and second embodiments of the present invention.
FIG. 7 is an explanatory drawing (upper part) showing the gaze states and speech states between users using the communication terminals according to the first embodiment of the present invention, and an explanatory drawing (lower part) showing the audio output state at each communication terminal.
FIG. 8 is an explanatory drawing (upper part) showing the gaze states and speech states between users using the communication terminals according to the second embodiment of the present invention, and an explanatory drawing (lower part) showing the audio output state at each communication terminal.

As an embodiment of the present invention, a case will be described in which user X, user A, user B, user C, and user D each use their own communication terminal and converse in a virtual space on a network using a simultaneous call service provided by a service management server. This is only an example; the communication method may be communication via the Internet or via a telephone line, and the communication terminals may also communicate directly with one another.

<< First Embodiment >>
<Configuration of Communication System Using Communication Terminal According to First Embodiment>
A configuration of a communication system using the communication terminal according to the first embodiment of the present invention will be described with reference to FIG.

In the communication system 1 according to the present embodiment, a communication terminal 10X used by user X, a communication terminal 10A used by user A, a communication terminal 10B used by user B, a communication terminal 10C used by user C, a communication terminal 10D used by user D, and the service management server 20 are connected via the network 30.

The configuration of each of the communication terminals 10X and 10A to 10D is shown in FIG. 2. Each of the communication terminals 10X and 10A to 10D has an operation input unit 11, a display unit 12, a voice input unit 13, a gaze point detection unit 14, a voice control unit 15, and a voice output unit 16. The communication terminals 10X and 10A to 10D need only each have at least a gaze point detection unit, a gaze evaluation information generation unit, a gaze evaluation information correction unit, and the like, and need not necessarily be identical communication terminals.

The operation input unit 11 is an input interface such as operation buttons or a keyboard, and accepts operations input by the user.

The display unit 12 includes a monitor and displays image information (display information) representing the users of the other communication terminals connected via the network 30 (hereinafter referred to as "other terminal users"). This image information may be an icon, a still image of the user, or a moving image of the user.

The voice input unit 13 is composed of, for example, a microphone; it converts the voice uttered by the user of the own communication terminal (hereinafter referred to as the "own terminal user") into an electric signal, converts that signal into voice information data, and transmits the voice information to the service management server 20.

The gaze point detection unit 14 is configured using an imaging device or the like, and detects the line of sight of the own terminal user to obtain coordinate information indicating the position on the display unit 12 at which the own terminal user is gazing (hereinafter referred to as the "gaze point"). Various known methods can be used to detect the position at which the user is gazing.

The voice control unit 15 includes a CPU, a memory, and the like, and includes a gaze evaluation information generation unit 151, a gaze evaluation information correction unit 152, and a volume determination unit 153 as an output control unit.

The gaze evaluation information generation unit 151 generates gaze evaluation information from the gaze point of the own terminal user detected by the gaze point detection unit 14 and transmits it to the service management server 20. The gaze evaluation information is a value obtained by evaluating the degree of gaze of the own terminal user for each other terminal user.

The gaze evaluation information correction unit 152 acquires gaze evaluation information of a plurality of other terminal users from the service management server 20, and corrects the gaze evaluation information generated by the gaze evaluation information generation unit 151 based on these. In this manner, the gaze evaluation information correction unit 152 also has a function as an acquisition unit that acquires information indicating the degree of gaze in other communication terminals.

The sound volume determination unit 153 determines the sound volume of each other terminal user based on the gaze evaluation information after being corrected by the gaze evaluation information correction unit 152.

The audio output unit 16 includes a speaker; it receives the voice information of the other terminal users from the service management server 20, adjusts it to the volume determined by the volume determination unit 153, and outputs it.

The service management server 20 provides a simultaneous call service, and includes a communication control unit 21, an image management unit 22, a voice management unit 23, and a gaze evaluation information management unit 24, as shown in FIG.

  The communication control unit 21 establishes communication between communication terminals using the simultaneous call service.

The image management unit 22 stores image information indicating the user registered in advance by the user of the simultaneous call service. Further, when communication is established between a plurality of communication terminals by the communication control unit 21, image information indicating the user of the communication terminal serving as a communication destination is transmitted to each communication terminal.

When the communication control unit 21 establishes communication between a plurality of communication terminals, the voice management unit 23 receives the voice information input from each communication terminal and transmits it to the communication terminal of the communication destination.

The gaze evaluation information management unit 24 receives the gaze evaluation information input from each communication terminal when the communication control unit 21 establishes communication between a plurality of communication terminals, and transmits it to each communication terminal.

<Operation of Communication System According to First Embodiment>
Next, the processing executed in each communication terminal when the communication terminals 10X and 10A to 10D hold a simultaneous call using the simultaneous call service in the communication system 1 according to the present embodiment will be described with reference to the flowchart of FIG. 4. Hereinafter, any one of the communication terminals 10X and 10A to 10D is referred to as the "communication terminal 10".

First, when a request to use the simultaneous call service is input from the operation input unit 11 of the communication terminal 10 by the own terminal user's operation, the service management server 20 is accessed via the network 30. Then, the communication control unit 21 of the service management server 20 establishes communication with the plurality of communication terminals 10 that are the call destinations (step S1). In the present embodiment, it is assumed that communication is established among the five communication terminals 10X and 10A to 10D in a state in which simultaneous calls are possible.

When communication is established among the communication terminals 10X and 10A to 10D, image information representing the users of the other communication terminals 10 that are the communication destinations is transmitted from the image management unit 22 to each communication terminal 10. For example, image information representing users A to D of the other communication terminals 10A to 10D that are the communication destinations is transmitted to the communication terminal 10X.

Each communication terminal 10 receives the image information representing the users of the other communication terminals 10 transmitted from the service management server 20 (step S2) and displays it on the same screen on the display unit 12 (step S3).

As an example, FIG. 5 shows a screen display diagram when image information indicating users of other communication terminals 10A to 10D is displayed on the display unit 12 of the communication terminal 10X.

The screen display diagram of FIG. 5 shows image information 121A representing user A, image information 121B representing user B, image information 121C representing user C, image information 121D representing user D, and a name information column 122 displaying a list of the name information of each user.

When communication is established among the communication terminals 10X and 10A to 10D, transmission and reception of voice information between these communication terminals is started (step S4), and the voice information of the own terminal user input from the voice input unit 13 of each communication terminal 10 is transmitted to the service management server 20. The voice information received by the service management server 20 is transmitted to the other communication terminals 10.

Next, the gaze point detection unit 14 detects the line of sight of the own terminal user to detect the own terminal user's gaze point (step S5). A known technique is used for gaze point detection; for example, the eye movement is measured from video information of the user's face captured by an imaging device (not shown) installed in the communication terminal 10, using the difference in reflectance to light between the cornea over the dark part of the eye and the sclera (the white of the eye).

Next, the gaze evaluation information generation unit 151 generates gaze evaluation information that evaluates the own terminal user's degree of gaze for each other terminal user based on the gaze point information detected by the gaze point detection unit 14. This gaze evaluation information is generated by evaluating how long the own terminal user has gazed at each piece of the other terminal users' image information displayed on the display unit 12. The generated gaze evaluation information is transmitted to the service management server 20 and acquired by the gaze evaluation information management unit 24 (step S6).

In the present embodiment, as shown in FIG. 6, each communication terminal 10 generates gaze evaluation information every second based on the gaze point information detected in the most recent 5 seconds and transmits it to the service management server 20.

In FIG. 6, the periods (t0 to t5, t1 to t6, t2 to t7, ...) indicated by the bold lines are the gaze point detection target periods, and the end point of each period (t5, t6, t7, ...) is the generation timing of the gaze evaluation information and the timing of its transmission to the service management server 20.
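To make this timing concrete, the following non-normative Python sketch (the class name, the assumption of one gaze-point sample per second, and the method names are illustrative only, not part of the disclosure) keeps the most recent 5 seconds of gaze-point samples and reports, for each other terminal user, the percentage of that window spent gazing at that user's image information:

from collections import deque

class GazeEvaluationGenerator:
    """Illustrative sketch of gaze evaluation information generation.

    Assumes one gaze-point sample per second; with a 5-sample window this
    mirrors the detection periods t0-t5, t1-t6, ... of FIG. 6, with a new
    evaluation generated every second.
    """

    def __init__(self, window_seconds=5):
        self.samples = deque(maxlen=window_seconds)  # most recent gaze targets

    def add_sample(self, gazed_user):
        # gazed_user: the other terminal user whose image information the gaze
        # point currently falls on, or None if it falls on no user's image.
        self.samples.append(gazed_user)

    def generate(self, other_users):
        # Percentage of the window spent gazing at each other terminal user.
        window = len(self.samples) or 1
        return {u: 100.0 * sum(1 for s in self.samples if s == u) / window
                for u in other_users}

# Example matching FIG. 7: user X gazes at user A for 4 of the last 5 seconds.
gen = GazeEvaluationGenerator()
for target in ["A", "A", "A", "A", None]:
    gen.add_sample(target)
print(gen.generate(["A", "B", "C", "D"]))  # {'A': 80.0, 'B': 0.0, 'C': 0.0, 'D': 0.0}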

When the gaze evaluation information management unit 24 of the service management server 20 acquires the gaze evaluation information generated by each communication terminal 10, the gaze evaluation information of the other communication terminals 10 is transmitted to each communication terminal.

That is, the gaze evaluation information of the communication terminals 10A to 10D is transmitted to the communication terminal 10X, the gaze evaluation information of the communication terminals 10X and 10B to 10D is transmitted to the communication terminal 10A, the gaze evaluation information of the communication terminals 10X, 10A, 10C, and 10D is transmitted to the communication terminal 10B, the gaze evaluation information of the communication terminals 10X, 10A, 10B, and 10D is transmitted to the communication terminal 10C, and the gaze evaluation information of the communication terminals 10X and 10A to 10C is transmitted to the communication terminal 10D.

In each communication terminal 10, the gaze evaluation information of the other communication terminals 10 transmitted from the service management server 20 is acquired by the gaze evaluation information correction unit 152 (step S7). Then, the gaze evaluation information correction unit 152 corrects the gaze evaluation information of the own terminal user generated by the gaze evaluation information generation unit 151 based on the acquired gaze evaluation information of the other communication terminals 10 (step S8).

Next, the sound volume determining unit 153 determines the sound volume for each other terminal user so that the sound volume increases as the gaze evaluation information value increases (step S9).

The voice output unit 16 adjusts and outputs the voice information of the other terminal users received from the service management server 20 to the determined volume (step S10).

The processes in steps S5 to S10 described above are repeated until the call connection is disconnected (step S11).

The gaze evaluation information generation process, correction process, and volume determination process executed in steps S6 to S9 described above will be described with reference to a specific example shown in FIG.

In this example, as shown in the upper part of FIG. 7, while user A and user B are having a conversation, during the 5 seconds subject to gaze point detection, user X gazes at user A's image information for 4 seconds, user A gazes at user B's image information for 4 seconds, user B gazes at user A's image information for 5 seconds, and user C gazes at user X's image information for 5 seconds.

In this case, in the communication terminal 10X, since the own terminal user X gazes at the other terminal user A for 4 of the 5 seconds, the gaze evaluation information for the other terminal user A is generated as "80%", and since the own terminal user X does not gaze at the other terminal users B, C, and D during the 5 seconds (0 seconds), the gaze evaluation information for the other terminal users B, C, and D is generated as "0%".

Similarly, in the communication terminal 10A, the gaze evaluation information for the other terminal user B is generated as "80%", and the gaze evaluation information for the other terminal users X, C, and D is generated as "0%". In the communication terminal 10B, the gaze evaluation information for the other terminal user A is generated as "100%", and the gaze evaluation information for the other terminal users X, C, and D is generated as "0%".

Similarly, in the communication terminal 10C, the gaze evaluation information for the other terminal user X is generated as "100%", and the gaze evaluation information for the other terminal users A, B, and D is generated as "0%".

Similarly, in the communication terminal 10D, the gaze evaluation information for all of the other terminal users X and A to C is generated as “0%”.

The gaze evaluation information is transmitted to the service management server 20 and then transmitted to each of the other communication terminals 10.

When the gaze evaluation information of each communication terminal 10 is sent from the service management server 20, the communication terminal 10X acquires the gaze evaluation information generated by the communication terminals 10A to 10D, the communication terminal 10A acquires the gaze evaluation information generated by the communication terminals 10X and 10B to 10D, the communication terminal 10B acquires the gaze evaluation information generated by the communication terminals 10X, 10A, 10C, and 10D, the communication terminal 10C acquires the gaze evaluation information generated by the communication terminals 10X, 10A, 10B, and 10D, and the communication terminal 10D acquires the gaze evaluation information generated by the communication terminals 10X and 10A to 10C.

At this time, seen from each communication terminal 10, the communication terminal 10 of the user for whom the own terminal user's gaze evaluation information has the highest value is defined as the first communication terminal, and the communication terminal 10 used by the user for whom the gaze evaluation information of the user of the first communication terminal has the highest value is defined as the second communication terminal.

Then, in each communication terminal 10, a value is calculated by multiplying the value of the gaze evaluation information of the user of the first communication terminal for the user of the second communication terminal by the value of the gaze evaluation information of the own terminal user for the user of the first communication terminal, and the gaze evaluation information of the own terminal user for the user of the second communication terminal is corrected to the calculated value.

For example, in the communication terminal 10X, the value "64%" is calculated by multiplying the gaze evaluation information "80%" of the user A of the communication terminal 10A (first communication terminal) for the user B of the communication terminal 10B (second communication terminal) by the gaze evaluation information "80%" of the own terminal user X for the user A of the communication terminal 10A (first communication terminal), and the gaze evaluation information of the own terminal user X for the other terminal user B is corrected from "0%" to the calculated value "64%".

Similarly, in the communication terminal 10C, the value "80%" is calculated by multiplying the gaze evaluation information "80%" of the user X of the communication terminal 10X (first communication terminal) for the user A of the communication terminal 10A (second communication terminal) by the gaze evaluation information "100%" of the own terminal user C for the user X of the communication terminal 10X (first communication terminal), and the gaze evaluation information of the own terminal user C for the other terminal user A is corrected from "0%" to the calculated value "80%".

Here, the communication terminal 10 used by the user for whom the gaze evaluation information of the user of the second communication terminal has the highest value may be defined as the third communication terminal, a value may be calculated by multiplying the gaze evaluation information of the user of the second communication terminal for the user of the third communication terminal by the gaze evaluation information of the user of the first communication terminal for the user of the second communication terminal and by the gaze evaluation information of the own terminal user for the user of the first communication terminal, and the gaze evaluation information of the own terminal user for the user of the third communication terminal may be corrected to the calculated value.

For example, in the communication terminal 10C, the value "64%" is calculated by multiplying the gaze evaluation information "80%" of the user A of the communication terminal 10A (second communication terminal) for the user B of the communication terminal 10B (third communication terminal) by the gaze evaluation information "80%" of the user X of the communication terminal 10X (first communication terminal) for the user A of the communication terminal 10A (second communication terminal) and by the gaze evaluation information "100%" of the own terminal user C for the user X of the communication terminal 10X (first communication terminal), and the gaze evaluation information of the own terminal user C for the other terminal user B is corrected from "0%" to the calculated value "64%".
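This correction can be read as multiplying gaze evaluation values along the chain of gazes that starts at the own terminal user. The following is a minimal Python sketch of that reading (the function name, the dictionary layout, and the generalization to an arbitrary chain depth are illustrative assumptions; the embodiment itself only describes the first, second, and third communication terminals):

def correct_gaze_evaluation(own_user, gaze, max_depth=3):
    """Chain correction of gaze evaluation information (illustrative sketch).

    gaze[watcher][watched] holds gaze evaluation values in percent. Starting
    from the own terminal user, follow the most-gazed-at user at each step and
    multiply the values along the chain; the running product becomes the
    corrected evaluation for the user at that step when it exceeds the value
    generated directly by the own terminal.
    """
    corrected = dict(gaze[own_user])
    current, product = own_user, 1.0
    for _ in range(max_depth):
        if not gaze.get(current):
            break
        target = max(gaze[current], key=gaze[current].get)  # most-gazed-at user
        value = gaze[current][target]
        if value == 0 or target == own_user:
            break
        product *= value / 100.0
        current = target
        corrected[current] = max(corrected.get(current, 0), product * 100.0)
    return corrected

# Values from FIG. 7, seen from communication terminal 10C (own terminal user "C"):
gaze = {
    "C": {"X": 100, "A": 0, "B": 0, "D": 0},
    "X": {"A": 80, "B": 0, "C": 0, "D": 0},
    "A": {"B": 80, "X": 0, "C": 0, "D": 0},
    "B": {"A": 100, "X": 0, "C": 0, "D": 0},
}
print(correct_gaze_evaluation("C", gaze))
# {'X': 100, 'A': 80.0, 'B': 64.0, 'D': 0}  -- matches the 80% and 64% corrections above

Running the same sketch from the viewpoint of the communication terminal 10X yields 80% for user A and 64% for user B, in line with the example described above.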

Next, the sound volume determination unit 153 of each communication terminal 10 determines the volume for each other terminal user so that the higher the gaze evaluation information after the correction processing, the larger the volume.

In the present embodiment, it is determined that input voice information from other terminal users whose gaze evaluation information is between 0% and 10% is amplified at an amplification factor of 10% of the maximum amplification factor of the own communication terminal 10, and that input voice information from other terminal users whose gaze evaluation information is 10% or more is amplified at an amplification factor equal to the gaze evaluation information.
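In other words, the amplification factor is floored at 10% of the terminal's maximum and otherwise tracks the corrected gaze evaluation value. A minimal sketch of that rule (the function and parameter names are illustrative assumptions):

def amplification_factor(gaze_evaluation_percent, floor_percent=10.0):
    # Amplification factor as a percentage of the own communication terminal's
    # maximum amplification: values of 0-10% are clamped to the 10% floor,
    # values of 10% or more are used as-is.
    return max(float(gaze_evaluation_percent), floor_percent)

# Corrected gaze evaluation values at communication terminal 10X in FIG. 7:
for user, value in {"A": 80, "B": 64, "C": 0, "D": 0}.items():
    print(user, amplification_factor(value))  # A 80.0, B 64.0, C 10.0, D 10.0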

For example, in the communication terminal 10X, it is determined that the input voice information from the other terminal user A is amplified at an amplification factor of 80% of the maximum amplification factor of the own communication terminal 10X, that the input voice information from the other terminal user B is amplified at an amplification factor of 64%, and that the input voice information from the other terminal users C and D is amplified at an amplification factor of 10%.

Similarly, in the communication terminal 10A, it is determined that the input voice information from the other terminal user B is amplified at an amplification factor of 80% of the maximum amplification factor of the own communication terminal 10A, and that the input voice information from the other terminal users X, C, and D is amplified at an amplification factor of 10%.

Similarly, in the communication terminal 10B, it is determined that the input voice information from the other terminal user A is amplified at an amplification factor of 100% of the maximum amplification factor of the own communication terminal 10B, and that the input voice information from the other terminal users X, C, and D is amplified at an amplification factor of 10%.

Similarly, in the communication terminal 10C, it is determined that the input voice information from the other terminal user X is amplified at an amplification factor of 100% of the maximum amplification factor of the own communication terminal 10C, that the input voice information from the other terminal user A is amplified at 80%, that the input voice information from the other terminal user B is amplified at 64%, and that the input voice information from the other terminal user D is amplified at 10%.

Similarly, in the communication terminal 10D, it is determined that the input voice information from the other terminal users X and A to C is amplified at 10% of the maximum amplification factor of the own communication terminal 10D.

The bar graph shown in the lower part of FIG. 7 shows the amplification factor of the input voice information for each other terminal user determined by each communication terminal 10.

In each communication terminal 10, the input voice information of the other terminal user received from the service management server 20 is amplified by the determined amplification factor and output from the voice output unit 16.

According to the present embodiment described above, when holding a simultaneous call with a plurality of partners using the communication system, it is possible to increase the volume not only of the voice of the partner whom the own terminal user is interested in and gazing at, but also of the voice of the partner who is conversing with that gazed-at partner, making both easier to hear.

In the first embodiment described above, the case has been described in which the gaze evaluation information of the own terminal user for the other terminal users is corrected in consideration of the gaze evaluation information of the user of the first communication terminal for the user of the second communication terminal and the gaze evaluation information of the user of the second communication terminal for the user of the third communication terminal. However, the gaze evaluation information of the own terminal user for the other terminal users may also be corrected in consideration of gaze evaluation information acquired further along the chain, such as the gaze evaluation information of the user of the third communication terminal for the user of a fourth communication terminal and the gaze evaluation information of the user of the fourth communication terminal for the user of a fifth communication terminal.

At this time, the volume may be forcibly lowered as the distance from the own communication terminal along the chain (the number of communication terminals traversed) increases. With this configuration, the volume can be adjusted so that the gaze evaluation information has a higher value the stronger the relationship with the own communication terminal, and a more natural and realistic virtual space can be provided. Simultaneous calls can be facilitated as long as the output related to the other communication terminals, which is output from the own communication terminal to its user, is adjusted based on at least the information indicating the degree of gaze at the other communication terminals; using the line-of-sight position detected at the own communication terminal as well makes simultaneous calls even easier.
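One possible form of this forced reduction is to attenuate the chained value by a fixed factor per hop; the sketch below is illustrative only, and the per-hop factor is an assumption rather than a value taken from the disclosure:

def chain_attenuated_factor(chained_percent, hops, per_hop_attenuation=0.8):
    # Reduce the amplification factor derived from a chained gaze evaluation
    # product the farther (more hops) the corresponding user is from the own
    # communication terminal along the gaze chain.
    return chained_percent * (per_hop_attenuation ** max(hops - 1, 0))

# A user reached two hops along the chain with a chained product of 64%:
print(chain_attenuated_factor(64, hops=2))  # 51.2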

<< Second Embodiment >>
As the communication terminal according to the second embodiment of the present invention, a communication terminal will be described that, in addition to the processing described in the first embodiment, performs processing to make it easy to hear the voices of other terminal users who show a high degree of interest in the own terminal user.

Since the configuration of the communication system using the communication terminal according to the present embodiment is the same as the configuration of the communication system 1 described in the first embodiment, detailed description thereof is omitted.

Of the processes executed by each communication terminal 10 in the communication system 1 according to the present embodiment, the processing corresponding to steps S1 to S7 of FIG. 4 is the same as that described in the first embodiment, so a detailed description thereof is omitted.

In the present embodiment, the processing for correcting the gaze evaluation information in step S8 will be described using the specific example of FIG. 8.

In this example, as shown in the upper part of FIG. 8, while user A and user B are having a conversation, during the 5 seconds subject to gaze point detection, user X gazes at user A's image information for 4 seconds, user A gazes at user B's image information for 4 seconds, user B gazes at user A's image information for 5 seconds, user C gazes at user X's image information for 5 seconds, and user X is speaking.

In this case, each communication terminal 10 performs the gaze evaluation information generation processing and correction processing in the same manner as in the first embodiment. In addition, even if the own terminal user's gaze evaluation information for another terminal user is not high, when that other terminal user's gaze evaluation information for the own terminal user is high and that other terminal user is speaking, the gaze evaluation information for that speaking other terminal user is corrected to a high value.

For example, in the example of FIG. 8, the gaze evaluation information of user X for user A is the highest, "80%", and since user X is speaking, it is determined that user X is speaking to user A. In this case, although user A is not gazing at user X's image information, in order to make it easy for user A to notice the voice from user X, the gaze evaluation information for user X generated by the communication terminal 10A is corrected from "0%" to "80%", the same value as user X's gaze evaluation information for user A.

Then, the volume determination unit 153 of each communication terminal 10 determines the volume for each other terminal user so that the higher the gaze evaluation information after the correction processing, the larger the volume, and the voice information is output from the voice output unit 16.

According to the present embodiment described above, when holding a simultaneous call with a plurality of partners using the communication system, the volume of the voice of the partner in whom the own terminal user shows high interest, and of the partner conversing with the partner the own terminal user is gazing at, can be increased to make them easier to hear, and the volume of the voice of a partner who is speaking to the own terminal user can also be increased to make it easier to hear.

In the first and second embodiments described above, the case has been described in which the volumes of a plurality of other terminal users are adjusted by changing the amplification factor of each other terminal user's input voice information relative to the maximum amplification factor of the own communication terminal. However, the adjustment may also be made by changing the ratio of the audio level of each other terminal user's voice information to the maximum audio level of the own communication terminal.

In addition, in the first and second embodiments described above, if the volume is adjusted immediately whenever the line of sight leaves the other terminal user being gazed at, the voice of the conversation partner may be cut off even when the user merely glances around for a few seconds, making the conversation difficult to hear.

Therefore, in each communication terminal, when the input voice information of any other terminal user changes from a volume at or above a predetermined value to a volume at or below a predetermined value, the volume may be lowered so as to fade out gradually over a predetermined time (for example, 10 seconds). Adjusting in this way provides a more natural and realistic virtual space.
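One way to realize such a fade-out is to interpolate the amplification factor over the fade period instead of switching it instantly; the sketch below assumes a linear interpolation and applies increases immediately (both assumptions for illustration only):

def faded_amplification(old_percent, new_percent, elapsed_s, fade_s=10.0):
    # Fade from the old amplification factor to the new, lower one over
    # fade_s seconds; increases take effect immediately.
    if elapsed_s >= fade_s or new_percent >= old_percent:
        return new_percent
    t = elapsed_s / fade_s
    return old_percent + (new_percent - old_percent) * t

# 80% -> 10% over 10 seconds: halfway through the fade the factor is 45%.
print(faded_amplification(80, 10, elapsed_s=5))   # 45.0
print(faded_amplification(80, 10, elapsed_s=10))  # 10

Passing a longer fade time (for example, 20 seconds) would realize the longer fade-out described below for other terminal users who have been gazed at repeatedly.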

Furthermore, in each communication terminal, when it is detected that there is a specific other terminal user who has been gazed at a predetermined number of times or more during a series of conversations, it is determined that the interest in that other terminal user is high, and when the input voice information of that other terminal user is adjusted from a volume at or above a predetermined value to a lower volume, the volume may be lowered so as to fade out gradually over a longer time (for example, 20 seconds).

In the first and second embodiments described above, the case has been described in which image information representing the other terminal users registered in advance in the service management server is acquired and displayed on the display unit of the own communication terminal. However, image information of each user captured by an imaging device mounted on each communication terminal may instead be received via the service management server and displayed.

In the first and second embodiments described above, the case has been described in which the gaze evaluation information generated by each communication terminal is transmitted to the service management server. However, the gaze point detected by each communication terminal (coordinate information indicating the position on the display unit at which the own terminal user is gazing) may instead be transmitted to the service management server, and the gaze evaluation information may be generated by the service management server and transmitted to the other communication terminals.

Further, in the first and second embodiments described above, the case has been described in which the volume of the voice received from the other communication terminals is adjusted based on the gaze evaluation information. However, the display state (size, brightness, color, definition, and the like) of the display information representing each user displayed on the display unit may also be adjusted based on the gaze evaluation information.

DESCRIPTION OF SYMBOLS 1 ... communication system; 10, 10A, 10B, 10C, 10D, 10X ... communication terminal; 11 ... operation input unit; 12 ... display unit; 13 ... voice input unit; 14 ... gaze point detection unit; 15 ... voice control unit; 16 ... voice output unit; 20 ... service management server; 21 ... communication control unit; 22 ... image management unit; 23 ... voice management unit; 24 ... gaze evaluation information management unit; 30 ... network; 121A, 121B, 121C, 121D ... image information; 122 ... name information column; 151 ... gaze evaluation information generation unit; 152 ... gaze evaluation information correction unit; 153 ... volume determination unit

Claims (4)

  1. The first communication terminal is a communication terminal that communicates with other communication terminals including at least the second communication terminal and the third communication terminal,
    The first communication terminal adjusts a display state of display information related to the third communication terminal based on a result of detecting a degree of interest of the user of the second communication terminal with respect to the third communication terminal. A communication terminal characterized by that.
  2. The first communication terminal is a communication terminal that communicates with other communication terminals including at least the second communication terminal and the third communication terminal,
    The first communication terminal adjusts a display state of display information related to the third communication terminal based on a degree of attention of the user of the second communication terminal to the third communication terminal. Communication terminal.
  3. The first communication terminal communicates with other communication terminals including at least the second communication terminal and the third communication terminal,
    The first communication terminal adjusts a display state of display information related to the third communication terminal based on a result of detecting a degree of interest of the user of the second communication terminal with respect to the third communication terminal. A communication terminal control method.
  4. The first communication terminal communicates with other communication terminals including at least the second communication terminal and the third communication terminal,
    Based on the result of detecting the degree of interest in the third communication terminal by the user of the second communication terminal, the computer of the first communication terminal changes the display state of the display information related to the third communication terminal. A control program for a communication terminal, characterized by causing the adjustment step to be executed.
JP2017007512A 2017-01-19 2017-01-19 Communication terminal, communication terminal control method, and communication terminal control program Pending JP2017103801A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017007512A JP2017103801A (en) 2017-01-19 2017-01-19 Communication terminal, communication terminal control method, and communication terminal control program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2017007512A JP2017103801A (en) 2017-01-19 2017-01-19 Communication terminal, communication terminal control method, and communication terminal control program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2013203432 Division 2013-09-30

Publications (1)

Publication Number Publication Date
JP2017103801A true JP2017103801A (en) 2017-06-08

Family

ID=59015744

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017007512A Pending JP2017103801A (en) 2017-01-19 2017-01-19 Communication terminal, communication terminal control method, and communication terminal control program

Country Status (1)

Country Link
JP (1) JP2017103801A (en)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59153369A (en) * 1983-02-21 1984-09-01 Hitachi Ltd Voice fading switching system
JPH04191885A (en) * 1990-11-27 1992-07-10 Nippon Telegr & Teleph Corp <Ntt> Sound image control processing method
JPH04237288A (en) * 1991-01-21 1992-08-25 Nippon Telegr & Teleph Corp <Ntt> Audio signal output method for plural-picture window display
JPH0730877A (en) * 1993-07-12 1995-01-31 Oki Electric Ind Co Ltd Inter-multi location multimedia communications conference system
JPH08163527A (en) * 1994-12-09 1996-06-21 Nec Corp Terminal equipment for electronic conference
JPH11234640A (en) * 1998-02-17 1999-08-27 Sony Corp Communication control system
JP2002176503A (en) * 2000-12-08 2002-06-21 Nec Corp Multipoint videoconference controller, voice switching method, and recording medium with recorded program thereof
JP2004248125A (en) * 2003-02-17 2004-09-02 Nippon Telegr & Teleph Corp <Ntt> Device and method for switching video, program for the method, and recording medium with the program recorded thereon
JP2007151103A (en) * 2005-11-02 2007-06-14 Yamaha Corp Teleconference device
JP2007300452A (en) * 2006-05-01 2007-11-15 Mitsubishi Electric Corp Image and television broadcast receiver with sound communication function
JP4465880B2 (en) * 1998-10-09 2010-05-26 ソニー株式会社 Communication apparatus and method
JP4487467B2 (en) * 1999-11-24 2010-06-23 ソニー株式会社 Communications system
JP2010200150A (en) * 2009-02-26 2010-09-09 Toshiba Corp Terminal, server, conference system, conference method, and conference program
JP2010206307A (en) * 2009-02-27 2010-09-16 Toshiba Corp Information processor, information processing method, information processing program, and network conference system
JP2012165170A (en) * 2011-02-07 2012-08-30 Nippon Telegr & Teleph Corp <Ntt> Conference device, conference method and conference program



Legal Events

Date        Code  Title                                Description
2017-10-19  A977  Report on retrieval                  Free format text: JAPANESE INTERMEDIATE CODE: A971007
2017-11-21  A131  Notification of reasons for refusal  Free format text: JAPANESE INTERMEDIATE CODE: A131
2017-12-08  A521  Written amendment                    Free format text: JAPANESE INTERMEDIATE CODE: A523
2018-06-05  A131  Notification of reasons for refusal  Free format text: JAPANESE INTERMEDIATE CODE: A131
2018-07-06  A521  Written amendment                    Free format text: JAPANESE INTERMEDIATE CODE: A523
2018-12-25  A02   Decision of refusal                  Free format text: JAPANESE INTERMEDIATE CODE: A02