JP4236606B2 - Communication terminal, television receiver, information output method, information output program, and recording medium containing information output program

Info

Publication number
JP4236606B2
Authority
JP
Japan
Prior art keywords
content
information
communication terminal
avatar
means
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2004082976A
Other languages
Japanese (ja)
Other versions
JP2005269557A (en)
Inventor
隆夫 乾
俊幸 岩井
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社
Priority to JP2004082976A
Publication of JP2005269557A
Application granted
Publication of JP4236606B2
Application status is Expired - Fee Related

Description

  The present invention relates to a communication terminal, a television receiver, an information output method, an information output program, and a recording medium on which an information output program is recorded, and more particularly to a communication terminal, a television receiver, an information output method, an information output program, and a recording medium on which an information output program is recorded that enable communication with other viewers when viewing content distributed in real time.

  The need for communication with other people has increased due to the development of networks, and methods for realizing various types of communication have been proposed.

  In particular, when content distributed in real time is viewed individually, for example at home, there is a demand for communication with other viewers who are not physically present.

  The communication required in such a situation is communication that makes the viewer feel "connected" with other viewers; more specifically, it is communication accompanied by a sense of sharing a place with other viewers, a sensation of their presence, and a sense of unity with them.

  In response to such a demand, for example, Patent Document 1 proposes a method and a system for forming a television community in which viewers can exchange opinions on a common television broadcast program and communicate freely and interactively.

To this end, the television community system proposed in Patent Document 1 includes viewer terminals used by a plurality of viewers registered for a certain television broadcast program and a management server that manages them; multimedia data input by a user at one viewer terminal is transmitted to the other viewer terminals via the management server.
JP 2001-148841 A

  However, the television community system disclosed in Patent Document 1 described above merely transmits multimedia data among a plurality of viewer terminals registered for a predetermined television broadcast program. Therefore, even when this system is used, if the television broadcast program being viewed is changed, multimedia data is not exchanged among the viewer terminals with respect to the changed program, and there is a problem that it is difficult to realize communication in which a user who is naturally watching content distributed in real time feels "connected" with other viewers.

  The present invention has been made in view of these problems, and an object thereof is to provide a communication terminal, a television receiver, an information output method, an information output program, and a recording medium on which the information output program is recorded, each of which can realize communication with other viewers when viewing content distributed in real time.

  In order to achieve the above object, according to an aspect of the present invention, a communication terminal includes content information acquisition means for acquiring content information, which is information about the content being played back, from a content playback device that receives and plays back content in real time; content information receiving means for receiving content information from another communication terminal; and output means for outputting information indicating the other communication terminal when the content information received from the other communication terminal and the content information acquired from the content playback device relate to the same content.

  The communication terminal preferably further includes content information transmission means for transmitting the content information acquired from the content reproduction device by the content information acquisition means to another communication terminal.

  Furthermore, it is preferable that the communication terminal further includes detection means for detecting a change in the content being played back on the content playback device, and that, when the detection means detects a change in the content being played back, the content information transmission means transmits the content information of the content being played back after the change to the other communication terminal.

  Preferably, the output means also outputs information indicating the communication terminal itself, and the communication terminal further includes operation information transmitting means for transmitting an operation performed on the information indicating the communication terminal to another communication terminal, and operation information receiving means for receiving, from another communication terminal, an operation performed on the information indicating that other communication terminal; when the content information received from the other communication terminal and the content information acquired from the content playback device relate to the same content, the output means preferably outputs the information indicating the other communication terminal based on the operation information.

  In addition, the communication terminal preferably further includes designation means for designating information indicating another communication terminal output by the output means, and multimedia data transmission means for transmitting multimedia data to the other communication terminal corresponding to the designated information.

  Preferably, the communication terminal further includes multimedia data receiving means for receiving multimedia data from another communication terminal, and storage means for storing the received multimedia data, and the output means outputs information indicating the communication terminal from which the multimedia data was received.

  Further, it is preferable that the designation means can designate information indicating another communication terminal from which multimedia data has been received, and that the communication terminal further includes reproduction means for reproducing, at a timing designated by the designation means, the multimedia data that was received from the other communication terminal corresponding to the designated information and stored in the storage means.

  The multimedia data described above is more preferably audio data.

  According to another aspect of the present invention, a television receiver includes television signal reproduction means for receiving a television signal from a broadcast station in real time and reproducing it; viewing information receiving means for receiving, from another device, viewing information that is information relating to the television program being reproduced by the television signal reproduction means of that other device; and output means for outputting information indicating the other device when the viewing information received from the other device and the viewing information of the television program being reproduced by the television signal reproduction means relate to the same television program.

  According to still another aspect of the present invention, an information output method includes a content information acquisition step of acquiring content information, which is information relating to the content being played back, from a content playback device that receives and plays back content in real time; a content information receiving step of receiving content information from another communication terminal; and an output step of outputting information indicating the other communication terminal when the content information received from the other communication terminal and the content information acquired from the content playback device relate to the same content.

  According to still another aspect of the present invention, an information output program causes a computer to execute information output processing for outputting information in a communication terminal that communicates with a content playback device that receives and plays back content in real time, the processing including a content information acquisition step of acquiring content information, which is information relating to the content being played back, from the content playback device; a content information receiving step of receiving content information from another communication terminal; and an output step of outputting information indicating the other communication terminal when the content information received from the other communication terminal and the content information acquired from the content playback device relate to the same content.

  According to still another aspect of the present invention, the recording medium is a computer-readable recording medium and records the above-described information output program.

  Embodiments of the present invention will be described below with reference to the drawings. In the following description, the same parts and components are denoted by the same reference numerals. Their names and functions are also the same. Therefore, detailed description thereof will not be repeated.

  FIG. 1 is a diagram showing a specific example of the configuration of a content transmission/reception system (hereinafter abbreviated as a system) according to the present embodiment. In the present embodiment it is assumed that the system receives a television signal transmitted from the broadcast station 400 as content and that a television program is viewed; however, the content according to the present invention is not limited to television programs and includes all content distributed in real time, television programs included. Accordingly, although the content supply source in the present embodiment is the broadcast station 400, a supply device corresponding to the content may be used instead.

  Referring to FIG. 1, the system according to the present embodiment includes a broadcast station 400 that supplies content in the form of television programs, a television receiver (hereinafter abbreviated as "TV") 200, which is a content playback device that receives the television signal transmitted from the broadcast station 400 and processes and displays it, and a communication terminal 100 connected to the TV 200.

  Specific examples of the content playback device according to the present invention include, in addition to the TV 200 of the present embodiment, a personal computer, a mobile phone, and a PDA (Personal Digital Assistant).

  This system is formed by the participation of a plurality of participants A to D, each of whom uses (views) a TV 200. The TVs 200a to 200d used by participants A to D are collectively referred to as the TV 200, and the communication terminals 100a to 100d used by participants A to D and connected to the TVs 200a to 200d are collectively referred to as the communication terminal 100.

  Transmission of a television signal from the broadcasting station 400 to the TV 200 may be performed via a dedicated cable, or may be performed by wireless communication such as satellite waves and terrestrial waves. Depending on the content to be distributed, stream distribution may be performed via the Internet or a dedicated line, or distribution may be performed via a dedicated line such as a LAN (Local Area Network).

  The communication terminal 100 is connected to the corresponding TV 200 for communication, and is connected to the network 300 for communication with each other.

  The communication terminal 100 will be described in detail later. In this embodiment, the communication terminal 100 is assumed to be a device separate from the TV 200, which is the content playback device, and connected to it; however, it may of course be configured integrally with the TV 200. Likewise, when the content is other than a television program and the content playback device is a personal computer, a mobile phone, a PDA, or the like, the communication terminal 100 may be configured integrally with the content playback device corresponding to that content. In such cases, each process described below is performed in the content playback device such as the TV 200.

  In the present embodiment, the communication terminal 100 is specifically a small communication device such as a so-called set-top box connected via a dedicated line to the TV 200, which is the content playback device. However, it may instead be built on a device having general communication functions, such as a personal computer or a so-called home server connected to a content playback device such as the TV 200, or it may be a device included in equipment connected to a content playback device such as the TV 200, for example a video tape player or a DVD (Digital Video Disc) player.

  Further, communication with the TV 200 is not limited to wired communication; wireless communication such as infrared communication may also be used. In that case, the communication terminal 100 may be a small communication device that is not physically connected to the TV 200, for example a device included in a remote controller supplied with the TV 200, or a device included in a mobile phone or PDA that can communicate with the TV 200.

  FIG. 2 shows a specific example of the hardware configuration of the communication terminal 100.

  Referring to FIG. 2, the communication terminal 100 includes a CPU (Central Processing Unit) 11 that controls the communication terminal 100 as a whole; a ROM (Read Only Memory) 12 that stores programs executed by the CPU 11, unique information, and various other information and tables; a RAM (Random Access Memory) 13 that serves as a temporary work area when the CPU 11 executes a program; a recording medium reading unit 14 that reads information and programs from a recording medium such as a CD-ROM (Compact Disc-Read Only Memory) or a DVD-ROM (Digital Video Disc-Read Only Memory); an input unit 15 that includes input devices such as operation buttons and a microphone and accepts instruction signals and input of content data; an output unit 16 that performs output through output devices such as a speaker and an LED (Light Emitting Diode); and a communication unit 17 that serves as an interface with the TV 200 and, via the network 300, with other communication terminals. These are connected by a bus 18.

  The input unit 15 described above may accept an instruction signal or content data by receiving a signal transmitted from another device, or by reading an instruction signal or content data recorded on a recording medium.

  The hardware configuration of the communication terminal 100 shown in FIG. 2 is the same as that of a general personal computer with a communication function. However, when the communication terminal 100 is included in the TV 200 itself or in a mobile phone that can communicate with the TV 200 as described above, other components required by those devices may be included in addition to, or in place of at least part of, the hardware configuration shown in FIG. 2.

  FIG. 3 shows a specific example of the functional configuration of the communication terminal 100. Each function shown in FIG. 3 is exhibited when the CPU 11 reads a program recorded in the ROM 12 or the like, executes it on the RAM 13, and controls each part shown in FIG. 2.

  Referring to FIG. 3, the communication terminal 100 includes a network communication control unit 101, which is a communication interface that controls communication with the other communication terminals 100 via the network 300, and a TV communication control unit 102, which is a communication interface that controls communication with the TV 200 connected to the terminal.

  Furthermore, the communication terminal 100 includes a viewing information management unit 103, a channel change detection unit 104, and a viewing information storage unit 105, which perform processing relating to the television programs viewed on the connected TV 200 and on the other communication terminals 100 communicating via the network 300; an avatar management unit 106, an avatar operation unit 107, an avatar designation unit 108, an avatar state storage unit 109, an avatar information storage unit 110, and an avatar display control unit 111, which perform processing relating to the display of avatars, i.e. characters representing the participants who use the TVs 200; and a message management unit 112, a message storage unit 113, a message input unit 114, a message output unit 115, and a message operation unit 116, which perform processing relating to the transmission and reception of voice messages, i.e. multimedia data, with the other communication terminals 100 via the network 300.

  As shown in FIG. 4, the viewing information storage unit 105 stores viewing information about the TV 200 connected to the communication terminal 100 as a self-viewing information table. Specifically, referring to FIG. 4, the channel number, the broadcast station name, and the program name of the television program currently being viewed on the connected TV 200 (hereinafter this information is referred to as self-viewing information) are stored in table form as the self-viewing information table. Needless to say, the form of the stored information is not limited to a table and may be another data form; the same applies to the data stored in the other storage units.

  In addition, as shown in FIG. 5, the viewing information storage unit 105 stores, as a participant viewing information table, viewing information about each TV 200 connected to the other communication terminals 100 that communicate with the communication terminal 100 via the network 300. Specifically, referring to FIG. 5, information relating to the television program currently being viewed on the TV 200 used by each participant (hereinafter this information is referred to as participant viewing information) is stored in table form as the participant viewing information table, consisting of a participant ID, which is unique information that uniquely identifies the participant, together with the channel number, broadcast station name, and program name of that television program. Note that the participant viewing information table may include the self-viewing information, or may include only the viewing information of participants other than the terminal's own. Furthermore, the participant viewing information table may contain only viewing information about the TVs 200 currently being viewed within the system, or may contain viewing information for the TVs 200 of all participants of the system registered in advance.
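
  For illustration only, the two tables above might be pictured as the following minimal Python sketch. The class name ViewingInfo, the field names, and the example values are assumptions introduced here for explanation and are not defined by the embodiment.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ViewingInfo:
    """One entry of viewing information (FIG. 4 / FIG. 5): the program a TV 200 is showing."""
    channel_number: int
    station_name: str
    program_name: str

# Self-viewing information table (FIG. 4): what this terminal's own TV 200 is showing.
self_viewing_info = ViewingInfo(channel_number=1, station_name="A Broadcast Station",
                                program_name="News N")

# Participant viewing information table (FIG. 5), keyed by the unique participant ID.
participant_viewing_table: dict[str, ViewingInfo] = {
    "001": ViewingInfo(1, "A Broadcast Station", "News N"),
    "003": ViewingInfo(3, "C Broadcast Station", "Some Other Program"),
}
```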

  Also, as shown in FIG. 6, the viewing information storage unit 105 stores, as a viewing information person table, the IDs of the participants whose participant viewing information matches the current self-viewing information (hereinafter such a participant is referred to as a viewing information person). The viewing information person table shown in FIG. 6 is generated by the viewing information management unit 103 at a predetermined timing based on the self-viewing information table (FIG. 4) and the participant viewing information table (FIG. 5).
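
  Continuing the illustrative sketch above, the generation of the viewing information person table then amounts to a simple matching step; the function name below is, again, only an assumption.

```python
def build_viewing_info_person_table(self_info: ViewingInfo,
                                    participants: dict[str, ViewingInfo]) -> list[str]:
    """Return the IDs of participants whose viewing information matches the terminal's own (FIG. 6)."""
    return [pid for pid, info in participants.items() if info == self_info]

# With the example tables above, only participant "001" is a viewing information person.
viewing_info_persons = build_viewing_info_person_table(self_viewing_info, participant_viewing_table)
```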

  The viewing information management unit 103 manages these tables stored in the viewing information storage unit 105.

  The channel change detection unit 104 detects a channel change in the TV 200 connected to the communication terminal 100 via the TV communication control unit 102. Then, the detection information is passed to the viewing information management unit 103.

  The viewing information management unit 103 updates the self-viewing information table (FIG. 4) and the viewing information person table (FIG. 6) based on the detection information passed from the channel change detection unit 104. The self-viewing information updated based on the detection information is passed to the avatar management unit 106, and is furthermore transmitted to the other communication terminals 100 via the network communication control unit 101.

  The viewing information management unit 103 also receives participant viewing information about the other participants from the other communication terminals 100 via the network communication control unit 101. Based on the information received in this way, the viewing information management unit 103 updates the participant viewing information table (FIG. 5) and the viewing information person table (FIG. 6), and passes the received information to the avatar management unit 106.

  The avatar management unit 106 manages avatars, i.e. characters representing the participants who use the communication terminals 100. An avatar is an image displayed together with a television program on the image display unit (not shown) of the TV 200 and represents a participant who is currently viewing that program; in this embodiment the image may be a character image, a photographic image, text characters, or the like. Avatar images, that is, images used as avatars, together with an image corresponding to each state of each avatar image and related information, are stored in advance in the avatar information storage unit 110.

  As shown in FIG. 7, the avatar state storage unit 109 stores, as an own avatar information table, avatar information relating to the avatar that represents the participant using this communication terminal 100 (hereinafter this avatar is referred to as the own avatar). Specifically, referring to FIG. 7, the avatar information about the own avatar (hereinafter referred to as own avatar information) consists of the display position (coordinates) of the own avatar; information for identifying the avatar image used as the own avatar among the images stored in the avatar information storage unit 110 (identified by the "appearance" in this specific example, although uniquely assigned identification information such as an image ID may also be used); and information for identifying the image corresponding to the current state of the own avatar among the images stored in the avatar information storage unit 110 (in this specific example, the "motion" representing the state, although uniquely assigned identification information such as an ID specifying the image corresponding to the state may also be used). These are stored in table form as the own avatar information table.

  As shown in FIG. 8, the avatar state storage unit 109 also stores, as a participant avatar information table, avatar information relating to the avatars that represent the participants using the other communication terminals 100 communicating with this communication terminal 100 via the network 300 (hereinafter such an avatar is referred to as a participant avatar). Specifically, as shown in FIG. 8, the information about each participant avatar (hereinafter referred to as participant avatar information) consists of a participant ID, which is unique information that uniquely identifies the participant; the display position (coordinates) of the participant avatar; information for identifying the avatar image used as the participant avatar among the images stored in the avatar information storage unit 110; information for identifying the image corresponding to the state of the participant avatar among the images stored in the avatar information storage unit 110; and information indicating the number of voice messages transmitted from that participant. These are stored in table form as the participant avatar information table. The participant avatar information table may include the own avatar information, or may consist only of participant avatar information other than the own avatar information. Furthermore, the participant avatar information table may consist only of participant avatar information about the participants using the TVs 200 currently being viewed in the system, or may consist of participant avatar information for all participants of the system registered in advance.
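
  As with the viewing information, the avatar tables of FIG. 7 and FIG. 8 can be pictured, purely for illustration, as simple records that extend the earlier sketch; the field names below (appearance_id, motion_id, voice_message_count) are assumptions chosen to mirror the "appearance", "motion", and voice-message count described above.

```python
from dataclasses import dataclass

@dataclass
class AvatarInfo:
    """Own avatar information (FIG. 7): position plus identifiers into the avatar information storage unit 110."""
    position: tuple[int, int]   # display coordinates on the screen of the TV 200
    appearance_id: str          # identifies the avatar image used ("appearance")
    motion_id: str              # identifies the image corresponding to the avatar's state ("motion")

@dataclass
class ParticipantAvatarInfo(AvatarInfo):
    """Participant avatar information (FIG. 8): adds the number of voice messages from that participant."""
    voice_message_count: int = 0

# Participant avatar information table, keyed by participant ID.
participant_avatar_table: dict[str, ParticipantAvatarInfo] = {}
```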

  The avatar management unit 106 manages these tables stored in the avatar state storage unit 109.

  The avatar management unit 106 updates the own avatar information table (FIG. 7) based on the own viewing information passed from the viewing information management unit 103. Further, the participant avatar information table (FIG. 8) is updated based on the participant viewing information about other participants passed from the viewing information management unit 103.

  The avatar designation unit 108 accepts designation of a participant avatar in order to process a voice message, i.e. multimedia data, exchanged with the communication terminal 100 used by that particular participant. Specifically, the avatar designation unit 108 is configured from buttons, a touch panel, an operation device such as a controller, or the like, either associated with each participant avatar in advance or used to select and confirm participant avatars in turn, and it passes an operation signal corresponding to the designation operation to the avatar management unit 106. Alternatively, when the avatar designation unit 108 has a voice recognition function or an image recognition function, it may accept a designation operation by voice or by image and pass the corresponding operation signal to the avatar management unit 106.

  The avatar operation unit 107 accepts an operation, performed by the user of the communication terminal 100, on the own avatar or on an avatar other than the own avatar designated with the avatar designation unit 108, and passes the corresponding operation signal to the avatar management unit 106. Specifically, the avatar operation unit 107 is configured from buttons, a touch panel, an operation device such as a controller, or the like representing several predetermined operations and their degree, and it passes an operation signal corresponding to the operation to the avatar management unit 106. Alternatively, when the avatar operation unit 107 has a voice recognition function or an image recognition function, it may accept an operation by voice or by image and pass the corresponding operation signal to the avatar management unit 106.

  The avatar management unit 106 updates the own avatar information table (FIG. 7) based on the operation signal passed from the avatar operation unit 107, and passes the own avatar information updated based on the operation to the other communication terminals 100 via the network communication control unit 101. In addition, the avatar management unit 106 receives, via the network communication control unit 101, avatar information sent from another communication terminal 100 in response to an avatar operation performed there, and updates the participant avatar information table (FIG. 8) based on that avatar information.

  Then, in order to update as needed the display of the own avatar and of the participant avatars shown on the image display unit (not shown) of the connected TV 200, the avatar management unit 106 passes an instruction signal to the avatar display control unit 111.

  Based on the instruction signal passed from the avatar management unit 106, the avatar display control unit 111 obtains, via the avatar management unit 106 or from the avatar state storage unit 109, the avatar information about the avatar whose display is to be made or updated, acquires the corresponding avatar image from the avatar information storage unit 110, and generates a display control signal for displaying it on the image display unit (not shown) of the TV 200. The generated display control signal is then transmitted to the TV 200 connected to the communication terminal 100 via the TV communication control unit 102.

  FIG. 9 shows a specific example of the avatar images displayed on the image display unit (not shown) of the TV 200 based on the display control signal generated by the avatar display control unit 111. FIG. 9 specifically shows the display on the image display unit of participant B's TV 200b when the television program "News N" distributed from broadcast station A is being viewed on the TV 200b used by participant B and the same program is also being viewed on the TV 200d used by another participant, participant D. As shown in FIG. 9, in this case the avatar image 500b of the own avatar corresponding to participant B, who uses the TV 200b, and the avatar image 500d of the participant avatar corresponding to the other participant watching the same program are displayed on the image display unit of the TV 200b.

  When the avatar operation unit 107 accepts a predetermined operation, an instruction signal may be passed from the avatar management unit 106 to the avatar display control unit 111, and the avatar display control unit 111 may generate the display control signal so that a designated avatar is temporarily not displayed on the image display unit (not shown) of the TV 200.

  In addition, based on the operation signal passed from the avatar designation unit 108 and the operation signal passed from the avatar operation unit 107, the avatar management unit 106 passes to the message management unit 112 identification information specifying the designated avatar, together with an indication that the operation on the avatar is the processing of a voice message relating to the participant corresponding to that avatar.

  The message management unit 112 receives voice messages from the other communication terminals 100 via the network communication control unit 101 and stores them in the message storage unit 113. It also detects the reception of a voice message and passes a detection signal to the avatar management unit 106, which updates the participant avatar information table (FIG. 8) based on that detection signal.

  The message operation unit 116 accepts, from the user of the communication terminal 100, an operation relating to a voice message addressed to the participant corresponding to the avatar designated by the avatar designation unit 108, and passes an operation signal to the message management unit 112. Specifically, the message operation unit 116 is configured from buttons, a touch panel, an operation device such as a controller, or the like representing several predetermined kinds of voice-message processing (for example, recording a voice message, transmitting a voice message, and listening to a voice message), and it passes an operation signal corresponding to the operation to the message management unit 112. The message management unit 112 executes the processing relating to the voice message based on the identification information specifying the designated avatar passed from the avatar management unit 106, the indication that the operation on the avatar is voice-message processing relating to the participant corresponding to that avatar, and the operation signal passed from the message operation unit 116.

  Specifically, the message management unit 112 accepts the input of a voice message from the message input unit 114, and transmits the voice message input from the message input unit 114 to the corresponding communication terminal 100 via the network communication control unit 101.

  The message input unit 114 is a voice input function of the input unit 15, such as a microphone, and accepts input of a voice message addressed to the participant corresponding to the avatar designated by the avatar designation unit 108. When the message input unit 114 is configured with a recording medium reader, it may instead accept a voice message by reading it from a recording medium. The input voice message is passed to the message management unit 112.

  The multimedia data exchanged between the communication terminals 100 of the system may be any form of multimedia data including voice messages. Specifically, the data is not limited to sound and may be still image data, moving image data, or a combination thereof. In that case, the message input unit 114 may take another form according to the form of the multimedia data, for example an image input device such as a camera.

  Specifically, the message management unit 112 acquires a voice message related to the participant corresponding to the specified avatar from the message storage unit 113, and outputs it from the message output unit 115.

  The message output unit 115 is a sound output function of the output unit 16, such as a speaker, and outputs a voice message that was received from the communication terminal 100 of the participant corresponding to the avatar designated by the avatar designation unit 108 and stored in the message storage unit 113. As noted above, the multimedia data exchanged between the communication terminals 100 of the system may be any form of multimedia data including voice messages; it is not limited to sound and may be still image data, moving image data, or a combination thereof. In that case, the message output unit 115 may take another form according to the form of the multimedia data, for example a display device (which may be shared with the image display unit (not shown) of the TV 200 or may be separate).

  Next, the processing performed by the communication terminal 100 according to the present embodiment in response to a channel change operation will be described.

  The process shown in the flowchart of FIG. 10 is performed when a channel change operation is performed on the TV 200 connected to the communication terminal 100, and is realized by the CPU 11 of the communication terminal 100 reading a program recorded in the ROM 12 or the like onto the RAM 13, executing it, and controlling each part shown in FIG. 2 so as to exhibit each function shown in FIG. 3.

  Referring to FIG. 10, channel change detection unit 104 monitors a channel change in TV 200 via TV communication control unit 102 (S101).

  When a channel change is detected in step S101 (YES in S103), the detection information is passed to the viewing information management unit 103, and the self-viewing information stored in the self-viewing information table shown in FIG. 4 is updated (S105).

  Further, the viewing information management unit 103 transmits the updated self-viewing information to the communication terminals 100 of all participants via the network communication control unit 101 (S107). FIG. 11 shows a specific example of the self-viewing information transmitted to the communication terminals 100 of all participants in step S107. As shown in FIG. 11, in step S107 the participant ID, which is unique information uniquely identifying the user of the communication terminal 100, is transmitted to the other communication terminals 100 together with the channel number, broadcast station name, and program name of the television program after the change.

  In step S107, the information may be transmitted to the communication terminals 100 of all participants of the present system registered in advance in the participant viewing information table, or it may be transmitted to the other communication terminals 100 by broadcast via the network 300.
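
  Purely as an illustration of FIG. 11, the notification sent in step S107 could be represented as a small record carrying the sender's participant ID and the post-change viewing information; the dictionary layout below continues the earlier sketch and is not a wire format defined by the embodiment.

```python
def make_viewing_info_notification(participant_id: str, info: ViewingInfo) -> dict:
    """Self-viewing information sent to the other terminals in step S107 (FIG. 11)."""
    return {
        "participant_id": participant_id,        # unique ID of this terminal's user
        "channel_number": info.channel_number,   # channel number after the change
        "station_name": info.station_name,       # broadcast station name after the change
        "program_name": info.program_name,       # television program name after the change
    }
```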

  In addition, the viewing information management unit 103 updates the viewing information person table shown in FIG. 6 (S109).

  The avatar management unit 106 reads the viewing information person table updated in step S109. Then, the avatar display control unit 111 transmits a display control signal via the TV communication control unit 102 to the connected TV 200 so that the avatars corresponding to the viewing information persons are displayed together with the own avatar on the image display unit (not shown) (S111).

  Thus, the process performed when the channel is changed on the TV 200 connected to the communication terminal 100 ends. The process then returns to step S101, and the channel change detection unit 104 again monitors for a channel change on the TV 200.

  In the above-described processing, steps S107 to S111 are not limited to this order and may be interchanged.
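
  Putting steps S101 to S111 together, the sending-side handling of a channel change could look roughly like the sketch below, which continues the earlier illustrative fragments; the helper arguments broadcast and redraw_avatars stand in for the network communication control unit 101 and the avatar display control unit 111 and are assumptions, not interfaces defined by the embodiment.

```python
def on_channel_change(new_info: ViewingInfo, my_id: str,
                      participants: dict[str, ViewingInfo],
                      broadcast, redraw_avatars) -> tuple[ViewingInfo, list[str]]:
    """Handle a channel change detected on the connected TV 200 (FIG. 10, S105-S111)."""
    # S105: the post-change program becomes the terminal's self-viewing information (FIG. 4).
    # S107: notify the other participants' terminals of the change (FIG. 11).
    broadcast(make_viewing_info_notification(my_id, new_info))
    # S109: rebuild the viewing information person table (FIG. 6).
    persons = build_viewing_info_person_table(new_info, participants)
    # S111: redraw the own avatar together with the avatars of the matching participants.
    redraw_avatars(persons)
    return new_info, persons
```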

  Next, with reference to FIG. 12, a description will be given of the processing performed when the above-described processing has been carried out at another communication terminal 100 and viewing information is received from that terminal in accordance with a channel change there. The processing shown in the flowchart of FIG. 12 is realized by the CPU 11 of the communication terminal 100 reading a program recorded in the ROM 12 or the like onto the RAM 13, executing it, and controlling each part shown in FIG. 2 so as to exhibit each function shown in FIG. 3.

  Referring to FIG. 12, the viewing information management unit 103 monitors reception of viewing information from other communication terminals 100 connected via the network 300 via the network communication control unit 101 (S131).

  When reception of viewing information from another communication terminal 100 connected via the network 300 is detected in step S131 (YES in S133), the viewing information management unit 103 compares the viewing information of the other party, obtained from the received information, with the self-viewing information stored in the self-viewing information table shown in FIG. 4 (S135).

  If the viewing information matches as a result of the comparison in step S135 (YES in S137), that is, if the channel on the other party's communication terminal 100 has been changed so that the same television program as the one currently being viewed by the user of this communication terminal 100 is now being viewed there, the viewing information management unit 103 adds the other party to the viewing information person table shown in FIG. 6 and updates the table (S139).

  Further, the avatar management unit 106 reads the viewing information person table updated in step S139. Then, the avatar display control unit 111 transmits a display control signal via the TV communication control unit 102 to the connected TV 200 so that an avatar corresponding to the other participant is newly displayed together with the own avatar on the image display unit (not shown) (S141).

  On the other hand, if the viewing information does not match as a result of the comparison in step S135 (NO in S137), that is, if the channel on the other party's communication terminal 100 has been changed so that a television program different from the one currently being viewed by the user of this communication terminal 100 is now being viewed there, the viewing information management unit 103 searches the viewing information person table for the other party (S145).

  If, as a result of the search in step S145, the other party is found in the viewing information person table (YES in S147), that is, if the other party's communication terminal 100 has switched away from the television program currently being viewed by the user of this communication terminal 100 to a different program, the viewing information management unit 103 updates the viewing information person table by deleting the other party from it (S149).

  Further, the avatar management unit 106 reads the viewing information person table updated in step S149. Then, the avatar display control unit 111 transmits a display control signal via the TV communication control unit 102 to the connected TV 200 so that the avatar corresponding to the other participant, which had been displayed together with the own avatar on the image display unit (not shown), is erased from the screen (S151).

  On the other hand, if the other party is not found in the viewing information person table as a result of the search in step S145 (NO in S147), the other party's communication terminal 100 has switched from one television program different from the one currently being viewed by the user of this communication terminal 100 to another program that is also different from it. In this case, the processes in steps S149 and S151 are skipped.

  Then, based on the received viewing information of the other party, the viewing information management unit 103 updates the participant viewing information table shown in FIG. 5, so that the viewing information corresponding to that participant is brought up to date (S143).

  Thus, the process associated with a channel change on another communication terminal 100 connected to this communication terminal 100 via the network 300 ends. The process then returns to step S131, and the viewing information management unit 103 again monitors, via the network communication control unit 101, the reception of viewing information from the other communication terminals 100 connected via the network 300.

  In the above-described processing, the processing in step S143 is not limited to this timing, and may be executed at another timing.
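
  The receiving-side behaviour of FIG. 12 amounts to comparing the incoming viewing information with the terminal's own and adding or removing the sender's avatar accordingly. The sketch below continues the earlier illustrative fragments; show_avatar and hide_avatar are placeholder callbacks standing in for the avatar display control unit 111.

```python
def on_viewing_info_received(notification: dict, self_info: ViewingInfo,
                             persons: list[str], participants: dict[str, ViewingInfo],
                             show_avatar, hide_avatar) -> None:
    """Handle viewing information received from another terminal (FIG. 12, S135-S143)."""
    sender = notification["participant_id"]
    received = ViewingInfo(notification["channel_number"],
                           notification["station_name"],
                           notification["program_name"])
    if received == self_info:
        # S137-S141: the other party switched to the same program; register it and display its avatar.
        if sender not in persons:
            persons.append(sender)
            show_avatar(sender)
    elif sender in persons:
        # S145-S151: the other party left this program; remove it and erase its avatar.
        persons.remove(sender)
        hide_avatar(sender)
    # S143: in any case, record the other party's latest viewing information (FIG. 5).
    participants[sender] = received
```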

  By executing the above-described processing in the communication terminal 100 according to the present embodiment, when, for example, participants B and D of this system are each viewing the television program "News N" distributed from broadcast station A on their TVs 200b and 200d, the avatar image 500d representing participant D, who is watching the same program as participant B, is displayed on the TV 200b over the image of the program (or together with the image), as shown in FIG. 9.

  Suppose that participants B and D are watching the television program "News N" distributed by broadcast station A on the TVs 200b and 200d, respectively, and that the channel of the TV 200a used by participant A is then changed so that the same program is viewed there as well. In this case, as shown in FIG. 13, the TV 200b first displays, superimposed on the image of the program "News N" (or together with the image), the avatar image 500b of participant B's own avatar and the avatar image 500d of participant D, who is a viewing information person. Then, viewing information is received from the communication terminal 100a of participant A, who has started viewing the program, the above-described processing is performed, and in step S141 the avatar image 500a of participant A is newly displayed. Note that when a new avatar image is displayed in step S141, an avatar image different from the normal display state may be shown so as to indicate that it is newly displayed, for example an avatar image animated to show the participant entering, or an avatar image paired with an image indicating that it is newly displayed. Instead of, or together with, such a display, the same fact may be expressed by voice.

  Suppose instead that participants A, B, and D are watching the television program "News N" distributed by broadcast station A on the TVs 200a, 200b, and 200d, respectively, and that the channel of the TV 200a used by participant A is then changed so that a different television program is viewed there. In this case, as shown in FIG. 14, the image display unit of the TV 200b first displays, superimposed on the image of the program "News N" (or together with the image), the avatar image 500b of participant B's own avatar and the avatar images 500a and 500d of participants A and D, who are viewing information persons. Then, viewing information is received from the communication terminal 100a of participant A, who has started viewing a different program, the above-described processing is performed, and in step S151 the display of the avatar image 500a of participant A is erased. Note that when the display of an avatar image is erased in step S151, an avatar image different from the normal display state may be shown to indicate that the display is being removed, for example an avatar image animated to show the participant leaving, or an avatar image paired with an image indicating that the display is being removed. Instead of, or together with, such a display, the same fact may be expressed by voice.

  By displaying avatars indicating the other participants who are viewing in this way, even when content distributed in real time is viewed individually, for example at home, the viewer can feel the sense of sharing a place with other viewers, the presence of the other participants, and a sense of unity with them, and communication that feels "connected" with other viewers can be realized.

  Next, the processing performed by the communication terminal 100 according to the present embodiment in response to an operation on an avatar will be described.

  The process shown in the flowchart of FIG. 15 is performed when an operation is performed on the own avatar at the communication terminal 100, and is realized by the CPU 11 of the communication terminal 100 reading a program recorded in the ROM 12 or the like onto the RAM 13, executing it, and controlling each part shown in FIG. 2 so as to exhibit each function shown in FIG. 3.

  Referring to FIG. 15, the avatar management unit 106 monitors whether an operation on the own avatar is received from the avatar operation unit 107 (S161).

  When an operation on the own avatar is detected in step S161 (YES in S163), the own avatar information stored in the own avatar information table shown in FIG. 7 is updated based on the operation signal passed from the avatar operation unit 107 (S165).

  Furthermore, the updated own avatar information is transmitted to the communication terminals 100 of all participants via the network communication control unit 101 (S167). FIG. 16 shows a specific example of the avatar information transmitted to the communication terminals 100 of all participants in step S167. As shown in FIG. 16, in step S167 the participant ID, which is unique information uniquely identifying the user of the communication terminal 100, is transmitted to the other communication terminals 100 together with the updated own avatar information, namely the display position of the own avatar after the operation, the "appearance", which is the information for identifying the avatar image used as the operated avatar among the images stored in the avatar information storage unit 110, and the "motion", which is the information for identifying the image corresponding to the state of the operated avatar among the images stored in the avatar information storage unit 110.

  In step S167, the information may be transmitted to the communication terminals 100 of all participants of this system registered in advance in the participant viewing information table, or it may be transmitted to the other communication terminals 100 by broadcast via the network 300.

  Then, the avatar display control unit 111 transmits a display control signal, generated based on the updated own avatar information passed from the avatar management unit 106, via the TV communication control unit 102 to the connected TV 200, and the display of the own avatar on the image display unit (not shown) is updated (S167).

  Thus, the process corresponding to an operation on the own avatar at the communication terminal 100 connected to the TV 200 ends. The process then returns to step S161, and whether an operation on the own avatar is received from the avatar operation unit 107 is again monitored.
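
  Analogously to the channel-change case, an operation on the own avatar updates the local table, notifies the other terminals with a FIG. 16-style payload, and refreshes the local display. The sketch below continues the earlier fragments; the operation dictionary and the helper callbacks are assumptions made only for illustration.

```python
def on_own_avatar_operation(my_id: str, avatar: AvatarInfo, operation: dict,
                            broadcast, redraw_own_avatar) -> None:
    """Handle an operation on the own avatar (FIG. 15, S165-S167)."""
    # S165: reflect the operation in the own avatar information table (FIG. 7).
    avatar.position = operation.get("position", avatar.position)
    avatar.appearance_id = operation.get("appearance", avatar.appearance_id)
    avatar.motion_id = operation.get("motion", avatar.motion_id)
    # S167: send the updated own avatar information (FIG. 16) to the other terminals.
    broadcast({
        "participant_id": my_id,
        "position": avatar.position,
        "appearance": avatar.appearance_id,
        "motion": avatar.motion_id,
    })
    # S167 (display): refresh the own avatar shown on the connected TV 200.
    redraw_own_avatar(avatar)
```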

  Next, with reference to FIG. 17, a description will be given of the processing performed when the above-described processing has been carried out at another communication terminal 100 and avatar information is received from that terminal in accordance with an operation on its own avatar there. The processing shown in the flowchart of FIG. 17 is realized by the CPU 11 of the communication terminal 100 reading a program recorded in the ROM 12 or the like onto the RAM 13, executing it, and controlling each part shown in FIG. 2 so as to exhibit each function shown in FIG. 3.

  Referring to FIG. 17, the avatar management unit 106 monitors, via the network communication control unit 101, the reception of avatar information from the other communication terminals 100 connected via the network 300 (S171).

  If reception of avatar information from another communication terminal 100 connected via the network 300 is detected in step S171 (YES in S173), the viewing information management unit 103 extracts the viewing information of the other party's communication terminal 100 from the participant viewing information table shown in FIG. 5 and compares it with the self-viewing information stored in the self-viewing information table shown in FIG. 4 (S175).

  If the viewing information matches as a result of the comparison in step S175 (YES in S177), that is, if the avatar information accompanying an avatar change has been received from another communication terminal 100 on which the same television program as the one currently being viewed by the user of this communication terminal 100 is being viewed, the avatar display control unit 111 transmits a display control signal, generated based on the avatar information passed from the avatar management unit 106, via the TV communication control unit 102 to the connected TV 200, and the display of the avatar corresponding to the other party on the image display unit (not shown) is updated (S179).

  In step S179, when avatar information such as that shown in FIG. 16 is received, the avatar management unit 106 reads from the avatar information storage unit 110 the avatar image in which the predetermined character specified by the "appearance" performs the predetermined motion specified by the "motion", and the avatar display control unit 111 generates a display control signal for displaying that avatar image at the position on the image display unit of the TV 200 specified by the display position.

  Note that if the viewing information does not match as a result of the comparison in step S175 (NO in S177), that is, if the avatar information accompanying an avatar change has been received from another communication terminal 100 on which a television program different from the one currently being viewed by the user of this communication terminal 100 is being viewed, the process of step S179 described above is skipped.

  Then, based on the received avatar information of the other party, the avatar management unit 106 updates the participant avatar information table shown in FIG. 8, so that the avatar information corresponding to that participant is brought up to date (S181).

  Thus, the process associated with an operation on the own avatar at another communication terminal 100 connected to this communication terminal 100 via the network 300 ends. The process then returns to step S171, and the avatar management unit 106 again monitors, via the network communication control unit 101, the reception of avatar information from the other communication terminals 100 connected via the network 300.

  In the above-described processing, the processing in step S181 is not limited to this timing, and may be executed at another timing.
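
  On the receiving side (FIG. 17), the avatar update is always recorded but is reflected on screen only when the sender is watching the same program. A sketch continuing the earlier fragments, with redraw_avatar as a placeholder callback:

```python
def on_avatar_info_received(update: dict, persons: list[str],
                            participant_avatars: dict[str, ParticipantAvatarInfo],
                            redraw_avatar) -> None:
    """Handle avatar information received from another terminal (FIG. 17, S175-S181)."""
    sender = update["participant_id"]
    previous = participant_avatars.get(sender)
    info = ParticipantAvatarInfo(
        position=update["position"],
        appearance_id=update["appearance"],
        motion_id=update["motion"],
        voice_message_count=previous.voice_message_count if previous else 0,
    )
    # S177-S179: update the on-screen avatar only if the sender is watching the same program.
    if sender in persons:
        redraw_avatar(sender, info)
    # S181: record the latest avatar information in the participant avatar information table (FIG. 8).
    participant_avatars[sender] = info
```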

  By performing the above-described processing in the communication terminal 100 according to the present embodiment, a participant watching a television program on a TV 200 can express impressions and feelings by manipulating the movement of the avatar, the character indicating that the participant is viewing that program. In addition, since the avatar operation is reflected on the TVs 200 of the other participants watching the same program, impressions and feelings can be exchanged in real time among the participants watching that program. In this way, even when content distributed in real time is viewed individually, for example at home, viewers can freely exchange impressions and feelings as if the other participants, who are not actually there, were present, can feel the sense of sharing a place with other viewers watching the same content, and can feel the presence of the other participants and a sense of unity with them, so that communication with other viewers is realized.

  In the above-described example, the communication terminals 100 exchange information (for example, uniquely assigned identification information such as an ID) for identifying, among the images stored in advance in the avatar information storage unit 110 of each communication terminal 100, the avatar image in which a predetermined character performs a predetermined motion, and the receiving communication terminal 100 identifies the corresponding avatar image based on that information; instead, however, image data itself may be exchanged between the terminals as avatar images. In addition, processing in which image data desired to be used as an avatar image or as an image corresponding to an avatar state (for example, a photograph of a participant) is newly transmitted to the other communication terminals 100 and stored in the avatar information storage unit 110 of each communication terminal 100 may be performed at an arbitrary timing. In this case, each communication terminal 100 newly allocates unique identification information shared among the communication terminals 100 to that image data and stores it in the avatar information storage unit 110.

  Next, the processing relating to voice messages exchanged with other participants' communication terminals 100 in the communication terminal 100 according to the present embodiment will be described with reference to FIG. 18. The process shown in the flowchart of FIG. 18 is performed when an operation relating to a voice message with another communication terminal 100 is performed at the communication terminal 100, and is realized by the CPU 11 of the communication terminal 100 reading a program recorded in the ROM 12 or the like onto the RAM 13, executing it, and controlling each part shown in FIG. 2 so as to exhibit each function shown in FIG. 3.

  Referring to FIG. 18, the avatar management unit 106 monitors whether an operation has been performed from the avatar designation unit 108 to designate a predetermined avatar from among the one or more participant avatars currently displayed on the image display unit (not shown) of the TV 200 (S201).

  When an operation designating a predetermined avatar is detected in step S201 (YES in S203), an instruction input directed to the communication terminal 100 of the participant corresponding to the avatar designated in the avatar operation unit 107 is further accepted (S205). The avatar management unit 106 then analyzes the instruction content input in step S205, and processing proceeds according to that content.

  When the instruction content input in step S205 is transmission of a voice message to the communication terminal 100 of the participant corresponding to the designated avatar (YES in S207), the message management unit 112 accepts the voice message input from the message input unit 114 (S209). The avatar management unit 106 then specifies the participant corresponding to the designated avatar as the transmission partner, and the message management unit 112 transmits the voice message to the corresponding communication terminal 100 via the network communication control unit 101 (S213).

  To illustrate the voice message transmitted in step S213, FIG. 19 shows a specific example of the voice message transmitted when, at the communication terminal 100 of the participant identified by participant ID “002”, the avatar corresponding to the participant identified by participant ID “001” is designated and the voice message “Yes!” is input. Referring to FIG. 19, in this case the communication terminal 100 of the participant identified by participant ID “002” transmits to the corresponding communication terminal 100 information including the participant ID (002) of the participant who uses the transmitting communication terminal 100, the participant ID (001) of the participant corresponding to the avatar designated as the transmission partner, the transmission time at the transmitting communication terminal 100, the viewing information at the transmitting communication terminal 100 at the time of transmission (the channel number, broadcast station name, and program name of the television program being viewed on the TV 200 connected at that time), and the content of the input voice message. The participant ID of the participant corresponding to the avatar designated as the transmission partner, which is the information relating to the destination communication terminal 100 included in this information, is acquired by the avatar management unit 106 from the avatar information table stored in the avatar state storage unit 109 and passed to the message management unit 112. The viewing information of the transmitting communication terminal 100 at the time of transmission is acquired by the viewing information management unit 103 from the own viewing information table stored in the viewing information storage unit 105 and passed to the message management unit 112.
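
  The items enumerated for FIG. 19 can be gathered into a single record; the following Python sketch is only one possible representation, with field names and example values chosen for illustration.

    from dataclasses import dataclass
    import datetime

    # One possible in-memory representation of the voice message of FIG. 19.
    # The patent lists the items but does not define a concrete format.
    @dataclass
    class VoiceMessage:
        sender_id: str               # participant ID of the transmitting terminal, e.g. "002"
        recipient_id: str            # participant ID of the designated avatar, e.g. "001"
        sent_at: datetime.datetime   # transmission time at the transmitting terminal
        channel_number: int          # viewing information of the transmitting terminal
        broadcast_station: str       #   at the time of transmission
        program_name: str
        audio: bytes                 # content of the input voice message

    # Hypothetical example values
    msg = VoiceMessage(
        sender_id="002",
        recipient_id="001",
        sent_at=datetime.datetime.now(),
        channel_number=4,
        broadcast_station="Station X",
        program_name="Program Y",
        audio=b"...",
    )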

  On the other hand, when the instruction content input in step S205 is reproduction of a voice message transmitted from the communication terminal 100 of the participant corresponding to the designated avatar (NO in S207 and YES in S215), the avatar management unit 106 specifies the participant corresponding to the designated avatar, and the message management unit 112 checks whether a voice message transmitted from that participant is stored in the message storage unit 113 (S217).

  FIG. 20 shows a specific example of the information relating to the voice messages stored in the message storage unit 113. Voice messages such as the one shown in FIG. 19 that are transmitted from other communication terminals 100 are stored in the message storage unit 113 as they are, one entry per voice message, as shown in FIG. 20. The information relating to a voice message stored in the message storage unit 113 may be deleted automatically once it has been reproduced by the message output unit 115 in the processing described below, or it may be deleted in response to an operation from the message operation unit 116.

  When the corresponding voice message is stored in the message storage unit 113 (YES in S219), the message output unit 115 reproduces the voice message (S221). At the same time, the avatar management unit 106 may update the information on the presence or absence of a message in the avatar information of the corresponding participant in the participant avatar information table. When no corresponding voice message is stored in the message storage unit 113 (NO in S219), the processing of step S221 is skipped.

  Further, when the instruction content input in step S205 is deletion of a voice message transmitted from the communication terminal 100 of the participant corresponding to the designated avatar (NO in S207 and NO in S215), then, as in step S217, the avatar management unit 106 specifies the participant corresponding to the designated avatar, and the message management unit 112 checks whether a voice message transmitted from that participant is stored in the message storage unit 113 (S223).

  When the corresponding voice message is stored in the message storage unit 113 (YES in S225), the message management unit 112 deletes the voice message from the message storage unit 113 (S227). At the same time, the avatar management unit 106 may update the information on the presence or absence of a message in the avatar information of the corresponding participant in the participant avatar information table. When no corresponding voice message is stored in the message storage unit 113 (NO in S225), the processing of step S227 is skipped.
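
  Putting the reproduction branch (S215-S221) and the deletion branch (S223-S227) side by side, a minimal sketch could look as follows; it assumes messages are kept per sending participant, and play_audio() merely stands in for the message output unit 115. All names are illustrative.

    # Sketch of the play and delete branches. The storage layout (a dict of
    # per-sender lists) and the helper names are assumptions for illustration.
    message_storage = {}  # sender participant ID -> list of stored voice messages

    def play_audio(message) -> None:
        """Placeholder for reproduction of a stored voice message by the
        message output unit 115."""
        pass

    def play_messages_from(sender_id: str) -> bool:
        """Reproduce stored voice messages from the designated participant (S217-S221)."""
        messages = message_storage.get(sender_id, [])
        if not messages:              # corresponds to NO in S219: skip S221
            return False
        for message in messages:
            play_audio(message)       # S221
        return True

    def delete_messages_from(sender_id: str) -> bool:
        """Delete stored voice messages from the designated participant (S223-S227)."""
        if sender_id not in message_storage:   # corresponds to NO in S225: skip S227
            return False
        del message_storage[sender_id]         # S227
        return True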

  This completes the processing for a voice message exchanged with the communication terminal 100 of another participant. The process then returns to step S201, and the avatar management unit 106 again monitors whether an operation has been performed from the avatar designation unit 108 to designate a predetermined avatar from among the one or more participant avatars currently displayed on the image display unit (not shown) of the TV 200.

  Next, the processing performed when a voice message is received from another communication terminal 100 that has carried out the above-described processing will be described with reference to FIG. 21. The processing shown in the flowchart of FIG. 21 is realized by the CPU 11 of the communication terminal 100 reading a program recorded in the ROM 12 or the like onto the RAM 13, executing it, and controlling each unit shown in FIG. 2 so that each function shown in FIG. 3 is exhibited.

  Referring to FIG. 21, the message management unit 112 monitors, via the network communication control unit 101, the reception of a voice message from another communication terminal 100 connected via the network 300 (S241).

  When the reception of a voice message from another communication terminal 100 connected via the network 300 is detected in step S241 (YES in S243), the received voice message is accumulated in the message storage unit 113 as shown in FIG. 20 (S245).

  The message management unit 112 then notifies the avatar management unit 106 that the voice message has been received, and the avatar management unit 106 specifies the participant who sent it. Further, a display control signal generated in the avatar display control unit 111 is transmitted to the TV 200 connected via the TV communication control unit 102, and the display of the avatar corresponding to the sending participant shown on the image display unit (not shown) is updated to a display indicating that a message has been received from that participant (S247). At that time, the avatar management unit 106 may also increment by one the number of voice messages transmitted from that participant in the avatar information of the corresponding participant in the participant avatar information table shown in FIG. 8.
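
  The receive path (S245-S247) can be sketched in the same spirit; here the message store and the avatar table are simple dictionaries, and update_tv_display() stands in for the display control signal sent to the TV 200. All names are illustrative.

    # Sketch of the receive-side handling. Data layouts and helper names are
    # assumptions; the patent only describes the steps, not an implementation.
    message_storage = {}           # sender participant ID -> list of voice messages
    participant_avatar_table = {}  # participant ID -> avatar information

    def update_tv_display(participant_id: str, avatar_entry: dict) -> None:
        """Placeholder for generating a display control signal for the TV 200."""
        pass

    def on_voice_message_received(message: dict) -> None:
        sender_id = message["sender_id"]

        # S245: accumulate the received voice message in the message store
        message_storage.setdefault(sender_id, []).append(message)

        # S247: switch the sender's avatar to a "message received" display and,
        # optionally, increment the count of messages from that participant
        entry = participant_avatar_table.setdefault(sender_id, {})
        entry["unread_messages"] = entry.get("unread_messages", 0) + 1
        entry["state"] = "message_received"
        update_tv_display(sender_id, entry)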

  This completes the processing performed when a voice message is received from another communication terminal 100. The process then returns to step S241, and the message management unit 112 again monitors, via the network communication control unit 101, the reception of voice messages from the other communication terminals 100 connected via the network 300.

  When the communication terminal 100 according to the present embodiment executes the above-described processing, for example when the communication terminal 100b of participant B receives a voice message transmitted from the communication terminal 100d of participant D, an avatar image as shown in FIG. 22 is displayed on the image display unit of TV 200b. That is, referring to FIG. 22, an avatar image 500d corresponding to participant D is displayed, indicating that a voice message addressed from participant D to participant B has been received from the communication terminal 100d.

  In the avatar display shown in FIG. 22, when the avatar image 500d is designated with the avatar designation unit 108 and the message operation unit 116 is operated to instruct reproduction or deletion of the voice message, the processing described above is executed, namely the process of reproducing the voice message (S221) or the process of deleting it (S227).

  In other words, when the communication terminal 100 according to the present embodiment performs the above-described processing, even while viewing content distributed in real time individually, for example at home, a viewer can freely exchange multimedia data such as voice messages with other participants who are not physically present, as if they were there. The viewer can feel as if sharing a place with the other viewers watching the same content, sense the presence of the other participants and a sense of unity with them, and thereby obtain communication that feels “connected” with other viewers. Furthermore, since multimedia data received from other participants can be reproduced at an arbitrary timing as described above, such communication can be achieved without hindering the participant's viewing.

  In the above-described embodiment, as shown in FIG. 1, a system configuration has been described in which each communication terminal 100 includes its own storage devices and exchanges data directly via the network 300; however, the configuration is not limited to this.

  Another specific example of the system configuration includes a server 600 connected to the network 300, as shown in FIG. 23. In this case, as illustrated in FIG. 24, the communication terminal 100 is configured to include, among the functions illustrated in FIG. 3, those other than the viewing information storage unit 105, the avatar state storage unit 109, the avatar information storage unit 110, and the message storage unit 113, while the server 600 is configured to include a network communication control unit 601 for communicating with the communication terminals 100 via the network 300, together with the viewing information storage unit 105, the avatar state storage unit 109, the avatar information storage unit 110, and the message storage unit 113 described above. When the above-described processing is executed, the viewing information management unit 103, the avatar management unit 106, and the message management unit 112 of each communication terminal 100 access the corresponding storage units included in the server 600 via the network communication control unit 101.
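
  The server-based variant can be pictured roughly as follows: the four storage units live on the server and each terminal reaches them through a thin client object instead of local storage. The class and method names are illustrative only, and the network hop is collapsed into a direct call.

    # Rough sketch of the configuration of FIG. 23/FIG. 24. The interface is an
    # assumption for illustration; the patent does not specify an access protocol.
    class ServerStorage:
        """Stands in for server 600 holding the storage units centrally."""
        def __init__(self) -> None:
            self.viewing_info = {}   # viewing information storage unit 105
            self.avatar_state = {}   # avatar state storage unit 109
            self.avatar_images = {}  # avatar information storage unit 110
            self.messages = {}       # message storage unit 113

    class RemoteStorageClient:
        """Used by a terminal's management units in place of local storage.
        In the real system the calls would travel over network 300 via the
        network communication control units 101 and 601."""
        def __init__(self, server: ServerStorage) -> None:
            self._server = server

        def put_viewing_info(self, participant_id: str, info: dict) -> None:
            self._server.viewing_info[participant_id] = info

        def get_viewing_info(self, participant_id: str) -> dict:
            return self._server.viewing_info.get(participant_id, {})

    # Usage: every terminal shares the single server-side store, so avatar
    # image data and the like can be managed centrally.
    server = ServerStorage()
    client = RemoteStorageClient(server)
    client.put_viewing_info("001", {"channel": 4, "program": "Program Y"})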

  By configuring the system in this way, the storage capacity required of each communication terminal 100 can be reduced and the configuration of the communication terminal 100 can be simplified. Moreover, since avatar image data and the like can be managed centrally in the server 600, management becomes easier.

  In the above-described embodiment, an avatar, which is a character image, is displayed as the information representing a participant who uses a TV 200. However, instead of the avatar, a character string indicating a participant viewing the same television program, such as the participant's name, may be displayed in a list. Alternatively, the information representing a participant who uses a TV 200 may be audio output from a speaker (not shown) instead of a display. Moreover, the output form of such information is not limited to output from the TV 200 together with the television program; for example, an indicator provided on the TV 200, such as a lamp, a tag, or a doll representing each participant, may be operated to represent the participants viewing the same television program.

  Furthermore, the method of each process performed by the communication terminal 100 described above can also be provided as a program. Such a program can be recorded on a computer-readable recording medium attached to a computer, such as a flexible disk, a CD-ROM, a ROM, a RAM, or a memory card, and provided as a program product. Alternatively, the program can be provided recorded on a recording medium such as a hard disk built into the computer, or by downloading via a network.

  The provided program product is installed in a program storage unit such as a hard disk and executed. The program product includes the program itself and a recording medium on which the program is recorded.

  The embodiment disclosed this time should be considered as illustrative in all points and not restrictive. The scope of the present invention is defined by the terms of the claims, rather than the description above, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.

Brief description of the drawings

FIG. 1 is a diagram showing a specific example of the configuration of a content transmission/reception system according to the present embodiment.
FIG. 2 is a diagram showing a specific example of the hardware configuration of a communication terminal.
FIG. 3 is a diagram showing a specific example of the functional configuration of the communication terminal 100.
FIG. 4 is a diagram showing a specific example of an own viewing information table.
FIG. 5 is a diagram showing a specific example of a participant viewing information table.
FIG. 6 is a diagram showing a specific example of a viewing participant table.
FIG. 7 is a diagram showing a specific example of an own avatar information table.
FIG. 8 is a diagram showing a specific example of a participant avatar information table.
FIG. 9 is a diagram showing a specific example of an avatar image displayed on the TV 200.
FIG. 10 is a flowchart showing processing according to a channel change operation executed in the communication terminal 100.
FIG. 11 is a diagram showing a specific example of the own viewing information transmitted in step S107.
FIG. 12 is a flowchart showing processing performed in the communication terminal 100 when viewing information is received from another communication terminal.
FIG. 13 is a diagram showing a specific example of an avatar image displayed on the TV 200.
FIG. 14 is a diagram showing a specific example of an avatar image displayed on the TV 200.
FIG. 15 is a flowchart showing processing performed when an operation on the own avatar takes place in the communication terminal.
FIG. 16 is a diagram showing a specific example of the avatar information transmitted in step S167.
FIG. 17 is a flowchart showing processing performed in the communication terminal 100 when avatar information is received from another communication terminal.
FIG. 18 is a flowchart showing processing performed in the communication terminal 100 when an operation concerning a voice message with another communication terminal takes place.
FIG. 19 is a diagram showing a specific example of the voice message transmitted in step S213.
FIG. 20 is a diagram showing a specific example of the information relating to voice messages stored in the message storage unit.
FIG. 21 is a flowchart showing processing performed in the communication terminal 100 when a voice message is received from another communication terminal.
FIG. 22 is a diagram showing a specific example of an avatar image displayed on the TV 200.
FIG. 23 is a diagram showing another specific example of the configuration of a content transmission/reception system according to the present embodiment.
FIG. 24 is a diagram showing a specific example of the functional configuration of the communication terminal 100 and the functional configuration of the server 600.

Explanation of symbols

  11 CPU, 12 ROM, 13 RAM, 14 recording medium reading unit, 15 input unit, 16 output unit, 17 communication unit, 18 bus, 100, 100a to 100d communication terminal, 101, 601 network communication control unit, 102 TV communication control unit, 103 viewing information management unit, 104 channel change detection unit, 105 viewing information storage unit, 106 avatar management unit, 107 avatar operation unit, 108 avatar designation unit, 109 avatar state storage unit, 110 avatar information storage unit, 111 avatar display control unit, 112 message management unit, 113 message storage unit, 114 message input unit, 115 message output unit, 116 message operation unit, 200, 200a to 200d TV, 300 network, 400 broadcast station, 500a, 500b, 500d avatar image, 600 server.

Claims (11)

  1. A communication terminal connected to a first content reproduction device that receives and reproduces content in real time, the communication terminal comprising:
    Content information acquisition means for acquiring first content information which is information relating to content being played back in the first content playback device;
    Detecting means for detecting a change in content being played back on the first content playback device based on the first content information;
    Content information transmitting means for transmitting, when the detecting means detects a change in the content being played back on the first content playback device, the changed first content information to another communication terminal connected to a second content playback device;
    Content information receiving means for receiving, from the other communication terminal, second content information transmitted by the content information transmitting means of the other communication terminal when a change in content is detected in the second content reproduction device, the second content information being information relating to the changed content being reproduced in the second content reproduction device; and
    Output means for comparing the first content information with the second content information and outputting a control signal according to the result of the comparison when the detecting means detects a change in the content being played back on the first content playback device or when a change in the content being played back on the second content playback device is detected,
    wherein the output means,
    when the first content information and the second content information are information relating to the same content, outputs to the first content reproduction device a control signal for causing the first content reproduction device to display information indicating the other communication terminal.
  2. The communication terminal according to claim 1, wherein the output means outputs, to the first content reproduction device, a control signal for preventing the display of the information indicating the other communication terminal displayed on the first content reproduction device when the first content information and the second content information are not information relating to the same content.
  3. The communication terminal according to claim 1, further comprising:
    Operation means for accepting an operation on information indicating the communication terminal;
    Operation information transmitting means for transmitting, to another communication terminal, first operation information representing an operation on the information indicating the communication terminal; and
    Operation information receiving means for receiving, from the other communication terminal, second operation information representing an operation on information indicating the other communication terminal,
    wherein the output means outputs, to the first content reproduction device, a control signal for causing the first content reproduction device to display the information indicating the other communication terminal based on the second operation information when the first content information and the second content information are information relating to the same content.
  4. The communication terminal according to claim 1, further comprising:
    Designating means for designating information indicating the other communication terminal displayed on the first content reproduction device; and
    Multimedia data transmitting means for transmitting multimedia data to the other communication terminal corresponding to the designated information.
  5. The communication terminal according to any one of claims 1 to 4, further comprising:
    Multimedia data receiving means for receiving multimedia data from the other communication terminal; and
    Storing means for storing the received multimedia data,
    wherein the output means outputs, to the first content reproduction device, a control signal for causing the first content reproduction device to display information indicating the other communication terminal from which the multimedia data has been received.
  6. The communication terminal according to claim 5, wherein
    the designating means designates information indicating the other communication terminal from which the multimedia data has been received, and
    the communication terminal further comprises reproduction means for reproducing, at a timing designated by the designating means, the multimedia data received from the other communication terminal corresponding to the designated information and stored in the storing means.
  7.   The communication terminal according to claim 4, wherein the multimedia data is audio data.
  8. A television receiver comprising:
    Television signal reproducing means for receiving and reproducing a television signal from a broadcasting station in real time;
    Detecting means for detecting a change in the television program being reproduced by the television signal reproducing means based on first viewing information that is information relating to the television program being reproduced by the television signal reproducing means;
    Content information transmitting means for transmitting the changed first viewing information to another device when the detecting means detects a change in the television program being reproduced by the television signal reproducing means;
    Viewing information receiving means for receiving second viewing information transmitted from the other device by the content information transmitting means of the other device when a change in the television program being reproduced in the other device is detected, the second viewing information being information relating to the changed television program being reproduced by the television signal reproducing means of the other device; and
    Output means for comparing the first viewing information with the second viewing information when the detecting means detects a change in the television program being reproduced by the television signal reproducing means or when a change in the television program being reproduced by the television signal reproducing means of the other device is detected, and for outputting information indicating the other device when the first viewing information and the second viewing information are information relating to the same television program.
  9. An information output method comprising:
    A content information acquisition step of acquiring, from a first content reproduction device that receives and reproduces content in real time, first content information that is information relating to the content being played back;
    A detection step of detecting a change in the content being played back on the first content reproduction device based on the first content information;
    A content information transmission step of transmitting the changed first content information to another communication terminal when a change in the content being played back on the first content reproduction device is detected in the detection step;
    A content information receiving step of receiving, from a communication terminal connected to a second content reproduction device different from the first content reproduction device, second content information transmitted in the content information transmission step of that communication terminal when a change in content is detected in the second content reproduction device, the second content information being information relating to the changed content being played back in the second content reproduction device; and
    A control step of controlling display so that, when it is detected in the detection step that the changed content being played back in the first content reproduction device is the same as the content being played back in the second content reproduction device, information indicating the second content reproduction device is displayed on the first content reproduction device, and so that, when it is detected that the content being played back in the first content reproduction device has changed from the same content as that being played back in the second content reproduction device to content different from the content being played back in the second content reproduction device, the information indicating the second content reproduction device displayed on the first content reproduction device is no longer displayed.
  10. An information output program for causing a computer to execute an information output process for outputting information about a communication terminal that communicates with a first content reproduction device that receives and reproduces content in real time,
    A content information acquisition step of acquiring first content information, which is information related to the content being played back, from the first content playback device;
    A detection step of detecting a change in content being played back on the first content playback device based on the first content information;
    A content information transmission step of transmitting the changed first content information to another communication terminal when a change in the content being reproduced on the first content reproduction device is detected in the detection step;
    A content information receiving step of receiving, from a communication terminal connected to a second content reproduction device different from the first content reproduction device, second content information transmitted in the content information transmission step of that communication terminal when a change in content is detected in the second content reproduction device, the second content information being information relating to the changed content being played back in the second content reproduction device; and
    A control step of controlling display so that, when it is detected in the detection step that the changed content being played back in the first content reproduction device is the same as the content being played back in the second content reproduction device, information indicating the second content reproduction device is displayed on the first content reproduction device, and so that, when it is detected that the content being played back in the first content reproduction device has changed from the same content as that being played back in the second content reproduction device to content different from the content being played back in the second content reproduction device, the information indicating the second content reproduction device displayed on the first content reproduction device is no longer displayed;
    the information output program causing the computer to execute these steps.
  11.   A computer-readable recording medium on which the information output program according to claim 10 is recorded.
JP2004082976A 2004-03-22 2004-03-22 Communication terminal, television receiver, information output method, information output program, and recording medium containing information output program Expired - Fee Related JP4236606B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004082976A JP4236606B2 (en) 2004-03-22 2004-03-22 Communication terminal, television receiver, information output method, information output program, and recording medium containing information output program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004082976A JP4236606B2 (en) 2004-03-22 2004-03-22 Communication terminal, television receiver, information output method, information output program, and recording medium containing information output program

Publications (2)

Publication Number Publication Date
JP2005269557A (en) 2005-09-29
JP4236606B2 (en) 2009-03-11

Family

ID=35093568

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004082976A Expired - Fee Related JP4236606B2 (en) 2004-03-22 2004-03-22 Communication terminal, television receiver, information output method, information output program, and recording medium containing information output program

Country Status (1)

Country Link
JP (1) JP4236606B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101873736B1 (en) * 2011-02-10 2018-07-03 엘지전자 주식회사 Electronic device and method for remotely viewing the same content at the same time

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5231093B2 (en) * 2008-06-17 2013-07-10 ヤフー株式会社 Content updating apparatus, method and program
US20100306671A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Avatar Integrated Shared Media Selection
WO2011145701A1 (en) * 2010-05-19 2011-11-24 シャープ株式会社 Source device, sink device, system, programme and recording medium
JP5626878B2 (en) * 2010-10-20 2014-11-19 Necカシオモバイルコミュニケーションズ株式会社 Viewing system, mobile terminal, server, viewing method
KR20140061620A (en) * 2012-11-13 2014-05-22 삼성전자주식회사 System and method for providing social network service using augmented reality, and devices

Also Published As

Publication number Publication date
JP2005269557A (en) 2005-09-29

Similar Documents

Publication Publication Date Title
US5835126A (en) Interactive system for a closed cable network which includes facsimiles and voice mail on a display
JP4456762B2 (en) Interactive video programming method
US7284202B1 (en) Interactive multi media user interface using affinity based categorization
US8655953B2 (en) System and method for playback positioning of distributed media co-viewers
JP4505141B2 (en) TV message system
JP4655190B2 (en) Information processing apparatus and method, recording medium, and program
US6753857B1 (en) Method and system for 3-D shared virtual environment display communication virtual conference and programs therefor
CN100369483C (en) TV chat system
AU774190B2 (en) Enhanced video programming system and method for providing distributed community network
CA2302616C (en) Apparatus for video access and control over computer network, including image correction
KR100616026B1 (en) Electric apparatus system, control method of an electric apparatus system, and recording medium
JP4765182B2 (en) Interactive television communication method and interactive television communication client device
US20040045039A1 (en) Host apparatus for simulating two way connectivity for one way data streams
CN1229988C (en) Synchronized video output method, system and apparatus
US20090013263A1 (en) Method and apparatus for selecting events to be displayed at virtual venues and social networking
KR101796005B1 (en) Media processing methods and arrangements
US7370342B2 (en) Method and apparatus for delivery of targeted video programming
US8307389B2 (en) Information processing apparatus, information processing method, computer program, and information sharing system
US20040010804A1 (en) Apparatus for video access and control over computer network, including image correction
JP4169180B2 (en) A portable communication device that simulates a bi-directional connection to a one-way data stream
CN1254101C (en) Reinforced video frequency program system and method for user simple information
US9684796B2 (en) Information processing system, service providing apparatus and method, information processing apparatus and method, recording medium, and program
US8112490B2 (en) System and method for providing a virtual environment with shared video on demand
CN106233740B (en) Recommend method, system and the medium of items of media content for rendering
EP0843168A2 (en) An information processing apparatus, an information processing method, and a medium for use in a three-dimensional virtual reality space sharing system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060125

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20071016

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20071213

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080930

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20081113

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20081209

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20081216

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111226

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121226

Year of fee payment: 4

LAPS Cancellation because of no payment of annual fees