WO2013065221A1 - Transmission terminal, reception terminal, and method for sending information


Info

Publication number
WO2013065221A1
Authority
WO
WIPO (PCT)
Prior art keywords
transmission
terminal
information
image
receiving
Prior art date
Application number
PCT/JP2012/005492
Other languages
French (fr)
Japanese (ja)
Inventor
幹直 池田
Original Assignee
Panasonic Corporation
Priority date
Filing date
Publication date
Application filed by Panasonic Corporation
Publication of WO2013065221A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Definitions

  • the present invention relates to an information transmission method for transmitting information input at a transmission terminal to a reception terminal.
  • in a known technique, the sending-side user terminal sends a comment and the elapsed time at which the comment was input to a server, and the receiving-side user terminal receives the comment and the elapsed time from the server and displays the comment on the screen at the corresponding elapsed time during reproduction of the moving image (for example, see Patent Document 1).
  • in Patent Document 1, on the assumption that the receiving-side user terminal holds the content, the comment is displayed in association with the elapsed time of the specific scene of the content.
  • if the receiving-side user terminal does not hold the content, the comment from the transmission-side user cannot be displayed in association with the elapsed time of the specific scene of the content. That is, in such a case, the transmission-side user cannot accurately convey, to the reception-side user, information that uses both the comment and the specific scene of the content.
  • in Patent Document 1, it is troublesome for the receiving-side user to record, at the receiving-side user terminal, all of the content on which the transmitting-side user has commented and to keep the recorded content.
  • in Patent Document 1, since it is assumed that the receiving-side user views the broadcast program after the broadcast has ended, the receiving-side user cannot confirm the transmitting-side user's comment on the broadcast program in real time during the broadcast.
  • the present invention has been made in view of the above points, and it is an object of the present invention to provide an information transmission method that, while avoiding copyright infringement and reducing the network load, can transmit transmission information including comments from the sending-side user to the receiving-side user accurately and in real time, and that can eliminate the trouble of recording the content at the receiving-side user terminal and holding the recorded content.
  • a transmitting terminal disclosed below includes a reproducing unit that reproduces moving image content, a display unit that displays the moving image content reproduced by the reproducing unit, a memory that captures an image indicating any scene of the displayed moving image content, a receiving unit that receives input of transmission information related to the captured image, a processor that generates or acquires scene specifying information, which is information specifying the image captured in the memory, and a transmitting unit that transmits the transmission information and the scene specifying information specifying the image related to the transmission information to the receiving terminal.
  • the receiving terminal obtains the moving image content reproduced by the reproducing unit from another source different from the transmitting terminal, captures an image indicating the scene specified by the scene specifying information received from the transmitting unit out of the obtained moving image content, and displays the transmission information received from the transmitting unit together with the captured image.
  • the receiving terminal disclosed below includes a reproducing unit that reproduces moving image content, a display unit that displays the moving image content reproduced by the reproducing unit, a receiving unit that receives transmission information and scene specifying information specifying an image related to the transmission information, and a memory that captures an image indicating the scene specified by the scene specifying information received from the transmitting terminal out of the displayed moving image content.
  • the display unit displays the transmission information received from the transmitting terminal together with the captured image. The transmitting terminal obtains the moving image content reproduced by the reproducing unit from another source different from the receiving terminal, generates or acquires the scene specifying information, which is information specifying the image it has captured, and transmits the transmission information and the scene specifying information specifying the image related to the transmission information to the receiving terminal.
  • the information transmission system disclosed below is an information transmission system including a transmission terminal possessed by a transmitting-side user and a reception terminal that can communicate with the transmission terminal and is possessed by a receiving-side user.
  • the transmission terminal includes a scene identification information transmission unit that transmits, to the reception terminal, scene identification information for identifying a specific scene in the content, and a transmission information transmission unit that transmits, to the reception terminal, transmission information to be displayed in association with the captured image captured based on the scene identification information.
  • the reception terminal includes a reception-side captured image acquisition unit that captures, based on the scene identification information received from the transmission terminal, a specific scene in the content viewable by the receiving-side user and acquires a reception-side captured image, and an image combining unit that combines the transmission information received from the transmission terminal with the reception-side captured image.
  • An information transmission system disclosed below is an information transmission system including a transmission terminal and a reception terminal capable of communicating with the transmission terminal.
  • the transmission terminal includes a capture instruction receiving unit that receives, from a user, a capture instruction with respect to real-time content being viewed by the user, a transmission-side captured image acquisition unit that captures at least one specific scene in the real-time content based on the capture instruction and acquires a transmission-side captured image, a scene identification information transmission unit that transmits, to the reception terminal, scene identification information for identifying the specific scene of the transmission-side captured image, a transmission information reception unit that receives, from the user, input of transmission information to be displayed superimposed on the transmission-side captured image, and a transmission information transmission unit that transmits the transmission information received by the transmission information reception unit to the reception terminal.
  • the reception terminal includes a scene identification information reception unit that receives the scene identification information from the transmission terminal, a reception-side captured image acquisition unit that captures the specific scene in the real-time content based on the received scene identification information and acquires a reception-side captured image, a transmission information reception unit that receives the transmission information from the transmission terminal, and an image superimposing unit that superimposes the received transmission information on the reception-side captured image.
  • An information transmission system disclosed below is an information transmission system including a transmission terminal, a reception terminal, and a management server capable of communicating with the transmission terminal and the reception terminal.
  • the transmission terminal includes a capture instruction receiving unit that receives, from a user, a capture instruction with respect to real-time content being viewed by the user, a transmission-side captured image acquisition unit that captures at least one specific scene in the real-time content based on the capture instruction and acquires a transmission-side captured image, a scene identification information transmission unit that transmits, to the management server, scene identification information for identifying the specific scene of the transmission-side captured image, a transmission information reception unit that receives, from the user, input of transmission information to be displayed superimposed on the transmission-side captured image, and a transmission information transmission unit that transmits the transmission information received by the transmission information reception unit to the management server.
  • the management server includes a scene identification information distribution unit that distributes the scene identification information received from the transmission terminal to the reception terminal, and a transmission information distribution unit that distributes the transmission information received from the transmission terminal to the reception terminal.
  • the reception terminal includes a scene identification information reception unit that receives the scene identification information from the management server, a reception-side captured image acquisition unit that captures the specific scene in the real-time content based on the received scene identification information and acquires a reception-side captured image, a transmission information reception unit that receives the transmission information from the management server, and an image superimposing unit that superimposes the received transmission information on the reception-side captured image.
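  • For illustration only (not part of the disclosed configuration), the following Python sketch shows one way the management server described above could relay scene identification information and transmission information to registered reception terminals; the class, method, and field names are hypothetical.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class SceneIdentification:
        # Hypothetical fields mirroring FIG. 8B: broadcast station ID, program ID,
        # reproduction time, and the ID number assigned to the captured frame.
        broadcast_station_id: str
        program_id: str
        reproduction_time: str
        id_number: str

    @dataclass
    class TransmissionInfo:
        # Hypothetical: the ID number of the related captured image plus the
        # user-entered payload (a comment, quiz, advertisement, or game).
        id_number: str
        payload: str

    class ManagementServer:
        """Relays data received from a transmission terminal to every registered
        reception terminal (scene identification information distribution unit and
        transmission information distribution unit, sketched as two callback lists)."""

        def __init__(self) -> None:
            self._scene_subscribers: List[Callable[[SceneIdentification], None]] = []
            self._info_subscribers: List[Callable[[TransmissionInfo], None]] = []

        def register_reception_terminal(self, on_scene, on_info) -> None:
            # A reception terminal registers one callback per kind of data.
            self._scene_subscribers.append(on_scene)
            self._info_subscribers.append(on_info)

        def distribute_scene_identification(self, scene: SceneIdentification) -> None:
            for deliver in self._scene_subscribers:
                deliver(scene)

        def distribute_transmission_info(self, info: TransmissionInfo) -> None:
            for deliver in self._info_subscribers:
                deliver(info)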
  • with the above configurations, transmission information including comments from the transmitting-side user can be transmitted to the receiving-side user accurately and in real time, and the trouble of recording the content at the receiving-side user terminal and keeping the recorded content can be eliminated.
  • FIG. 1 is a diagram illustrating an example of a system configuration of an information transmission system according to the first embodiment.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration in which the first transmission terminal 20 illustrated in FIG. 1 is realized using a CPU.
  • FIG. 3 is a diagram illustrating an example of a hardware configuration in which the second transmission terminal 30 illustrated in FIG. 1 is realized using a CPU.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration in which the receiving terminal 2 illustrated in FIG. 1 is realized using a CPU.
  • FIG. 5 is a diagram showing an example of a flowchart of processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits the transmission information of the user who views real-time content to the reception terminal 2.
  • FIG. 6A is a diagram illustrating an example when real-time content is displayed on the display 201 of the first transmission terminal 20.
  • FIG. 6B is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 6C is a diagram illustrating an example when a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 7A is a diagram illustrating an example of captured image data 205b held in the first transmission terminal 20.
  • FIG. 7B is a diagram illustrating an example of the captured image data 205b held in the first transmission terminal 20.
  • FIG. 8A is a diagram illustrating an example of the scene identification information data 205c held in the first transmission terminal 20.
  • FIG. 8B is a diagram showing an example of the scene identification information data 205c held in the first transmission terminal 20.
  • FIG. 9A is a diagram illustrating an example of captured image data 405b held in the receiving terminal 2.
  • FIG. 9B is a diagram illustrating an example of captured image data 405b held in the receiving terminal 2.
  • FIG. 10A is a diagram illustrating an example when the transmission information regarding the captured image displayed on the touch panel display 301 of the second transmission terminal 30 is input.
  • FIG. 10B is a diagram illustrating an example of transmission information input by the user on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 11A is a diagram illustrating an example of the transmission information data 305c held in the second transmission terminal 30.
  • FIG. 11B is a diagram illustrating an example of the transmission information data 305c held in the second transmission terminal 30.
  • FIG. 12A is a diagram illustrating an example of transmission information received by the receiving terminal 2 from the second transmitting terminal 30.
  • FIG. 12B is a diagram illustrating an example when a capture image is displayed on the display 401 of the receiving terminal 2.
  • FIG. 12C is a diagram illustrating an example of a superimposed image generated by superimposing a capture image and transmission information on the reception terminal 2.
  • FIG. 13A is a diagram illustrating an example when a superimposed image is displayed on real-time content on the display 401 of the receiving terminal 2.
  • FIG. 13B is a diagram illustrating an example when a superimposed image is displayed on the tablet terminal included in the reception terminal 2.
  • FIG. 14 is a diagram illustrating an example of a system configuration of an information transmission system when the transmission terminal 1 is configured by one device.
  • FIG. 15 is a diagram illustrating an example of a system configuration of the information transmission system when the first transmission terminal 20 includes the transmission information transmission unit 16.
  • FIG. 16 is a diagram illustrating an example of a system configuration of the information transmission system when the second transmission terminal 30 includes the scene identification information transmission unit 14.
  • FIG. 17 is a diagram illustrating an example of a system configuration of the information transmission system when the transmission terminal 1 and the reception terminal 2 are configured to receive real-time content from the content distribution server 4.
  • FIG. 18A is a diagram illustrating an example of a configuration of a receiving terminal when the receiving terminal 2 is configured by two devices.
  • FIG. 18B is a diagram illustrating an example of a configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26.
  • FIG. 18C is a diagram illustrating an example of the configuration of the receiving terminal when the second receiving terminal 50 is configured to include the scene identification information receiving unit 27.
  • FIG. 18D is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26 and the second receiving terminal 50 includes the scene identification information receiving unit 27.
  • FIG. 18E is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the image superimposing unit 24.
  • FIG. 19 is a diagram illustrating an example of a system configuration of an information transmission system according to the second embodiment.
  • FIG. 20 is a diagram illustrating an example of a hardware configuration in which the management server 5 illustrated in FIG. 19 is realized using a CPU.
  • FIG. 21A is a diagram showing an example of a flowchart of processing in which the transmitting terminal 1 (the first transmitting terminal 20 and the second transmitting terminal 30) transmits the transmission information of the user who views real-time content to the receiving terminal 2 via the management server.
  • FIG. 21B is a diagram showing an example of a flowchart of processing in which the transmitting terminal 1 (the first transmitting terminal 20 and the second transmitting terminal 30) transmits the transmission information of the user who views real-time content to the receiving terminal 2 via the management server.
  • FIG. 22 is a diagram illustrating an example when a superimposed image is displayed on real-time content on the display 401 of the receiving terminal 2.
  • FIG. 23A is a diagram schematically illustrating a relationship of main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
  • FIG. 23B is a diagram schematically illustrating a relationship between main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
  • FIG. 23C is a diagram schematically illustrating a relationship of main data transmitted / received between the transmission terminal 1 and the reception terminal 2.
  • FIG. 23D is a diagram schematically illustrating a relationship of main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
  • FIG. 23E is a diagram schematically illustrating a relationship of main data transmitted and received among the transmission terminal 1, the management server 5, and the reception terminal 2.
  • FIG. 24 is a diagram illustrating an example of a system configuration of an information transmission system according to the fourth embodiment.
  • FIG. 25 is a diagram showing an example of a flowchart of processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits quiz information as transmission information to the reception terminal 2 in the information transmission system of the fourth embodiment.
  • FIG. 26 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 27 is a diagram illustrating an example of a flowchart of quiz information creation processing executed by the second transmission terminal 30 in the information transmission system according to the fourth embodiment.
  • FIG. 28 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 29 is a diagram illustrating an example when a quiz setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 30 is a diagram illustrating an example when a quiz screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 31 is a diagram illustrating an example of the transmission information data 305c held as quiz information in the second transmission terminal 30.
  • FIG. 32 is a diagram illustrating an example when the quiz image 121 is superimposed and displayed on the real-time content.
  • FIG. 33 is a diagram illustrating an example of a system configuration of an information transmission system according to the fifth embodiment.
  • FIG. 34 is a diagram showing an example of a flowchart of processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits advertisement information as transmission information to the reception terminal 2 in the information transmission system of the fifth embodiment.
  • FIG. 35 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 36 is a diagram illustrating an example of a flowchart of the advertisement information creation process executed by the second transmission terminal 30 in the information transmission system of the fifth embodiment.
  • FIG. 37 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 38 is a diagram illustrating an example when an advertisement setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 39 is a diagram illustrating an example when an advertisement screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 40 is a diagram illustrating an example when the advertisement image 122 is superimposed and displayed on the real-time content.
  • FIG. 41 is a diagram illustrating an example of a system configuration of an information transmission system according to the sixth embodiment.
  • FIG. 42 is a diagram showing an example of a flowchart of processing in which the transmitting terminal 1 (the first transmitting terminal 20 and the second transmitting terminal 30) transmits game information as transmission information to the receiving terminal 2 in the information transmission system of the sixth embodiment.
  • FIG. 43 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 44 is a diagram illustrating an example of a flowchart of the game information creation process executed by the second transmission terminal 30 in the information transmission system of the sixth embodiment.
  • FIG. 45 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 46 is a diagram illustrating an example when a game setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 47 is a diagram illustrating an example when a game screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 48 is a diagram showing an example when the game image 123 is superimposed and displayed on the real-time content.
  • a television receiver is a device that can receive a radio wave of a television broadcast and allow a user to view real-time content.
  • real-time content refers to content, such as a television broadcast program, that is distributed all at once and can be viewed by each viewing user at the same timing. Therefore, real-time content is a concept that includes both live and recorded broadcast programs.
  • FIG. 1 is a diagram illustrating an example of a system configuration of an information transmission system according to the first embodiment.
  • the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via a network N so that they can communicate with each other.
  • the network N can be realized by the Internet, a local area network (LAN), a wide area network (WAN), or the like.
  • the transmission terminal 1 includes a first transmission terminal 20 and a second transmission terminal 30.
  • Each of the first transmission terminal 20 and the reception terminal 2 can receive broadcast radio waves (for example, terrestrial digital broadcast radio waves) from the broadcast station 3 and allow the user to view real-time content that is a broadcast program.
  • the first transmission terminal 20 and the reception terminal 2 are television receivers.
  • the first transmission terminal 20 or the reception terminal 2 can be a device (for example, a disk recording/playback device) that incorporates a tuner capable of receiving a television broadcast.
  • the second transmission terminal 30 is a tablet terminal that can communicate with the first transmission terminal 20.
  • the second transmission terminal 30 can also be a smartphone provided with a touch panel display or the like.
  • the first transmitting terminal 20 or the receiving terminal 2 can also be a device (for example, a set-top box or a disk recording/playback device) having a built-in tuner capable of receiving a television broadcast via a network such as cable TV.
  • in FIG. 1, only one receiving terminal 2 that can communicate with the transmitting terminal 1 is shown, but a plurality of receiving terminals 2 may exist. Likewise, only one combination of the transmission terminal 1 and the reception terminal 2 is shown, but a plurality of combinations may exist.
  • the first transmission terminal 20 includes a real-time content reception unit 11 (a reproduction unit that reproduces moving image content), an image display unit 12 (a display unit that displays moving image content), a transmission-side buffer unit 13 (a buffer unit that temporarily stores part or all of the moving image content), a scene identification information transmission unit 14 (a transmission unit that transmits scene specifying information), and a transmission-side capture image acquisition unit 15.
  • the second transmission terminal 30 includes a transmission information transmission unit 16, a capture instruction reception unit 17, a transmission information reception unit 18 (a reception unit that receives input of transmission information), and a captured image display unit 19.
  • the receiving terminal 2 includes a real-time content receiving unit 21 (a reproducing unit that reproduces moving image content), an image display unit 22 (a display unit that displays moving image content), a receiving-side buffer unit 23 (a buffer unit that temporarily stores part or all of the moving image content), an image superimposing unit 24, a receiving-side capture image acquiring unit 25, a transmission information receiving unit 26, and a scene identification information receiving unit 27 (a receiving unit that receives scene specifying information).
  • the real-time content receiver 11 in the first transmission terminal 20 receives and reproduces a broadcast program broadcasted by the broadcast station 3 as real-time content.
  • the image display unit 12 in the first transmission terminal 20 displays an image (for example, a moving image) of the content received and reproduced by the real-time content reception unit 11 in the first transmission terminal 20 so that the user of the transmission terminal 1 can view the content.
  • the transmission side buffer unit 13 in the first transmission terminal 20 temporarily stores part or all of the content image displayed by the image display unit 12 in the first transmission terminal 20.
  • in this example, the transmission-side buffer unit 13 temporarily stores part or all of the content image displayed by the image display unit 12; however, it may instead temporarily store part or all of the content image output by the real-time content reception unit 11. Further, the transmission-side buffer unit 13 may be arranged between the real-time content reception unit 11 and the image display unit 12 so that the image display unit 12 displays the content image output from the transmission-side buffer unit 13.
  • the capture instruction receiving unit 17 in the second transmission terminal 30 accepts, from the user who is viewing the content, a capture instruction for capturing an image of the content displayed by the image display unit 12 in the first transmission terminal 20 at a specific timing.
  • the transmission side capture image acquisition unit 15 in the first transmission terminal 20 acquires a captured image, for example in a memory, by capturing at least one specific scene in the real-time content based on the capture instruction received by the capture instruction reception unit 17 in the second transmission terminal 30.
  • the scene identification information transmission unit 14 in the first transmission terminal 20 transmits, to the receiving terminal 2, scene identification information (generated or acquired by a processor, for example) for identifying the specific scene of the capture image acquired by the transmission side capture image acquisition unit 15 in the first transmission terminal 20.
  • the capture image display unit 19 in the second transmission terminal 30 displays the capture image (for example, a still image) acquired by the transmission side capture image acquisition unit 15 in the first transmission terminal 20.
  • the transmission information reception unit 18 in the second transmission terminal 30 receives input of transmission information to be displayed superimposed on the capture image displayed by the capture image display unit 19 in the second transmission terminal 30 from the user.
  • the transmission information transmission unit 16 in the second transmission terminal 30 transmits the transmission information received from the user by the transmission information reception unit 18 in the second transmission terminal 30 to the reception terminal 2.
  • the real-time content receiving unit 21 in the receiving terminal 2 receives and reproduces a broadcast program broadcasted by the broadcast station 3 as real-time content, as in the case of the real-time content receiving unit 11 in the first transmitting terminal 20. Similar to the case of the image display unit 12 in the first transmission terminal 20, the image display unit 22 in the reception terminal 2 displays content images (for example, moving images) received and reproduced by the real-time content reception unit 21 in the reception terminal 2. Thus, the user of the receiving terminal 2 is allowed to view the content.
  • the receiving side buffer unit 23 in the receiving terminal 2 temporarily stores part or all of the content image displayed by the image display unit 22 in the receiving terminal 2.
  • in this example, the reception-side buffer unit 23 temporarily stores part or all of the content image displayed by the image display unit 22; however, it may instead temporarily store part or all of the content image received by the real-time content reception unit 21. Further, the receiving side buffer unit 23 may be positioned between the real-time content receiving unit 21 and the image display unit 22 so that the image display unit 22 displays the content image stored in the receiving side buffer unit 23.
  • the scene identification information receiving unit 27 in the receiving terminal 2 receives, from the first transmitting terminal 20, the scene identification information for identifying the specific scene of the captured image acquired by the transmitting-side captured image acquiring unit 15 of the first transmitting terminal 20. Based on this scene identification information, the receiving side capture image acquiring unit 25 in the receiving terminal 2 acquires a captured image, for example in a memory, by capturing at least one specific scene in the real-time content.
  • the transmission information receiving unit 26 in the receiving terminal 2 receives the transmission information received from the user by the transmission information receiving unit 18 in the second transmitting terminal 30 from the second transmitting terminal 30.
  • the image superimposing unit 24 in the receiving terminal 2 generates a superimposed image by superimposing the transmission information received by the transmission information receiving unit 26 in the receiving terminal 2 on the captured image acquired by the receiving side capture image acquiring unit 25 in the receiving terminal 2.
  • the image display unit 22 in the receiving terminal 2 displays the superimposed image generated by the image superimposing unit 24 in the receiving terminal 2 so that the user of the receiving terminal 2 can visually recognize the superimposed image.
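  • As an illustration of what the image superimposing unit 24 does, the following Python sketch (using the Pillow library, which is assumed to be available; file names and layout are hypothetical) overlays a received text comment on a receiving-side captured image.

    from PIL import Image, ImageDraw  # Pillow is assumed to be available

    def superimpose(capture_path: str, comment: str, out_path: str) -> None:
        """Draw the received transmission information (here, a plain text comment)
        on top of the receiving-side captured image and save the superimposed image."""
        frame = Image.open(capture_path).convert("RGBA")
        overlay = Image.new("RGBA", frame.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        # Place the comment on a translucent band near the bottom of the frame.
        band_top = int(frame.height * 0.85)
        draw.rectangle([(0, band_top), (frame.width, frame.height)], fill=(0, 0, 0, 128))
        draw.text((10, band_top + 10), comment, fill=(255, 255, 255, 255))
        Image.alpha_composite(frame, overlay).convert("RGB").save(out_path)

    # Hypothetical usage: overlay the comment received by the transmission information
    # receiving unit 26 on the capture acquired by the receiving side capture image
    # acquisition unit 25.
    # superimpose("R_Cap003.bmp", "Look at this scene!", "superimposed.png")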
  • Each functional unit shown in FIG. 1 (real-time content receiving unit 11, image display unit 12, scene identification information transmission unit 14, transmission side capture image acquisition unit 15, transmission information transmission unit 16, capture instruction reception unit 17, transmission information reception unit 18, captured image display unit 19, real-time content reception unit 21, image display unit 22, image superimposition unit 24, reception side capture image acquisition unit 25, transmission information reception unit 26, and scene identification information reception unit 27) is realized as a function of a CPU executing a program.
  • the program includes not only a program that can be directly executed by the CPU, but also a source format program, a compressed program, an encrypted program, and the like.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration in which the first transmission terminal 20 illustrated in FIG. 1 is realized using a CPU.
  • the first transmission terminal 20 includes a display 201, a CPU (Central Processing Unit) 202, a RAM (Random Access Memory) 203, an operation button 204, a flash memory 205, a ROM (Read Only Memory) 206, a tuner circuit 207, a first communication circuit 208, a second communication circuit 209, and an external connection port 210. These are connected to each other via an internal bus 200.
  • the display 201 can display an image of real-time content output according to a command from the CPU 202.
  • the CPU 202 can execute processing based on an OS (Operating System) 206a and a first transmission terminal control program 205a.
  • the RAM 203 can provide an address space to the CPU 202.
  • the operation button 204 can accept a basic operation of the first transmission terminal 20 (for example, a power on / off operation) from the user.
  • the flash memory 205 can hold the first transmission terminal control program 205a.
  • the external connection port 210 can be connected to an external connection type hard disk drive 211.
  • the external connection port 210 can be connected to an external device that can be connected based on, for example, the USB (Universal Serial Bus) standard or the IEEE 1394 standard.
  • a part of the externally connected hard disk drive 211 can function as the ring buffer 203a.
  • the external connection type hard disk drive 211 can hold captured image data 205b, scene identification information data 205c, and the like.
  • any or all of the ring buffer 203a, the captured image data 205b, and the scene identification information data 205c may be held in the RAM 203, the flash memory 205, an internal connection type hard disk drive (not shown), a network connection type hard disk drive (not shown), or a network storage (not shown).
  • an internal connection type hard disk drive (not shown) is a hard disk drive that is directly connected to the internal bus 200 without going through the external connection port 210.
  • a network-connected hard disk drive is a hard disk drive connected via a network N such as the Internet.
  • the network storage (not shown) is a file server (not shown) connected via a network N such as the Internet.
  • the ROM 206 can hold the OS 206a.
  • the tuner circuit 207 can receive a plurality of radio waves of broadcast programs broadcast by the broadcast station 3.
  • the first communication circuit 208 can communicate with the receiving terminal 2 via the network N (FIG. 1).
  • the first communication circuit 208 can perform communication using, for example, TCP / IP (Transmission Control Protocol / Internet Protocol).
  • the second communication circuit 209 can communicate with the second transmission terminal 30.
  • the second communication circuit 209 can perform, for example, wireless communication based on a standard such as IrDA (trademark), Bluetooth (trademark), Wireless USB, or Wi-Fi (trademark), or wired communication based on a standard such as USB (Universal Serial Bus) or HDMI (High-Definition Multimedia Interface).
  • the real-time content reception unit 11, the image display unit 12, the scene identification information transmission unit 14, and the transmission side capture image acquisition unit 15 included in the first transmission terminal 20 illustrated in FIG. 1 are realized by the CPU 202 executing the first transmission terminal control program 205a (FIG. 2).
  • the real-time content received by the real-time content receiver 11 shown in FIG. 1 or the real-time content displayed by the image display unit 12 shown in FIG. 1 is held, for example, in the ring buffer 203a (FIG. 2) of the externally connected hard disk drive 211.
  • the captured image acquired by the transmission-side captured image acquisition unit 15 illustrated in FIG. 1 is held in the captured image data 205b (FIG. 2) of the external connection type hard disk drive 211 as an example.
  • the scene identification information for identifying the specific scene of the captured image acquired by the transmission-side captured image acquisition unit 15 illustrated in FIG. 1 is held in the scene identification information data 205c (FIG. 2) of the externally connected hard disk drive 211 as an example.
  • FIG. 3 is a diagram illustrating an example of a hardware configuration in which the second transmission terminal 30 illustrated in FIG. 1 is realized using a CPU.
  • the second transmission terminal 30 includes a touch panel display 301, a CPU 302, a RAM 303, operation buttons 304, a flash memory 305, a ROM 306, a first communication circuit 307, and a second communication circuit 308.
  • the touch panel display 301 can display a captured image output by a command from the CPU 302 on the screen.
  • the touch panel display 301 can accept an instruction operation (for example, a contact operation by a user's fingertip or stylus) that specifies a position on the screen from the user.
  • the CPU 302 can execute processing based on the OS 306a and the second transmission terminal control program 305a.
  • the RAM 303 can provide an address space to the CPU 302.
  • the operation button 304 can accept a basic operation (for example, a power on / off operation) of the second transmission terminal 30 from the user.
  • the flash memory 305 can hold the second transmission terminal control program 305a, capture image data 305b, transmission information data 305c, and the like.
  • the ROM 306 can hold the OS 306a. Note that the captured image data 305b and the transmission information data 305c may be held in the RAM 303.
  • the first communication circuit 307 can communicate with the receiving terminal 2 via the network N (FIG. 1) as in the case of the first transmitting terminal 20.
  • the first communication circuit 307 can perform communication by, for example, TCP / IP, as in the case of the first transmission terminal 20.
  • the second communication circuit 308 can communicate with the first transmission terminal 20.
  • the second communication circuit 308 can perform wireless communication based on a standard such as IrDA (trademark), Bluetooth (trademark), Wireless USB, or Wi-Fi (trademark), or wired communication based on a standard such as USB or HDMI (trademark).
  • the transmission information transmission unit 16, the capture instruction reception unit 17, the transmission information reception unit 18, and the captured image display unit 19 configuring the second transmission terminal 30 illustrated in FIG. 1 are realized by the CPU 302 executing the second transmission terminal control program 305a (FIG. 3).
  • the captured image acquired by the transmission-side captured image acquisition unit 15 illustrated in FIG. 1 is held in captured image data 305b (FIG. 3) of the flash memory 305 as an example.
  • the transmission information received by the transmission information receiving unit 18 shown in FIG. 1 from the user is held in the transmission information data 305c (FIG. 3) of the flash memory 305 as an example.
  • FIG. 4 is a diagram illustrating an example of a hardware configuration in which the receiving terminal 2 illustrated in FIG. 1 is realized using a CPU.
  • the receiving terminal 2 includes a display 401, a CPU 402, a RAM 403, operation buttons 404, a flash memory 405, a ROM 406, a tuner circuit 407, a communication circuit 408, and an external connection port 410. These are connected to each other via an internal bus 400.
  • the display 401 can display an image of real-time content output in accordance with an instruction from the CPU 402.
  • the CPU 402 can execute processing based on the OS 406a and the receiving terminal control program 405a.
  • the RAM 403 can provide an address space to the CPU 402.
  • the operation button 404 can accept a basic operation of the receiving terminal 2 (for example, a power on / off operation) from the user.
  • the flash memory 405 can hold a receiving terminal control program 405a.
  • the external connection port 410 can be connected to an external connection type hard disk drive 411.
  • the external connection port 410 can connect an external device that can be connected based on, for example, the USB standard or the IEEE1394 standard.
  • a part of the externally connected hard disk drive 411 can function as the ring buffer 403a.
  • the external connection type hard disk drive 411 can hold capture image data 405b, transmission information data 405c, and the like.
  • any or all of the ring buffer 403a, the captured image data 405b, and the transmission information data 405c are stored in the RAM 403, the flash memory 405, an internal connection type hard disk drive (not shown), a network connection type hard disk drive (not shown), or It may be held in a network storage (not shown).
  • an internal connection type hard disk drive (not shown) is a hard disk drive that is directly connected to the internal bus 400 without going through the external connection port 410.
  • a network-connected hard disk drive (not shown) is a hard disk drive connected via a network N such as the Internet.
  • the network storage (not shown) is a file server (not shown) connected via a network N such as the Internet.
  • the ROM 406 can hold the OS 406a.
  • the tuner circuit 407 can receive a plurality of radio waves of a broadcast program broadcast by the broadcast station 3 as in the case of the first transmission terminal 20 or the second transmission terminal 30.
  • the communication circuit 408 can communicate with the first transmission terminal 20 and the second transmission terminal 30 via the network N (FIG. 1).
  • the communication circuit 408 can perform communication by TCP / IP, for example, as in the case of the first transmission terminal 20 or the second transmission terminal 30.
  • the real-time content receiving unit 21, the image display unit 22, the image superimposing unit 24, the receiving side capture image acquisition unit 25, the transmission information receiving unit 26, and the scene identification information receiving unit 27 included in the receiving terminal 2 illustrated in FIG. 1 are realized by the CPU 402 executing the receiving terminal control program 405a (FIG. 4).
  • the real-time content received by the real-time content receiving unit 21 shown in FIG. 1 or the real-time content displayed by the image display unit 22 shown in FIG. 1 is held, for example, in the ring buffer 403a (FIG. 4) of the externally connected hard disk drive 411.
  • the captured image acquired by the reception-side captured image acquisition unit 25 illustrated in FIG. 1 is held in the captured image data 405b (FIG. 4) of the externally connected hard disk drive 411 as an example.
  • the transmission information received from the second transmission terminal by the transmission information receiving unit 26 shown in FIG. 1 is held in the transmission information data 405c (FIG. 4) of the external connection type hard disk drive 411 as an example.
  • FIG. 5 is a diagram showing an example of a flowchart of processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits the transmission information of the user who views real-time content to the reception terminal 2 in the information transmission system of the present embodiment.
  • FIG. 6A is a diagram illustrating an example when real-time content is displayed on the display 201 of the first transmission terminal 20.
  • FIG. 6B is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 6C is a diagram illustrating an example when a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
  • 7A and 7B are diagrams illustrating an example of the captured image data 205b held in the first transmission terminal 20.
  • 8A and 8B are diagrams illustrating an example of the scene identification information data 205c held in the first transmission terminal 20.
  • 9A and 9B are diagrams illustrating an example of the captured image data 405b held in the receiving terminal 2.
  • FIG. 10A is a diagram illustrating an example in a case where transmission information related to a captured image displayed on the touch panel display 301 of the second transmission terminal 30 is input.
  • FIG. 10B is a diagram illustrating an example of transmission information input by the user on the touch panel display 301 of the second transmission terminal 30.
  • 11A and 11B are diagrams illustrating an example of the transmission information data 305c held in the second transmission terminal 30.
  • FIG. 12A is a diagram illustrating an example of transmission information received by the receiving terminal 2 from the second transmitting terminal 30.
  • FIG. 12B is a diagram illustrating an example when a capture image is displayed on the display 401 of the receiving terminal 2.
  • FIG. 12C is a diagram illustrating an example of a superimposed image generated by superimposing a capture image and transmission information on the reception terminal 2.
  • FIG. 13A is a diagram illustrating an example when a superimposed image is displayed on real-time content on the display 401 of the receiving terminal 2.
  • FIG. 13B is a diagram illustrating an example when a superimposed image is displayed on a tablet terminal included in the reception terminal 2.
  • the CPU 202 of the first transmission terminal 20, which is a television receiver, receives a broadcast program broadcast by the broadcast station 3 as real-time content and displays the received real-time content on the display 201. Thereby, the user of the first transmission terminal 20 can view the real-time content (see FIG. 6A, for example).
  • the CPU 202 of the first transmission terminal 20 records the real-time content displayed on the display 201 in the ring buffer 203a (FIG. 2) (step S501).
  • the ring buffer 203a can hold an image of content displayed in the past or an image of content received in the past for a predetermined time.
  • in this example, the ring buffer 203a holds one frame of the content video every predetermined interval (for example, every second) for a predetermined time, but it may instead hold all frames of the content video (for example, 30 frames per second) for the predetermined time.
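  • A minimal Python sketch of a ring buffer with this behavior (one frame per second kept for a fixed window); the class name, capacity, and lookup helper are assumptions made for illustration, not part of the disclosure.

    import collections

    class FrameRingBuffer:
        """Keeps roughly the last `window_seconds` frames, one per second, as
        (reproduction_time, frame_bytes) pairs; the capacity and the lookup helper
        are assumptions made for this sketch."""

        def __init__(self, window_seconds: int = 60) -> None:
            self._frames = collections.deque(maxlen=window_seconds)

        def push(self, reproduction_time: str, frame_bytes: bytes) -> None:
            # Called once per second with the currently displayed frame; the oldest
            # entry is dropped automatically once the deque is full.
            self._frames.append((reproduction_time, frame_bytes))

        def frames_around(self, reproduction_time: str, span: int = 2):
            """Return the frames within `span` entries of the requested reproduction
            time (used when a capture instruction arrives)."""
            times = [t for t, _ in self._frames]
            if reproduction_time not in times:
                return []
            i = times.index(reproduction_time)
            lo, hi = max(0, i - span), min(len(self._frames), i + span + 1)
            return list(self._frames)[lo:hi]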
  • the user can use the second transmission terminal 30 that is a tablet type terminal while viewing real-time content on the first transmission terminal 20 that is a television receiver.
  • the second transmission terminal 30, which is a tablet-type terminal, is connected in advance to the first transmission terminal 20, which is a television receiver, so that the two can communicate.
  • the CPU 302 of the second transmission terminal 30 determines whether or not the capture instruction button has been pressed by the user (step S511). For example, when the CPU 302 detects that the capture instruction button 311 (FIG. 6B) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, stylus, or the like, it determines that the capture instruction button has been pressed.
  • if it is determined that the capture instruction button 311 has been pressed (Yes in step S511), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20 (step S512).
  • upon receiving the capture instruction data, the CPU 202 of the first transmission terminal 20 acquires a plurality of captured images by capturing a plurality of frames including the specific scene of the real-time content currently displayed on the display 201, and records the acquired captured images in the captured image data 205b (step S502).
  • the CPU 202 refers to the past frames recorded in the ring buffer 203a and, as shown in FIG. 7A, records records 701 to 705 indicating five captured images whose reproduction times 73 differ by one second.
  • the CPU 202 records the actual data of the captured image 74 in the captured image data 205b in association with each data of the broadcast station ID 71, the program ID 72, and the reproduction time 73.
  • the record 703 in FIG. 7A includes data such as a broadcasting station ID 71 “B001”, a program ID 72 “P101”, a reproduction time 73 “18:15:03”, and a captured image 74 “(S_Cap003.bmp)”.
  • the actual data of the captured image 74 is expressed using a file name indicating an image file, such as “(S_Cap001.bmp)”.
  • the reproduction time 73 can be, for example, a reproduction time included in the header information of a PES (Packetized Elementary Stream) packet.
  • the CPU 202 records, for example, captured image data for frames of 2 seconds before and after the record 703 as records 701, 702, 704, and 705.
  • the time lag between the time when the user presses the capture instruction button 311 of the second transmission terminal 30 and the time when the content is actually captured at the first transmission terminal 20 can be eliminated.
  • since the user can select an arbitrary captured image from the plurality of captured images, even if the image that the user tried to capture differs from the actually captured image, the user can select the intended image again.
  • the CPU 202 issues unique ID numbers (identification numbers) corresponding to the respective records (records 701 to 705) of the captured image data 205b shown in FIG. 7A.
  • ID numbers 75 “ID001” to “ID005” can be added to each record.
  • the receiving terminal 2 can identify the captured image based on the ID number.
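  • The following Python sketch illustrates how the transmission-side records of FIG. 7A and FIG. 7B could be represented and generated around the instructed frame; the field names follow the figure labels (71 to 75), while the class and function names are hypothetical.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import List

    @dataclass
    class CaptureRecord:
        # One row of the captured image data 205b (FIG. 7A / FIG. 7B).
        broadcast_station_id: str   # 71, e.g. "B001"
        program_id: str             # 72, e.g. "P101"
        reproduction_time: str      # 73, e.g. "18:15:03"
        captured_image_file: str    # 74, e.g. "S_Cap003.bmp"
        id_number: str              # 75, e.g. "ID003"

    def build_capture_records(station: str, program: str, center_time: str,
                              span_seconds: int = 2) -> List[CaptureRecord]:
        """Build one record per second for the frames from `span_seconds` before to
        `span_seconds` after the frame displayed when the capture instruction arrived."""
        center = datetime.strptime(center_time, "%H:%M:%S")
        records = []
        for offset in range(-span_seconds, span_seconds + 1):
            t = (center + timedelta(seconds=offset)).strftime("%H:%M:%S")
            index = offset + span_seconds + 1  # 1..5 for the default span
            records.append(CaptureRecord(
                broadcast_station_id=station,
                program_id=program,
                reproduction_time=t,
                captured_image_file=f"S_Cap{index:03d}.bmp",
                id_number=f"ID{index:03d}",
            ))
        return records

    # Example reproducing the five records around "18:15:03" of FIG. 7A / FIG. 7B:
    # build_capture_records("B001", "P101", "18:15:03")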
  • the scene identification information is information for identifying a specific scene of the captured image.
  • a combination of the broadcast station ID 71, program ID 72, and playback time 73 data recorded in the captured image data 205b shown in FIG. 7A corresponds to the scene identification information.
  • the CPU 202 creates the scene identification information data 205c (records 801 to 805) shown in FIG. 8A based on the broadcast station ID 71, program ID 72, and playback time 73 data shown in FIG. 7A, and the created scene identification information data 205c is transmitted to the receiving terminal 2.
  • as shown in FIG. 8B, the CPU 202 may create the scene identification information by adding the data of ID numbers 84 “ID001” to “ID005” to the records (records 801 to 805) of the scene identification information data 205c.
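  • Continuing the sketch above, the scene identification records of FIG. 8A and FIG. 8B can be derived by dropping the image data and keeping only the identifying fields; the JSON wire format shown here is an assumption, since the disclosure does not prescribe one.

    import json
    from dataclasses import dataclass, asdict
    from typing import List

    @dataclass
    class SceneIdentificationRecord:
        # One row of the scene identification information data 205c (FIG. 8A / FIG. 8B):
        # the broadcast station ID / program ID / reproduction time combination that
        # identifies the scene, plus the assigned ID number 84.
        broadcast_station_id: str
        program_id: str
        reproduction_time: str
        id_number: str

    def to_scene_identification(capture_records) -> List[SceneIdentificationRecord]:
        """Keep only the identifying fields; the captured image itself is not sent,
        which keeps the network load low."""
        return [SceneIdentificationRecord(r.broadcast_station_id, r.program_id,
                                          r.reproduction_time, r.id_number)
                for r in capture_records]

    def serialize_for_transmission(scene_records) -> bytes:
        # JSON is used here purely for illustration; no wire format is prescribed.
        return json.dumps([asdict(r) for r in scene_records]).encode("utf-8")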
  • the CPU 402 of the receiving terminal 2 which is a television receiver, receives a broadcast program broadcast by the broadcast station 3 as real-time content and displays the received real-time content on the display 401. Thereby, the user of the receiving terminal 2 can view real-time content.
  • the CPU 402 of the receiving terminal 2 records the real-time content displayed on the display 401 in the ring buffer 403a (FIG. 4) as in the case of the first transmitting terminal 20 (step S531).
  • the ring buffer 403a can hold a frame of a content video every predetermined time (for example, every second) for a predetermined time.
  • the CPU 402 of the receiving terminal 2 receives scene identification information for identifying a specific scene of the captured image from the first transmitting terminal 20 (step S532). For example, the CPU 402 receives all the records 801 to 805 of the scene identification information data 205c shown in FIG. 8A or 8B.
  • the CPU 402 of the receiving terminal 2 acquires a plurality of captured images by capturing a plurality of frames including the specific scene indicated by the scene identification information, and records the acquired captured images in the captured image data 405b (step S533).
  • the CPU 402 refers to the frames displayed in the past that are recorded in the ring buffer 403a (FIG. 4), and, as shown in FIG. 9A, records records 901 to 905 indicating five captured images whose reproduction times 93 differ by 1 second.
  • the CPU 402 records the actual data of the captured image 94 in the captured image data 405b in association with each data of the broadcast station ID 91, the program ID 92, and the reproduction time 93.
  • the record 903 in FIG. 9A includes data such as a broadcasting station ID 91 “B001”, a program ID 92 “P101”, a reproduction time 93 “18:15:03”, and a captured image 94 “(R_Cap003.bmp)”.
  • the actual data of the captured image 94 is expressed using a file name indicating an image file, for example, “(R_Cap001.bmp)”.
  • the CPU 402 of the receiving terminal 2 records records 901 to 905 indicating five captured images having different reproduction times 93 by 1 second.
  • the CPU 402 records, for example, as record 903, the captured image data of the frame that was being displayed when the first transmission terminal 20 received the capture instruction data from the second transmission terminal 30.
  • the CPU 402 records the captured image data for the frames of 2 seconds before and after the record 903 as records 901, 902, 904, and 905.
  • when ID numbers have been assigned, each piece of data indicating an assigned ID number (for example, the data of ID numbers 84 “ID001” to “ID005” shown in FIG. 8B) is recorded as ID numbers 95 “ID001” to “ID005” in association with the respective records (records 901 to 905).
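  • the receiving-side capture in step S533 could look like the following illustrative Python sketch (function and field names are assumptions, not the embodiment's actual implementation):

```python
def capture_on_receiver(ring, scene_info_records, save_frame):
    """Receiving terminal 2: look up each received reproduction time in the local
    ring buffer 403a and store the matching frame (records 901 to 905), keyed by
    the received ID number so the image can later be found again."""
    frames = dict(ring.frames)                      # reproduction_time -> frame_bytes
    captured_image_data = {}
    for info in scene_info_records:
        t = info["reproduction_time"]
        if t in frames:                             # frame still held in the buffer
            captured_image_data[info["id_number"]] = {
                "broadcast_station_id": info["broadcast_station_id"],
                "program_id": info["program_id"],
                "reproduction_time": t,
                "image_file": save_frame(frames[t]),   # e.g. "R_Cap003.bmp"
            }
    return captured_image_data                      # mirrors captured image data 405b
```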
  • in step S504 of FIG. 5, the CPU 202 of the first transmission terminal 20 transmits the plurality of captured images acquired in step S502 to the second transmission terminal 30.
  • the CPU 202 transmits captured image data based on the records 701 to 705 illustrated in FIG. 7A or 7B to the second transmission terminal 30.
  • the CPU 302 of the second transmission terminal 30 records the captured image based on the data acquired from the first transmission terminal 20 in the captured image data 305b of the flash memory 305 and displays it on the touch panel display 301 (step S513).
  • An example of the captured image data 305b of the second transmission terminal 30 is the same as that shown in FIG. 7A or FIG. 7B in the first transmission terminal 20.
  • the CPU 302 displays one of the captured images based on the data acquired from the first transmission terminal 20 in the captured image display area 300 including the center of the screen.
  • the capture image displayed by default in the capture image display area 300 can be a capture image corresponding to the record 703 in the capture image data 205b illustrated in FIG. 7A or 7B.
  • the CPU 302 displays five thumbnail images 321 to 325 at the lower part of the captured image display area 300.
  • the thumbnail images 321 to 325 are obtained by reducing and displaying captured images corresponding to the records 701 to 705 in the captured image data 205b shown in FIG. 7A or 7B.
  • the CPU 302 can enlarge and display the captured image corresponding to the selected thumbnail image in the captured image display area 300.
  • even if there is a time lag between the time when the user of the transmission terminal 1 presses the capture instruction button 311 of the second transmission terminal 30 and the time when the content is actually captured at the first transmission terminal 20, the user of the transmission terminal 1 can select a captured image at the desired timing.
  • because the user can select an arbitrary captured image from the five thumbnail images 321 to 325, even if the image that the user intended to capture differs from the image actually captured by the first transmission terminal 20, the user can reselect the intended image.
  • the information transmission system of the present embodiment can further improve the user's operability by increasing the number of displayed thumbnail images and allowing the user to select a desired captured image by a slide operation or the like.
  • the CPU 302 of the second transmission terminal 30 receives an input of transmission information to be superimposed on the captured image displayed in the captured image display area 300 from the user (step S514).
  • the user can input the transmission information by an operation of directly writing the transmission information on the captured image displayed in the captured image display area 300 with a fingertip, stylus, or the like, or by an operation of attaching a selected image to the captured image.
  • the CPU 302 of the second transmission terminal 30 displays a plurality of input operation icons in the vicinity of the capture image display area 300 on the touch panel display 301, for example.
  • the CPU 302 displays a plurality of image selection icons 326 adjacent to the right side of the captured image display area 300.
  • the user can input an image (for example, a hat) of the image selection icon 326 as transmission information by dragging and dropping any one of the image selection icons 326 to the captured image display area 300.
  • the identification code of the object indicating the icon can be included in the transmission information.
  • information such as a display position, a display size, a display color, or a visual effect of an object indicating an icon can be included in the transmission information.
  • the receiving terminal 2 acquires an icon image based on the identification code of the object indicating the icon. For this reason, it is not necessary to transmit / receive icon image data.
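  • one conceivable (purely illustrative) encoding of such icon-based transmission information, in which only the identification code and drawing attributes are carried and the icon bitmap itself is never transmitted, is sketched below; all field names and values are assumptions:

```python
# Transmission information for a stamped icon: the receiving terminal resolves the
# identification code to a locally held icon image and draws it with these attributes.
icon_transmission_info = {
    "object_code": "HAT_01",            # identification code of the icon object (assumed value)
    "position": {"x": 320, "y": 96},    # display position on the captured image
    "scale": 1.5,                       # display size
    "color": "#ff0000",                 # display color
    "effect": "blink",                  # optional visual effect
}
```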
  • the CPU 302 displays tool selection icons 331 to 339 adjacent to the right side of the image selection icon 326, for example.
  • the user can perform an input operation corresponding to the tapped tool selection icon by tapping any of the tool selection icons 331 to 339.
  • a user who has selected a tool selection icon 331 indicating a pencil tool can perform an input operation using a free curve.
  • the user who has selected the tool selection icon 332 indicating the line color / line type change tool can perform an operation of changing the color and type of the free curve.
  • a user who has selected a tool selection icon 333 indicating an eraser tool can perform an operation of deleting input characters, graphics, and the like.
  • a user who has selected a tool selection icon 334 indicating a text input tool can perform an operation of inputting text characters.
  • the user can input text characters using a software keyboard displayed on the touch panel display 301.
  • since a text character is displayed at a position based on its position information, the transmission information can be conveyed correctly.
  • a user who has selected the tool selection icon 335 indicating a quadrangle input tool can perform an operation of inputting a quadrangle.
  • the user who has selected the tool selection icon 336 indicating the range selection tool can perform an operation of selecting an arbitrary range in the captured image.
  • the user who has selected the tool selection icon 337 indicating the magnifying glass tool can perform an operation of enlarging or reducing the captured image.
  • the user who has selected the tool selection icon 338 indicating an ellipse tool can perform an operation of inputting an ellipse.
  • the user who has selected the tool selection icon 339 indicating the color selection tool can perform an operation of adding a color to an arbitrary range in the captured image.
  • the user can input arbitrary transmission information on the captured image display area 300 as shown in FIG. 10A.
  • the transmission information is displayed superimposed on the captured image in the captured image display area 300.
  • the user can input a surrounding line 101 surrounding the dumbbell 330, which is a display object in the captured image, as transmission information. Further, for example, as shown in FIG. 10A, the user can input a free curve indicating “I want this!” As transmission information on the top of the dumbbell 330 that is a display object in the captured image.
  • in step S515 of FIG. 5, the CPU 302 determines whether or not the input of the transmission information by the user has been completed. For example, when the CPU 302 detects that the input completion button 312 (FIG. 10A) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed by the user's fingertip or stylus, the CPU 302 judges that the input of the transmission information is complete.
  • if the CPU 302 determines that the input of the transmission information has been completed (step S515, Yes), the CPU 302 associates the scene identification information of the captured image on which the transmission information is superimposed with the input transmission information data, and records them in the transmission information data 305c of the flash memory 305 (step S516).
  • the CPU 302 generates scene identification information by combining the data of the broadcast station ID 71, the program ID 72, and the reproduction time 73 of the record 703 (FIG. 7A) held in the captured image data 305b, and associates the generated scene identification information with the transmission information. Specifically, as shown in FIG. 11A, the CPU 302 records the data of the broadcast station ID 111 “B001”, the program ID 112 “P101”, and the reproduction time 113 “18:15:03” in the transmission information data 305c in association with the actual data “(Com001.bmp)” of the transmission information 114. Note that the captured image data itself on which the transmission information is to be superimposed is not recorded in the transmission information data 305c.
  • when an ID number is added to each record (records 701 to 705) of the captured image data 305b, the CPU 302 can generate scene identification information including the ID number and associate the generated scene identification information with the transmission information. Specifically, as illustrated in FIG. 11B, instead of the broadcast station ID 111 “B001”, the program ID 112 “P101”, and the reproduction time 113 “18:15:03” illustrated in FIG. 11A, the CPU 302 records the data of ID number 115 “ID003” in the transmission information data 305c in association with the actual data “(Com001.bmp)” of the transmission information 114.
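  • as a rough illustration of the two variants of record 1100 (FIG. 11A and FIG. 11B), the data could be modeled as below; note that the captured image itself appears in neither variant (field names are assumptions):

```python
# FIG. 11A variant: keyed by broadcast station ID, program ID and reproduction time.
record_1100_by_time = {
    "broadcast_station_id": "B001",
    "program_id": "P101",
    "reproduction_time": "18:15:03",
    "transmission_info": "Com001.bmp",   # still image of the handwritten annotation only
}

# FIG. 11B variant: the three fields above are replaced by the issued ID number.
record_1100_by_id = {
    "id_number": "ID003",
    "transmission_info": "Com001.bmp",
}
```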
  • the transmission information is still image data, and can be, for example, raster image bitmap image data.
  • the still image data of the transmission information can be in a vector format in which the line thickness, color, etc. can be changed for each line.
  • the CPU 302 of the second transmission terminal 30 transmits the transmission information in which the scene identification information is associated in step S516 to the reception terminal 2 (step S517).
  • the CPU 302 transmits the data of the record 1100 of the transmission information data 305c illustrated in FIG. 11A or 11B to the receiving terminal 2. Therefore, the captured image data itself on which the transmission information is superimposed is not transmitted to the receiving terminal 2.
  • in step S534 of FIG. 5, the CPU 402 of the receiving terminal 2 receives the transmission information from the second transmission terminal 30 and records it.
  • the CPU 402 receives the data of the record 1100 of the transmission information data 305c shown in FIG. 11A or FIG. 11B from the second transmission terminal and records it in the transmission information data 405c of the external connection type hard disk drive 411.
  • the transmission information data 405c of the receiving terminal 2 is the same as that shown in FIG. 11A or FIG. 11B in the second transmitting terminal 30.
  • the CPU 402 of the receiving terminal 2 generates a superimposed image obtained by superimposing the transmission information received in step S534 on the captured image recorded in the captured image data 405b in step S533 (step S535).
  • the CPU 402 acquires, from the transmission information data 405c, the still image data of the transmission information (for example, FIG. 12A) indicating the surrounding line 101 and the free curve 102 indicating “I want this!”.
  • the actual data “(Com001.bmp)” of the transmission information 114 held in the transmission information data 405c is associated with the broadcast station ID 111 “B001”, the program ID 112 “P101”, and the reproduction time 113 “18:15:03”. Therefore, the CPU 402 acquires, from the captured image data 405b, the actual data of the captured image associated with the same data values as the broadcast station ID 111 “B001”, the program ID 112 “P101”, and the reproduction time 113 “18:15:03”. For example, from the record 903 of FIG. 9A, the CPU 402 acquires the actual data “(R_Cap003.bmp)” of the captured image 94 associated with the broadcasting station ID 91 “B001”, the program ID 92 “P101”, and the reproduction time 93 “18:15:03”.
  • alternatively, the CPU 402 of the receiving terminal 2 can acquire the actual data of the captured image from the captured image data 405b based on the ID number included in the transmission information. Specifically, as illustrated in FIG. 11B, when the transmission information data 405c including the data of ID number 115 “ID003” is received, the CPU 402 acquires the actual captured image data “(R_Cap003.bmp)” associated with the same data value as ID number 115 “ID003” from the record 903 of the captured image data 405b shown in FIG. 9B. In this case, the receiving terminal 2 can specify the captured image based on the ID number.
  • the CPU 402 of the receiving terminal 2 can generate the superimposed image 120 illustrated in FIG. 12C by superimposing the transmission information (Com001.bmp) illustrated in FIG. 12A on the captured image (R_Cap003.bmp) illustrated in FIG. 12B.
  • the superimposed image 120 illustrated in FIG. 12C matches the image displayed in the captured image display area 300 illustrated in FIG. 10A.
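  • the lookup of the captured image and the superimposition in step S535 could be sketched as follows; this is an illustration only, it uses the Pillow imaging library merely as one possible way to composite, and it assumes that the background of the annotation bitmap is plain white and is to be treated as transparent (an assumption not stated in the embodiment):

```python
from PIL import Image   # Pillow, used here only as one possible compositing tool

def find_captured_image(captured_image_data, transmission_record):
    """Pick the captured image matching the received scene identification
    (FIG. 11B variant: by ID number; FIG. 11A variant: by station/program/time)."""
    if "id_number" in transmission_record:
        return captured_image_data[transmission_record["id_number"]]["image_file"]
    for rec in captured_image_data.values():
        if (rec["broadcast_station_id"], rec["program_id"], rec["reproduction_time"]) == (
                transmission_record["broadcast_station_id"],
                transmission_record["program_id"],
                transmission_record["reproduction_time"]):
            return rec["image_file"]

def generate_superimposed_image(captured_path, transmission_path, out_path,
                                background=(255, 255, 255)):
    """Draw every non-background pixel of the annotation over the captured frame."""
    base = Image.open(captured_path).convert("RGB")             # e.g. "R_Cap003.bmp"
    overlay = Image.open(transmission_path).convert("RGB").resize(base.size)
    base_px, over_px = base.load(), overlay.load()
    for y in range(base.size[1]):
        for x in range(base.size[0]):
            if over_px[x, y] != background:                     # keep annotated pixels only
                base_px[x, y] = over_px[x, y]
    base.save(out_path)                                         # superimposed image 120
```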
  • the CPU 402 of the receiving terminal 2 displays the generated superimposed image (step S536).
  • the CPU 402 displays the superimposed image 120 illustrated in FIG. 12C on the display 401 of the receiving terminal 2.
  • the CPU 402 causes the superimposed image 120 to be superimposed and displayed on the content screen 130 that the user of the receiving terminal 2 is viewing.
  • when the receiving terminal 2 includes the tablet terminal 40, the superimposed image 120 can be displayed on the touch panel display of the tablet terminal 40, as shown in FIG. 13B.
  • when the tablet terminal 40 of the receiving terminal 2 has the same configuration as that of the second transmission terminal 30 (FIG. 3), the user of the receiving terminal 2 can also transmit transmission information to the user of the transmission terminal 1.
  • the user of the transmission terminal 1 and the user of the reception terminal 2 can exchange the transmission information about the captured image of the real-time content being viewed, thereby enabling two-way communication.
  • the CPU 202 of the first transmission terminal 20 returns to step S501 after executing step S504, and repeats the above processing (steps S501 to S504).
  • the CPU 302 of the second transmission terminal 30 returns to step S511 and repeats the above processing (steps S511 to S517).
  • the CPU 402 of the receiving terminal 2 returns to step S531, and repeats the above processing (steps S531 to S536).
  • the transmission terminal 1 includes two devices, the first transmission terminal 20 and the second transmission terminal 30, but the transmission terminal 1 can be configured with one device.
  • FIG. 14 is a diagram illustrating an example of a system configuration of an information transmission system when the transmission terminal 1 is configured by a single device.
  • the transmission terminal 1 includes a real-time content reception unit 11, an image display unit 12, a transmission side buffer unit 13, a scene identification information transmission unit 14, a transmission side capture image acquisition unit 15, a transmission information transmission unit 16, A capture instruction receiving unit 17, a transmission information receiving unit 18, and a captured image display unit 19 are provided.
  • the transmission information reception unit 18 receives input of information using a pointing device connected to the transmission terminal 1, such as a pen tablet, a liquid crystal pen tablet, a touch pad, a touch panel, a mouse, an air mouse (a mouse capable of posture recognition in space), or a pointer (a device that can indicate a position on the display).
  • each function part in the transmission terminal 1 and the reception terminal 2 of this modification is the same as each function part in FIG.
  • the second transmission terminal 30 includes the transmission information transmission unit 16, but the first transmission terminal 20 may include the transmission information transmission unit 16.
  • FIG. 15 is a diagram illustrating an example of a system configuration of the information transmission system when the first transmission terminal 20 includes the transmission information transmission unit 16.
  • the first transmission terminal 20 includes a real-time content reception unit 11, an image display unit 12, a transmission side buffer unit 13, a scene identification information transmission unit 14, a transmission side capture image acquisition unit 15, and a transmission information transmission unit 16.
  • the second transmission terminal 30 includes a capture instruction receiving unit 17, a transmission information receiving unit 18, and a captured image display unit 19.
  • each function part in the 1st transmission terminal 20, the 2nd transmission terminal 30, and the receiving terminal 2 of this modification is the same as each function part of FIG.
  • the first transmission terminal 20 includes the scene identification information transmission unit 14, but the second transmission terminal 30 may include the scene identification information transmission unit 14.
  • FIG. 16 is a diagram illustrating an example of a system configuration of the information transmission system when the second transmission terminal 30 includes the scene identification information transmission unit 14.
  • the first transmission terminal 20 includes a real-time content reception unit 11, an image display unit 12, a transmission side buffer unit 13, and a transmission side capture image acquisition unit 15.
  • the second transmission terminal 30 includes a scene identification information transmission unit 14, a transmission information transmission unit 16, a capture instruction reception unit 17, a transmission information reception unit 18, and a captured image display unit 19.
  • each function part in the 1st transmission terminal 20, the 2nd transmission terminal 30, and the receiving terminal 2 of this modification is the same as each function part of FIG.
  • the information transmission system described above is configured such that each of the transmission terminal 1 and the reception terminal 2 receives real-time content from the broadcast station 3, but each of the transmission terminal 1 and the reception terminal 2 can also receive real-time content from a content distribution server that distributes the content via the Internet.
  • FIG. 17 is a diagram showing an example of the system configuration of the information transmission system in the case where the configuration is such that real-time content is received from the content distribution server.
  • the real-time content reception unit 11 of the first transmission terminal 20 and the real-time content reception unit 21 of the reception terminal 2 can receive real-time content from the content distribution server 4.
  • the real-time content is content that is distributed in a streaming format, for example, and can be viewed simultaneously by each user in a plurality of terminals.
  • each other function part in the 1st transmission terminal 20, the 2nd transmission terminal 30, and the receiving terminal 2 of this modification is the same as that of each function part of FIG.
  • in the above description, each of the transmission terminal 1 and the reception terminal 2 is configured to use the ring buffer (203a or 403a) to refer to frames displayed in the past and to record five captured images whose reproduction times differ by 1 second.
  • the transmission terminal 1 and the reception terminal 2 can be configured not to use a ring buffer.
  • scene identification information is created based on one captured image captured at the timing when a capture instruction is received from the user.
  • the above configuration is preferably employed when, for example, the time lags among the time when the user desires to capture, the time when the user presses the capture instruction button 311 of the second transmission terminal 30, and the time when the content is actually captured at the first transmission terminal 20 are small.
  • a superimposed image is generated based on one captured image captured at the timing when the scene identification information is received from the transmitting terminal 1.
  • the above configuration is preferably employed, for example, when the communication speed between the transmission terminal 1 and the reception terminal 2 is high.
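  • a minimal sketch of this buffer-less variant, assuming the terminal simply captures whatever frame is being displayed when the instruction (or the scene identification information) arrives, might look like this; the function and parameter names are hypothetical:

```python
def capture_without_buffer(current_frame, station_id, program_id, now, save_frame):
    """Variant without a ring buffer: only the currently displayed frame is
    captured, and its scene identification information is built from it."""
    image_file = save_frame(current_frame)          # e.g. writes a single .bmp file
    scene_identification = {
        "broadcast_station_id": station_id,
        "program_id": program_id,
        "reproduction_time": now,                   # reproduction time of that one frame
    }
    return image_file, scene_identification
```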
  • FIG. 18A is a diagram illustrating an example of a configuration of the receiving terminal when the receiving terminal 2 is configured by two devices. As illustrated in FIG. 18A, the receiving terminal 2 includes a first receiving terminal 40 and a second receiving terminal 50. Note that the transmitting terminal may have any of the configurations described above.
  • the first receiving terminal 40 is a television receiver having the same hardware configuration as in FIG.
  • the first receiving terminal 40 may be a device device (for example, a disk recording / playback device) that incorporates a tuner capable of receiving a television broadcast.
  • the first receiving terminal 40 may be a device device (for example, a set top box or a disk recording / playback device) that incorporates a tuner that can receive a television broadcast via a network such as a cable television.
  • the first receiving terminal 40 can communicate with the transmitting terminal 1 and the second receiving terminal 50.
  • the second receiving terminal 50 is a tablet-type terminal having the same hardware configuration as in FIG.
  • the second receiving terminal 50 can be a smartphone equipped with a touch panel display or the like.
  • the second receiving terminal 50 can communicate with the transmitting terminal 1 and the first receiving terminal 40.
  • the first receiving terminal 40 includes a real-time content receiving unit 21, an image display unit 22, a receiving side buffer unit 23, a receiving side captured image acquisition unit 25, and a scene identification information receiving unit 27.
  • the second receiving terminal 50 includes an image superimposing unit 24, an image display unit (superimposed image display unit) 28, and a transmission information receiving unit 26.
  • the image display unit 28 in the second receiving terminal 50 can display the superimposed image generated by the image superimposing unit 24.
  • the image display unit 28 can display an image (for example, a moving image) of the content received by the real-time content receiving unit 21 and allow the user of the receiving terminal 2 to view the content.
  • each other function part in the transmission terminal 1 of this modification and the receiving terminal 2 is the same as that of each function part of FIG.
  • in the above description, the second receiving terminal 50 is configured to include the transmission information receiving unit 26, but the first receiving terminal 40 can also be configured to include the transmission information receiving unit 26.
  • FIG. 18B is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26.
  • the first receiving terminal 40 includes a real-time content receiving unit 21, an image display unit 22, a receiving-side buffer unit 23, a receiving-side captured image acquisition unit 25, a scene identification information receiving unit 27, and a transmission information receiving unit 26.
  • the second receiving terminal 50 includes an image superimposing unit 24 and an image display unit 28.
  • each function part in the transmission terminal 1, the 1st receiving terminal 40, and the 2nd receiving terminal 50 of this modification is the same as that of each function part, such as FIG. 1 and FIG. 18A.
  • in the above description, the first receiving terminal 40 includes the scene identification information receiving unit 27, but the second receiving terminal 50 can also be configured to include the scene identification information receiving unit 27.
  • FIG. 18C is a diagram illustrating an example of a configuration of the receiving terminal when the second receiving terminal 50 includes the scene identification information receiving unit 27.
  • the first receiving terminal 40 includes a real-time content receiving unit 21, an image display unit 22, a receiving side buffer unit 23, and a receiving side captured image acquisition unit 25.
  • the second receiving terminal 50 includes a scene identification information receiving unit 27, an image superimposing unit 24, an image display unit 28, and a transmission information receiving unit 26.
  • each function part in the transmission terminal 1, the 1st receiving terminal 40, and the 2nd receiving terminal 50 of this modification is the same as that of each function part, such as FIG. 1 and FIG. 18A.
  • the first receiving terminal 40 includes the scene identification information receiving unit 27 and the second receiving terminal 50 includes the transmission information receiving unit 26.
  • the first receiving terminal 40 may include the transmission information receiving unit 26, and the second receiving terminal 50 may include the scene identification information receiving unit 27.
  • FIG. 18D is a diagram illustrating an example of a configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26 and the second receiving terminal 50 includes the scene identification information receiving unit 27.
  • the first receiving terminal 40 includes a real-time content receiving unit 21, an image display unit 22, a receiving-side buffer unit 23, a receiving-side captured image acquisition unit 25, and a transmission information receiving unit 26.
  • the second receiving terminal 50 includes a scene identification information receiving unit 27, an image superimposing unit 24, and an image display unit 28.
  • each function part in the transmission terminal 1, the 1st receiving terminal 40, and the 2nd receiving terminal 50 of this modification is the same as that of each function part, such as FIG. 1 and FIG. 18A.
  • in the above description, the second receiving terminal 50 includes the image superimposing unit 24, but the first receiving terminal 40 can also be configured to include the image superimposing unit 24.
  • FIG. 18E is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 in the sixth modification (FIG. 18B) includes the image superimposing unit 24.
  • the first receiving terminal 40 includes a real-time content receiving unit 21, an image display unit 22, a receiving-side buffer unit 23, a receiving-side captured image acquisition unit 25, a scene identification information receiving unit 27, a transmission information receiving unit 26, and an image superimposing unit 24.
  • the second receiving terminal 50 includes an image display unit 28. In the configurations of FIGS. 18A, 18C, and 18D as well, the first receiving terminal 40 may include the image superimposing unit 24.
  • each function part in the transmission terminal 1, the 1st receiving terminal 40, and the 2nd receiving terminal 50 of this modification is the same as that of each function part, such as FIG. 1 and FIG. 18A.
  • as described above, in the information transmission system of the present embodiment, the captured image on which the transmission information is superimposed can be displayed on the receiving terminal 2 without the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmitting the captured image itself. Further, since the transmission terminal 1 does not need to transmit the captured image to the reception terminal 2, the transmission data amount is suppressed and the network load can be reduced.
  • the transmission terminal 1 transmits the scene identification information of the captured image to the reception terminal 2 (step S503), and the reception terminal 2 captures the specific scene based on the received scene identification information and records the captured image (step S533).
  • the user of the receiving terminal 2 does not need to record a broadcast program beforehand in the receiving terminal 2. Thereby, the trouble of the user of the receiving terminal 2 recording the content and holding the recorded content can be eliminated.
  • the transmission terminal 1 transmits the transmission information associated with the scene identification information to the reception terminal 2 (step S517), and the reception terminal 2 superimposes the received transmission information on the captured image based on the received scene identification information to generate a superimposed image (step S535). For this reason, the transmission information from the user of the transmission terminal 1 can be conveyed to the user of the reception terminal 2 accurately and in real time.
  • the real-time content receiving unit 11 and the image display unit 12 in the first transmission terminal 20 include the processing function of step S501 in FIG. 5 as an example.
  • the transmission side capture image acquisition unit 15 in the first transmission terminal 20 includes the processing function of step S502 of FIG. 5 as an example.
  • the scene identification information transmission unit 14 in the first transmission terminal 20 includes the processing function of step S503 in FIG. 5 as an example.
  • the capture instruction reception unit 17 in the second transmission terminal 30 includes the processing function of step S511 in FIG. 5 as an example.
  • the captured image display unit 19 in the second transmission terminal 30 includes the processing function of step S513 in FIG. 5 as an example.
  • the transmission information reception unit 18 in the second transmission terminal 30 includes the processing function of step S514 in FIG. 5 as an example.
  • the transmission information transmission unit 16 in the second transmission terminal 30 includes the processing function of step S517 in FIG. 5 as an example.
  • the real-time content receiving unit 21 and the image display unit 22 in the receiving terminal 2 include the processing function of step S531 in FIG. 5 as an example.
  • the scene identification information receiving unit 27 in the receiving terminal 2 includes the processing function of step S532 in FIG. 5 as an example.
  • the receiving side capture image acquisition unit 25 in the receiving terminal 2 includes the processing function of step S533 in FIG. 5 as an example.
  • the transmission information receiving unit 26 in the receiving terminal 2 includes the processing function of step S534 in FIG. 5 as an example.
  • the image superimposing unit 24 in the receiving terminal 2 includes the processing function of step S535 in FIG. 5 as an example.
  • each functional unit of the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 is implemented by, for example, a CPU executing a program.
  • in the first embodiment, the information transmission system is configured such that the transmission terminal 1 and the reception terminal 2 communicate with each other via the network N, but a management server that can be connected to each of the transmission terminal 1 and the reception terminal 2 via the network N can also be provided.
  • the same reference numerals are given to functional units or elements common to those in the first embodiment, and a duplicate description thereof is omitted.
  • FIG. 19 is a diagram illustrating an example of a system configuration of an information transmission system according to the second embodiment.
  • the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30), the reception terminal 2a, the reception terminal 2b, the reception terminal 2c, and the management server 5 are connected via the network N so that they can communicate with each other.
  • the management server 5 is a computer device.
  • in FIG. 19, three receiving terminals, that is, the receiving terminal 2a, the receiving terminal 2b, and the receiving terminal 2c, are shown, but there may be one or more receiving terminals. In FIG. 19, only one transmission terminal 1 and one management server 5 are shown, but there may be a plurality of each.
  • the management server 5 includes a scene identification information distribution unit 51 and a transmission information distribution unit 52.
  • the scene identification information distribution unit 51 in the management server 5 distributes the scene identification information received from the transmission terminal 1 to each of the reception terminal 2a, the reception terminal 2b, and the reception terminal 2c.
  • the transmission information distribution unit 52 in the management server 5 distributes the transmission information received from the transmission terminal 1 to each of the reception terminal 2a, the reception terminal 2b, and the reception terminal 2c.
  • the management server 5 distributes the scene identification information and the transmission information received from the transmission terminal 1 to each of the receiving terminal 2a, the receiving terminal 2b, and the receiving terminal 2c, so that the captured image and the transmission information based on the scene identification information can be shared among the transmission terminal 1, the receiving terminal 2a, the receiving terminal 2b, and the receiving terminal 2c.
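  • purely as an illustration of this fan-out, the distribution units 51 and 52 could be modeled as follows; the class name and the receive_* methods on the receiving terminals are assumed interfaces, not part of the embodiment:

```python
class ManagementServer:
    """Stores whatever arrives from the transmission terminal and forwards it to
    every registered receiving terminal (scene identification information
    distribution unit 51 and transmission information distribution unit 52)."""
    def __init__(self, receiving_terminals):
        self.receiving_terminals = receiving_terminals   # e.g. terminals 2a, 2b, 2c
        self.scene_identification_data = []              # corresponds to data 605b
        self.transmission_information_data = []          # corresponds to data 605c

    def distribute_scene_identification(self, scene_info):
        self.scene_identification_data.append(scene_info)        # step S521
        for terminal in self.receiving_terminals:                 # step S522
            terminal.receive_scene_identification(scene_info)

    def distribute_transmission_information(self, transmission_info):
        self.transmission_information_data.append(transmission_info)   # step S523
        for terminal in self.receiving_terminals:                       # step S524
            terminal.receive_transmission_information(transmission_info)
```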
  • when the management server 5 is an SNS (Social Network Service) server or a chat server, a plurality of users can communicate with one another via the transmission information transmitted from the transmission terminal.
  • FIG. 20 is a diagram illustrating an example of a hardware configuration in which the management server 5 illustrated in FIG. 19 is realized using a CPU.
  • the management server 5 includes a display 601, a CPU 602, a RAM 603, a keyboard / mouse 604, a hard disk drive 605, and a communication circuit 606.
  • the display 601 can display images of various data output in accordance with instructions from the CPU 602.
  • the CPU 602 can execute processing based on the OS (not shown) and the management server control program 605a.
  • the RAM 603 can provide an address space to the CPU 602.
  • the keyboard / mouse 604 can accept the operation of the management server 5 from the user.
  • the hard disk drive 605 can hold a management server control program 605a, scene identification information data 605b, transmission information data 605c, and the like.
  • the management server control program 605a can be, for example, a communication program that can provide SNS and chat service.
  • the communication circuit 606 can communicate with the transmission terminal 1, the reception terminal 2a, the reception terminal 2b, the reception terminal 2c, and the like via the network N (FIG. 19).
  • the scene identification information distribution unit 51 and the transmission information distribution unit 52 included in the management server 5 illustrated in FIG. 19 are realized by executing the management server control program 605a on the CPU 602.
  • the scene identification information received by the scene identification information distribution unit 51 from the transmission terminal 1 is held in the scene identification information data 605b of the hard disk drive 605 as an example.
  • the transmission information received by the transmission information distribution unit 52 from the transmission terminal 1 is held in the transmission information data 605c of the hard disk drive 605 as an example.
  • FIG. 21A and FIG. 21B are diagrams showing an example of a flowchart of the processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) in the information transmission system of the present embodiment transmits, via the management server 5, the transmission information of a user viewing real-time content to the receiving terminal 2a, the receiving terminal 2b, and the receiving terminal 2c (hereinafter simply referred to as “the receiving terminal 2”).
  • the processes shown in FIGS. 21A and 21B are the same as the processing shown in FIG. 5, except that the management server 5 is interposed between the transmission terminals (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2.
  • in the following, the differences from FIG. 5 will be described.
  • the CPU 202 of the first transmission terminal 20 transmits the scene identification information of the captured image to the management server 5 (step S503).
  • the CPU 602 of the management server 5 receives the scene identification information from the first transmission terminal 20 and records it in the scene identification information data 605b of the hard disk drive 605 (step S521).
  • An example of the scene identification information data 605b is the same as that in FIG. 8A or FIG. 8B.
  • the CPU 602 of the management server 5 distributes the scene identification information received from the first transmission terminal 20 to the reception terminal 2 (step S522).
  • the CPU 402 of the receiving terminal 2 receives scene identification information for identifying a specific scene of the captured image from the management server 5 (step S532a).
  • step S517a the CPU 302 of the second transmission terminal 30 transmits the transmission information associated with the scene identification information to the management server 5 in step S516.
  • the CPU 602 of the management server 5 receives the transmission information from the second transmission terminal 30 and records it in the transmission information data 605c of the hard disk drive 605 (step S523).
  • An example of the transmission information data 605c is the same as that in FIG. 11A or FIG. 11B.
  • the CPU 602 of the management server 5 transmits the transmission information received from the second transmission terminal 30 to the reception terminal 2 (step S524).
  • the CPU 402 of the receiving terminal 2 receives and records the transmission information from the management server 5 (step S534a).
  • the CPU 402 of the receiving terminal 2 can display the generated superimposed image, for example, in a chat format. As illustrated in FIG. 22, for example, the CPU 402 displays the superimposed image 120 illustrated in FIG. 12C on the content screen 130 that the user of the receiving terminal 2 is viewing. At this time, the CPU 402 displays the superimposed image 120 in a chat format in association with the icon 140 indicating “Mr. C”, for example.
  • the CPU 402 displays, in a chat format, a captured image 142 in which the icon image 103 (hat) selected by the image selection icon 326 (FIG. 6C) is superimposed in association with the icon 141 indicating “Mr. B”, for example. .
  • similarly, the CPU 402 displays the input text characters “This movie is fun!” 144 in a chat format in association with the icon 143 indicating “Mr. A”.
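  • one possible (illustrative) in-memory model of this chat-format display is sketched below; every file name and value is hypothetical:

```python
# Each chat entry pairs a user icon with either a superimposed image or plain text,
# and the receiving terminal renders the list over the content screen 130.
chat_entries = [
    {"user_icon": "icon_mr_c.png", "kind": "image", "content": "superimposed_120.png"},
    {"user_icon": "icon_mr_b.png", "kind": "image", "content": "captured_142_with_hat.png"},
    {"user_icon": "icon_mr_a.png", "kind": "text",  "content": "This movie is fun!"},
]
```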
  • the CPU 602 of the management server 5 returns to step S521 after executing step S524, and repeats the above processing (steps S521 to S524).
  • the scene identification information distribution unit 51 in the management server 5 includes the processing functions of steps S521 and S522 of FIG. 21A as an example.
  • the transmission information distribution unit 52 in the management server 5 includes the processing functions of steps S523 and S524 in FIG. 21B as an example.
  • although the configurations of the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are the same as those illustrated in FIG. 1 of the first embodiment, this embodiment is also applicable to other configurations.
  • for example, the configuration of the transmission terminal 1 or the reception terminal 2 in this embodiment can be the same as that of the first to tenth modifications of the first embodiment (the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) or the reception terminal 2 (the first receiving terminal 40 and the second receiving terminal 50)).
  • FIG. 23A is a diagram schematically illustrating a relationship between data transmitted and received between the transmission terminal 1 and the reception terminal 2 in the embodiment.
  • the transmission terminal 1 transmits scene identification information including the reproduction time (hereinafter referred to as “reproduction time etc. 281”) to the reception terminal 2.
  • the receiving terminal 2 acquires the captured image 283 based on the reproduction time 281 received from the transmitting terminal 1.
  • the transmission terminal 1 transmits the reproduction time 281 and the transmission information 282 to the reception terminal 2.
  • the receiving terminal 2 generates a superimposed image by associating the captured image 283 acquired above with the transmission information 282 based on the reproduction time 281 received together with the transmission information 282 from the transmission terminal 1.
  • in this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 associate the captured image 283 with the transmission information 282 using the reproduction time etc. 281.
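  • as an illustrative sketch of the FIG. 23A scheme (field names and values are assumptions), the two messages share the reproduction time as their only common key:

```python
# First message: scene identification information ("reproduction time etc. 281").
scene_identification_message = {
    "broadcast_station_id": "B001",
    "program_id": "P101",
    "reproduction_time": "18:15:03",
}
# Second message: the transmission information 282, reusing the same key.
transmission_message = {
    "reproduction_time": "18:15:03",
    "transmission_info": "Com001.bmp",
}
# The receiving terminal captures the frame (captured image 283) when the first
# message arrives and later matches the second message to it by the identical
# reproduction time.
```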
  • FIG. 23B to FIG. 23E are diagrams schematically illustrating a relationship of data transmitted and received between the transmission terminal 1 and the reception terminal 2 in different embodiments.
  • the transmission terminal 1 issues an ID number and transmits scene identification information (hereinafter referred to as “ID number 284”) including the issued ID number to the reception terminal 2.
  • the receiving terminal 2 acquires the captured image 283 when receiving the ID number 284 from the transmitting terminal 1, and associates the acquired captured image 283 with the received ID number 284.
  • the transmission terminal 1 transmits the ID number 284 issued above and the transmission information 282 to the reception terminal 2. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by associating the captured image 283 acquired above with the transmission information 282.
  • in this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can associate the captured image 283 with the transmission information 282 using the ID number 284 issued by the transmission terminal 1.
  • alternatively, the transmission terminal 1 issues an ID number associated with the reproduction time and the like, and transmits to the receiving terminal 2 scene identification information including the reproduction time and the ID number corresponding to the reproduction time (hereinafter referred to as “reproduction time etc. 281” and “ID number 284”, respectively).
  • the receiving terminal 2 acquires the captured image 283 based on the reproduction time 281 received from the transmitting terminal 1, and associates the acquired captured image 283 with the received ID number 284.
  • the transmission terminal 1 transmits the ID number 284 and the transmission information 282 issued above to the reception terminal 2. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by associating the captured image 283 acquired in the above with the transmission information 282.
  • in this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can associate the captured image 283 with the transmission information 282 using the ID number 284 issued by the transmission terminal 1.
  • the transmission terminal 1 transmits scene identification information including a capture instruction (hereinafter, “capture instruction 285”) to the reception terminal 2.
  • the receiving terminal 2 acquires the captured image 283 upon receiving the capture instruction 285 from the transmission terminal 1, issues an ID number, associates the issued ID number with the acquired captured image 283, and transmits the issued ID number to the transmission terminal 1. That is, the capture instruction 285 transmitted by the transmission terminal 1 can be data (for example, a predetermined code such as “TRG”) that functions as a trigger for causing the reception terminal 2 to acquire a captured image.
  • the transmission terminal 1 transmits the ID number 284 and the transmission information 282 received from the reception terminal 2 to the reception terminal 2. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by associating the captured image 283 acquired in the above with the transmission information 282.
  • in this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can associate the captured image 283 with the transmission information 282 using the ID number 284 issued by the reception terminal 2.
  • the transmission terminal 1 transmits scene identification information including a capture instruction (hereinafter, referred to as “capture instruction 285”) to the management server 5.
  • the management server 5 issues an ID number when receiving the capture instruction 285 from the transmission terminal 1 and transmits the issued ID number to both the transmission terminal 1 and the reception terminal 2. That is, the capture instruction 285 transmitted by the transmission terminal 1 can be data (for example, a predetermined code such as “CAP”) that functions as a trigger for causing the management server 5 to issue an ID number.
  • the receiving terminal 2 acquires the captured image 283 when the ID number 284 is received from the management server 5, and associates the acquired captured image 283 with the received ID number 284.
  • the transmission terminal 1 transmits the ID number 284 received from the management server 5 and the transmission information 282 to the reception terminal 2 via the management server 5. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1 via the management server 5, the reception terminal 2 generates a superimposed image by associating the captured image 283 acquired above with the transmission information 282.
  • in this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can associate the captured image 283 with the transmission information 282 using the ID number 284 issued by the management server 5.
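  • regardless of which party issues the ID number (the transmission terminal in FIGS. 23B and 23C, the reception terminal in FIG. 23D, or the management server in FIG. 23E), the association mechanism is the same; the following illustrative sketch (class and variable names are assumptions) shows that mechanism:

```python
import itertools

class IdIssuer:
    """Issues sequential ID numbers such as "ID001", "ID002", ... ."""
    def __init__(self, prefix="ID"):
        self._counter = itertools.count(1)
        self._prefix = prefix

    def issue(self):
        return "%s%03d" % (self._prefix, next(self._counter))

issuer = IdIssuer()                                  # any of the three parties may own this
capture_id = issuer.issue()                          # travels with the scene identification
captured_images = {capture_id: "R_Cap003.bmp"}       # receiving side stores the frame under it
transmission = {"id_number": capture_id, "transmission_info": "Com001.bmp"}
matched_image = captured_images[transmission["id_number"]]   # association by ID number alone
```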
  • FIG. 24 is a diagram illustrating an example of a system configuration of an information transmission system according to the fourth embodiment.
  • the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via a network N so as to be able to communicate with each other.
  • the second transmission terminal 30 includes a quiz information creation unit 181 and a quiz information transmission unit 161 instead of the transmission information transmission unit 16 and the transmission information reception unit 18 illustrated in FIG. 1 of the first embodiment.
  • the receiving terminal 2 includes an image composition unit 241 and a quiz information receiving unit 261 instead of the image superimposing unit 24 and the transmission information receiving unit 26 illustrated in FIG. 1 of the first embodiment.
  • the quiz information creation unit 181 in the second transmission terminal 30 receives quiz setting input related to the captured image displayed by the capture image display unit 19 in the second transmission terminal 30 and generates quiz information.
  • the quiz information transmission unit 161 in the second transmission terminal 30 transmits the quiz information generated by the quiz information creation unit 181 in the second transmission terminal 30 to the reception terminal 2.
  • the quiz information receiving unit 261 in the receiving terminal 2 receives, from the second transmission terminal 30, the quiz information generated by the quiz information creation unit 181 in the second transmission terminal 30 in response to the setting input.
  • the image combining unit 241 in the receiving terminal 2 combines the quiz information received by the quiz information receiving unit 261 in the receiving terminal 2 and the captured image acquired by the receiving side capture image acquiring unit 25 in the receiving terminal 2 to generate a combined image. Generate.
  • the image display unit 22 in the receiving terminal 2 displays the synthesized image generated by the image synthesizing unit 241 in the receiving terminal 2 and causes the user of the receiving terminal 2 to visually recognize the synthesized image.
  • combining two or more images is a superordinate concept of superposing two or more images. That is, image composition is a broad concept including image superposition.
  • the quiz information transmission unit 161, the quiz information creation unit 181, the image composition unit 241, and the quiz information reception unit 261 shown in FIG. 24 are each realized as a function of a CPU executing a program.
  • FIG. 25 and FIG. 27 are diagrams each showing an example of a flowchart of the processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits quiz information as transmission information to the reception terminal 2 in the information transmission system of the present embodiment.
  • FIG. 26 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 28 is a diagram illustrating an example when a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 29 is a diagram illustrating an example when a quiz setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 30 is a diagram illustrating an example when a quiz screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 31 is a diagram illustrating an example of the transmission information data 305 c held as quiz information in the second transmission terminal 30.
  • each process other than steps S511a, S514a, S516a, S517a, S534a, S535a, and S536a is the same as the process shown in FIG. In the following, description will be made focusing on the differences between FIG. 25 and FIG.
  • in step S511a, the CPU 302 of the second transmission terminal 30 determines whether or not the quiz creation button has been pressed by the user. For example, when the CPU 302 detects that the quiz creation button 313 (FIG. 26) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip or stylus, it determines that the quiz creation button has been pressed.
  • when it is determined that the quiz creation button 313 has been pressed (Yes in step S511a), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20 as in the first embodiment (step S512).
  • FIG. 27 is a diagram illustrating an example of a flowchart of quiz information creation processing.
  • the CPU 302 of the second transmission terminal 30 displays an image selection screen on the touch panel display 301 (step S621).
  • FIG. 28 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • a partial selection button 2801 for selecting a part of the image and an entire selection button 2802 for selecting the entire image are displayed.
  • a selection frame 2811 for selecting a part of the image is displayed on the image. Note that the user can select a desired image portion by, for example, dragging the frame of the selection frame 2811 to change the shape and size.
  • in FIG. 28, it is assumed that a character's face portion is selected in the selection frame 2811.
  • the CPU 302 of the second transmission terminal 30 determines whether or not completion of selection has been instructed on the image selection screen (step S622). For example, when the CPU 302 detects that the “Next” button 2813 (FIG. 28) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip or stylus, it determines that completion of selection has been instructed on the image selection screen (step S622, Yes). When the “return” button 2812 (FIG. 28) is pressed, the CPU 302 returns to step S511 and displays the previous display screen (FIG. 26).
  • in step S623 of FIG. 27, the CPU 302 of the second transmission terminal 30 displays a quiz setting screen on the touch panel display 301.
  • FIG. 29 is a diagram illustrating an example when a quiz setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • the quiz setting screen shown in FIG. 29 displays a setting field 2901 related to “question type”, a setting field 2902 related to “answer format”, and a setting field 2903 related to “subject” for each quiz to be created.
  • for example, a “question” format that uses a complete sentence as the question text, or a “fill-in” format that uses a question text with a missing part, can be set as the question format of the quiz.
  • the setting item related to “question type” is not limited to “question” or “fill-in”.
  • the setting items related to the question format can be, for example, extracted from data stored in advance or acquired from another device via the Internet.
  • for example, a “4 choices” format in which an answer is selected from four choices, a “2 choices” format in which an answer is selected from two choices, or a “character input” format in which an answer is entered as characters can be set as the answer format of the quiz.
  • the setting item related to “answer format” is not limited to “4 choices”, “2 choices”, or “character input”.
  • the setting items related to “answer format” can be extracted from data stored in advance, or can be acquired from another device via the Internet.
  • The keyword to be used as the subject of the quiz is extracted based on, for example, metadata associated with the position of the captured image in the content. For this reason, keywords highly relevant to the captured image can be presented as candidates for the subject.
  • The keywords related to the subject can be extracted from data stored in advance or acquired from another device via the Internet.
  • The CPU 302 of the second transmission terminal 30 determines whether or not setting completion is instructed on the quiz setting screen (step S624). For example, when the CPU 302 detects that the "quiz creation" button 2912 (FIG. 29) displayed on the touch panel display 301 of the second transmission terminal 30 is pressed with the user's fingertip or a stylus, the CPU 302 determines that setting completion is instructed on the quiz setting screen (step S624, Yes determination). If the "return" button 2911 is pressed, the CPU 302 returns to step S621 and displays the previous image selection screen (FIG. 28).
  • In step S625 of FIG. 27, the CPU 302 of the second transmission terminal 30 generates a quiz screen based on the various setting data (question format, answer format, and subject) set on the quiz setting screen (FIG. 29) described above, and displays the generated quiz screen on the touch panel display 301.
  • The quiz screen can be generated using a quiz generation program (not shown) that automatically generates a quiz screen from the input setting data (question format, answer format, and subject).
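As a rough illustration only (not part of the disclosed embodiments), a quiz generation program of this kind could assemble the quiz screen as a small HTML page from the setting data. All function and parameter names below are hypothetical.

```python
# Minimal sketch of assembling a quiz page (e.g. "Quiz001.html") from the setting data.
# The captured image itself is deliberately not embedded in the generated page.
from html import escape

def build_quiz_html(question_text, options, question_format="question", answer_format="4 choices"):
    """Return an HTML string describing the quiz screen."""
    items = "\n".join(f"    <li>{escape(opt)}</li>" for opt in options)
    return (
        "<html><body>\n"
        f"  <!-- question format: {escape(question_format)} / answer format: {escape(answer_format)} -->\n"
        f"  <p>{escape(question_text)}</p>\n"
        f"  <ol>\n{items}\n  </ol>\n"
        "</body></html>"
    )

page = build_quiz_html("What is the name of this character?",
                       ["John", "George", "Paul", "Apple"])
```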
  • FIG. 30 is a diagram illustrating an example when a quiz screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 30 shows a created quiz in which the question format is "question" ("What is the name of this character?" 3002), the answer format is "4 choices" ("1. John", "2. George", "3. Paul", "4. Apple" 3003), and the subject is "character".
  • The selection image 3001 displayed on the quiz screen is the same as the image selected by the user in step S621.
  • The quiz screen can also be generated based on, for example, metadata associated with the position of the captured image in the content.
  • For example, the CPU 302 identifies, from the metadata, the name of the character appearing at the time of the captured image and sets it as the correct answer of the quiz. Further, the CPU 302 can extract information for the other options of the quiz from a dictionary or obtain it from another device via the Internet.
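For illustration only, choosing the correct answer from the scene metadata and filling the remaining choices from a prestored name dictionary might look like the following sketch; the field names are assumptions, not the terminal's actual data model.

```python
# Sketch: build the answer options for the quiz from scene metadata plus a name dictionary.
import random

def build_options(scene_metadata, name_dictionary, num_choices=4):
    correct = scene_metadata["character_name"]            # taken from the content metadata
    distractors = [name for name in name_dictionary if name != correct]
    options = random.sample(distractors, num_choices - 1) + [correct]
    random.shuffle(options)
    return options, options.index(correct)

options, correct_index = build_options({"character_name": "George"},
                                       ["John", "Paul", "Apple", "Ringo"])
```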
  • The user can correct the contents of the quiz displayed on the quiz screen by pressing the "correct" button 3012 on the quiz screen.
  • When the "correct" button 3012 is pressed, the quiz screen shifts to the correction mode, and correction input of each character and correction of the position, size, range, or the like of an image or a figure can be performed as appropriate (step S626, Yes determination). If the "return" button 3011 is pressed, the CPU 302 returns to step S625 and displays the quiz setting screen (FIG. 29), which is the previous screen.
  • In the correction mode, it is desirable that the correct option cannot be modified.
  • For example, the display mode may be changed so that the correct option "2. George" cannot be modified and can be distinguished from the other options (for example, the correct option is displayed in a different color from the other options).
  • the user can complete the quiz screen by pressing the “complete” button 3013 (No in step S626).
  • the data for configuring the quiz screen is held in the flash memory 305 as quiz information, for example. Note that the quiz information does not include the captured image data itself.
  • If the CPU 302 determines that the quiz screen has been completed by finishing the quiz information creation process in step S514a, the CPU 302 associates the quiz information data with the scene identification information of the captured image captured in step S502 and records it, for example, in the transmission information data 305c of the flash memory 305 (step S516a).
  • For example, the CPU 302 associates the data of the broadcast station ID 111 "B001", the program ID 112 "P101", and the reproduction time 113 "18:15:03" with the actual data "(Quiz001.html)" of the quiz information 114a, and records them in the transmission information data 305c.
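A minimal sketch of such a record is shown below, with assumed field names; it carries only the scene identification information and the file name of the quiz information, never the captured image data itself.

```python
# Sketch of one record of the transmission information data (cf. record 1100).
from dataclasses import dataclass

@dataclass
class TransmissionInfoRecord:
    broadcast_station_id: str   # item 111, e.g. "B001"
    program_id: str             # item 112, e.g. "P101"
    reproduction_time: str      # item 113, e.g. "18:15:03"
    info_file: str              # item 114a, e.g. the html file describing the quiz

record_1100 = TransmissionInfoRecord("B001", "P101", "18:15:03", "Quiz001.html")
```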
  • the captured image data itself is not recorded in the transmission information data 305c.
  • Note that the data format of the actual data of the quiz information 114a may be a format other than html (for example, CSS (Cascading Style Sheets)).
  • the CPU 302 of the second transmission terminal 30 transmits the quiz information associated with the scene identification information in the above step S516a to the reception terminal 2 (step S517a).
  • Specifically, the CPU 302 transmits the data of the record 1100 of the transmission information data 305c illustrated in FIG. 31. Therefore, the captured image data itself with which the quiz information is associated is not transmitted to the receiving terminal 2.
  • In step S534a in FIG. 25, the CPU 402 of the receiving terminal 2 receives the quiz information from the second transmission terminal 30 and records it.
  • the CPU 402 receives the data of the record 1100 of the transmission information data 305c shown in FIG. 31 from the second transmission terminal and records it in the transmission information data 405c of the external connection type hard disk drive 411.
  • the transmission information data 405c of the receiving terminal 2 is the same as that shown in FIG. 31 in the second transmitting terminal 30.
  • the CPU 402 of the receiving terminal 2 generates a quiz image by synthesizing the capture image recorded in the capture image data 405b in step S533 and the quiz information received in step S534a (step S535a).
  • the CPU 402 of the receiving terminal 2 displays the generated quiz image (step S536a). As illustrated in FIG. 32, for example, the CPU 402 causes the quiz image 121 to be superimposed and displayed on the content screen 130 being viewed by the user of the receiving terminal 2.
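As an illustrative sketch only, the receiving-side composition of steps S535a and S536a could be approximated as follows, assuming the quiz information has been reduced to plain text and using the Pillow imaging library rather than any component named in the disclosure.

```python
# Sketch: overlay the received quiz text on the image the receiving terminal captured itself.
from PIL import Image, ImageDraw

def compose_quiz_image(captured_image_path, quiz_text):
    base = Image.open(captured_image_path).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
    draw = ImageDraw.Draw(overlay)
    # semi-transparent band at the bottom of the captured scene for the quiz text
    draw.rectangle((0, base.height - 60, base.width, base.height), fill=(0, 0, 0, 160))
    draw.text((10, base.height - 50), quiz_text, fill=(255, 255, 255, 255))
    return Image.alpha_composite(base, overlay)

quiz_image = compose_quiz_image("captured_scene.png", "Q1: What is the name of this character?")
quiz_image.save("quiz_image.png")   # this image is then shown over the content screen 130
```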
  • As a modification of the image selection shown in step S621 (FIG. 27), instead of an image, a sound reproduced together with the image, characters recognizable on the image, or the like may be selected. In this case, a quiz relating to the sound, the characters, or the like is created.
  • An image or icon (e.g., a thumbnail image) may also be used as an answer option.
  • In that case, the image to be used as an option is specified by its reproduction time and coordinate position; information on the reproduction time and coordinate position of each image is included in the quiz information, and the receiving terminal 2 identifies and displays each image based on this information.
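A minimal sketch of such option descriptors, with hypothetical key names, is shown below; the receiving terminal crops each option region from a frame it captured itself at the indicated reproduction time.

```python
# Sketch: images used as answer options are described only by time and coordinates.
option_descriptors = [
    {"reproduction_time": "18:15:03", "box": (120, 40, 280, 200)},  # (left, top, right, bottom)
    {"reproduction_time": "18:16:10", "box": (300, 60, 460, 220)},
]

def crop_option(captured_frame, descriptor):
    """captured_frame is a PIL.Image captured locally for descriptor['reproduction_time']."""
    return captured_frame.crop(descriptor["box"])
```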
  • In the above description of step S514a, an example has been described in which the user creates the quiz screen by selecting an image (step S621), inputting quiz settings (step S623), and correcting the quiz screen (step S625).
  • However, the CPU 302 of the second transmission terminal 30 may automatically create the quiz screen. In this case, the user can create a quiz simply by pressing the "quiz creation" button 313 in FIG. 26.
  • the CPU 302 of the second transmission terminal 30 may automatically execute at least one of image selection (step S621), quiz setting input (step S623), and quiz screen correction (step S625). .
  • In step S621, the image selected in the selection frame 2811 is used as it is for the quiz, but a processed version of the selected image or of the entire image may be used for the quiz instead.
  • an image on which edge detection is performed, an image on which another image is superimposed as shown in the first embodiment, or the like can be used for the quiz.
  • In this case, the edge detection range and position information, or the information on the image to be superimposed, is included in the quiz information and transmitted to the receiving terminal 2.
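For illustration, the quiz information could describe the processing to be applied rather than carrying processed pixels; the keys below are assumptions, and Pillow's built-in edge filter merely stands in for whatever edge detection the terminals actually use.

```python
# Sketch: the receiving terminal applies the described processing to its own captured frame.
from PIL import ImageFilter

processing_info = {"type": "edge", "region": (120, 40, 280, 200)}

def apply_processing(captured_frame, info):
    if info["type"] == "edge":
        region = captured_frame.crop(info["region"]).filter(ImageFilter.FIND_EDGES)
        captured_frame.paste(region, info["region"][:2])
    return captured_frame
```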
  • an advertisement corresponding to the theme of the quiz may be displayed on the quiz screen.
  • the advertisement may be set manually by the user or automatically by the CPU 302.
  • the link information of the advertisement to be displayed may be included in the quiz information, and the receiving terminal 2 may display the advertisement based on the link information.
  • In the fourth embodiment described above, the transmission information is quiz information.
  • In the fifth embodiment, advertisement information is created instead of quiz information.
  • That is, the transmission information displayed in association with a captured image is advertisement information to be viewed by the receiving-side user.
  • functional units or elements that are common to the first to fourth embodiments are given the same reference numerals, and duplicate descriptions thereof are omitted.
  • FIG. 33 is a diagram illustrating an example of a system configuration of an information transmission system according to the fifth embodiment.
  • In FIG. 33, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via a network N so as to be able to communicate with each other.
  • the first transmission terminal 20 includes an advertisement information creation unit 182 and an advertisement information transmission unit 162 in place of the transmission information transmission unit 16 and the transmission information reception unit 18 illustrated in FIG. 1 of the first embodiment.
  • the receiving terminal 2 includes an image composition unit 242 and an advertisement information receiving unit 262 instead of the image superimposing unit 24 and the transmission information receiving unit 26 illustrated in FIG. 1 of the first embodiment.
  • the advertisement information creation unit 182 in the second transmission terminal 30 receives advertisement setting input related to the capture image displayed by the capture image display unit 19 in the second transmission terminal 30 and generates advertisement information.
  • the advertisement information transmission unit 162 in the second transmission terminal 30 transmits the advertisement information generated by the advertisement information creation unit 182 in the second transmission terminal 30 to the reception terminal 2.
  • The advertisement information receiving unit 262 in the receiving terminal 2 receives, from the second transmission terminal 30, the advertisement information generated by the advertisement information creation unit 182 in the second transmission terminal 30 in response to the setting input.
  • the image combining unit 242 in the receiving terminal 2 combines the advertisement information received by the advertisement information receiving unit 262 in the receiving terminal 2 and the captured image acquired by the receiving side capture image acquiring unit 25 in the receiving terminal 2 to generate a combined image. Generate.
  • the image display unit 22 in the receiving terminal 2 displays the synthesized image generated by the image synthesizing unit 242 in the receiving terminal 2 so that the user of the receiving terminal 2 can visually recognize the synthesized image.
  • Each of the advertisement information transmission unit 162, the advertisement information creation unit 182, the image composition unit 242, and the advertisement information reception unit 262 shown in FIG. 33 is realized as a function of the CPU executing a program.
  • FIG. 34 and FIG. 36 are diagrams each showing an example of a flowchart of processing in the case where the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits advertisement information as transmission information to the reception terminal 2 in the information transmission system of the present embodiment.
  • FIG. 35 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 37 is a diagram illustrating an example when a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 38 is a diagram illustrating an example when an advertisement setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 39 is a diagram illustrating an example when an advertisement screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • In FIG. 34, each process other than steps S511b, S514b, S516b, S517b, S534b, S535b, and S536b is the same as the corresponding process described in the above embodiments. In the following, the description focuses on the differences.
  • The CPU 302 of the second transmission terminal 30 determines whether or not the advertisement creation button has been pressed by the user (step S511b). For example, when the CPU 302 detects that the advertisement creation button 314 (FIG. 35) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip or a stylus, the CPU 302 determines that the advertisement creation button has been pressed.
  • If it is determined that the advertisement creation button 314 has been pressed (step S511b, Yes determination), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20 as in the first embodiment (step S512).
  • FIG. 36 is a diagram illustrating an example of a flowchart of the advertisement information creation process.
  • the CPU 302 of the second transmission terminal 30 displays an image selection screen on the touch panel display 301 (step S631).
  • FIG. 37 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • A partial selection button 2801 for selecting a part of the image and an entire selection button 2802 for selecting the entire image are displayed on the image selection screen shown in FIG. 37.
  • a selection frame 2811 for selecting a part of the image is displayed on the image as in the fourth embodiment.
  • The CPU 302 of the second transmission terminal 30 determines whether or not selection completion is instructed on the image selection screen (step S632). For example, when the CPU 302 detects that the "Next" button 2813 (FIG. 37) displayed on the touch panel display 301 of the second transmission terminal 30 is pressed with the user's fingertip or a stylus, the CPU 302 determines that selection completion is instructed on the image selection screen (step S632, Yes determination).
  • In step S633 of FIG. 36, the CPU 302 of the second transmission terminal 30 displays an advertisement setting screen on the touch panel display 301.
  • FIG. 38 is a diagram illustrating an example when an advertisement setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • the advertisement setting screen shown in FIG. 38 displays a setting field 3701 related to “product selection”, a setting field 3702 related to “purchase information”, and a setting field 3703 related to “review information” for the advertisement to be created.
  • In the setting field 3701 regarding "product selection", for example, a list of works released by a person related to the captured image in FIG. 37 is displayed so as to be selectable.
  • the person related to the captured image can be specified from, for example, metadata associated with the position of the captured image on the content.
  • For example, when the person on the captured image is a singer, a music CD released by the singer is displayed in the product selection field.
  • The information on the music CDs released by the singer can be extracted from data stored in advance or acquired from another device via the Internet, for example.
  • the setting item regarding “product selection” is not limited to the music CD.
  • In the setting field 3702 regarding "purchase information", information related to the purchase destination of the product displayed in the product selection field is displayed in a selectable manner.
  • the setting item related to “purchase information” is not limited to “URL1”, “URL2”, or “URL3”.
  • setting items related to “purchase information” can be extracted from data stored in advance, or can be acquired from another device via the Internet.
  • Similarly, the word-of-mouth (review) information can be extracted from data stored in advance or acquired from another device via the Internet.
  • The CPU 302 of the second transmission terminal 30 determines whether or not setting completion is instructed on the advertisement setting screen (step S634). For example, when the CPU 302 detects that the "create advertisement" button 3712 (FIG. 38) displayed on the touch panel display 301 of the second transmission terminal 30 is pressed with the user's fingertip or a stylus, the CPU 302 determines that setting completion is instructed on the advertisement setting screen (step S634, Yes determination). If the "return" button 3711 is pressed, the CPU 302 returns to step S631 and displays the previous image selection screen (FIG. 37).
  • In step S635 of FIG. 36, the CPU 302 of the second transmission terminal 30 generates an advertisement screen based on the various setting data (product selection, purchase information, and review information) set on the advertisement setting screen (FIG. 38) described above, and displays the generated advertisement screen on the touch panel display 301.
  • The advertisement screen can be generated using an advertisement generation program (not shown) that automatically generates an advertisement screen from the input setting data (product selection, purchase information, and review information).
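As a rough sketch only (not the disclosed advertisement generation program), an advertisement page such as "Adv001.html" could be assembled from the three setting values as follows; the markup and names are hypothetical.

```python
# Sketch: assemble a simple advertisement page from product, purchase, and review settings.
from html import escape

def build_advertisement_html(product, purchase_url, review):
    return (
        "<html><body>\n"
        f"  <p>{escape(product)}</p>\n"
        f"  <p><a href=\"{escape(purchase_url)}\">Purchase</a></p>\n"
        f"  <p>{escape(review)}</p>\n"
        "</body></html>"
    )

page = build_advertisement_html("Best 1 (A child)", "https://example.com/url1",
                                "Singing power is amazing!")
```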
  • FIG. 39 is a diagram illustrating an example when an advertisement screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 39 shows a created advertisement in which the product information is "Best 1 (A child)" 3801, the purchase destination information is "URL1" 3802, and the word-of-mouth information is "Singing power is amazing!" 3803.
  • the user can correct the content of the advertisement displayed on the advertisement screen by pressing the “modify” button 3812 on the advertisement screen.
  • When the "modify" button 3812 is pressed, the advertisement screen shifts to the correction mode, and correction input of each character and correction of the position, size, range, or the like of an image or a figure can be performed as appropriate (step S636, Yes determination).
  • If the "return" button 3811 is pressed, the CPU 302 returns to step S633 and displays the advertisement setting screen (FIG. 38), which is the previous screen.
  • the user can complete the advertisement screen by pressing the “complete” button 3813 (No in step S636).
  • data for configuring the advertisement screen is held as advertisement information in, for example, the flash memory 305.
  • The CPU 302 then associates the scene identification information of the captured image captured in step S502 with the advertisement information data and records it, for example, in the transmission information data 305c of the flash memory 305 (step S516b).
  • An example of the transmission information data 305c is basically the same as FIG. 31. However, in this embodiment, instead of the actual data "(Quiz001.html)" of the quiz information 114a, for example, the actual data "(Adv001.html)" of the advertisement information 114b is recorded.
  • the CPU 302 of the second transmission terminal 30 transmits the advertisement information associated with the scene identification information in the above step S516b to the reception terminal 2 (step S517b).
  • Specifically, the CPU 302 transmits the data of the record 1100 of the transmission information data 305c illustrated in FIG. 31. Therefore, the captured image data itself with which the advertisement information is associated is not transmitted to the receiving terminal 2.
  • In step S534b of FIG. 34, the CPU 402 of the receiving terminal 2 receives the advertisement information from the second transmission terminal 30 and records it.
  • the CPU 402 receives the data of the record 1100 of the transmission information data 305c shown in FIG. 31 from the second transmission terminal and records it in the transmission information data 405c of the external connection type hard disk drive 411.
  • the CPU 402 of the receiving terminal 2 generates an advertisement image by combining the captured image recorded in the captured image data 405b in step S533 and the advertisement information received in step S534b (step S535b).
  • the CPU 402 of the receiving terminal 2 displays the generated advertisement image (step S536b). As illustrated in FIG. 40, for example, the CPU 402 causes the advertisement image 122 to be superimposed and displayed on the content screen 130 being viewed by the user of the receiving terminal 2.
  • When the user of the receiving terminal 2 actually purchases a product via the link information (for example, "URL1" 3802) displayed on the advertisement image 122, an affiliate scheme may be introduced in which a reward is paid to the user of the transmission terminal 1 who created the advertisement image 122. This gives the user of the transmission terminal a strong incentive to create advertisements.
  • As a modification of the image selection shown in step S631 (FIG. 36), instead of an image, a sound reproduced together with the image, characters recognizable on the image, or the like may be selected. In this case, an advertisement relating to the sound, the characters, or the like is created.
  • In the above description, an example has been described in which the user creates the advertisement screen by selecting an image (step S631), inputting advertisement settings (step S633), and modifying the advertisement screen (step S635).
  • However, the CPU 302 of the second transmission terminal 30 may automatically create the advertisement screen. In this case, the user can create an advertisement simply by pressing the "Create Advertisement" button 314 in FIG. 35. Further, the CPU 302 of the second transmission terminal 30 may automatically execute at least one of image selection (step S631), advertisement setting input (step S633), and advertisement screen modification (step S635).
  • In the fourth and fifth embodiments described above, the transmission information is quiz information or advertisement information.
  • In the sixth embodiment, the transmission information displayed in association with the captured image is game information for causing the receiving-side user to play a game.
  • the same reference numerals are given to the functional units or elements common to the first to fifth embodiments, and a duplicate description thereof will be omitted.
  • FIG. 41 is a diagram illustrating an example of a system configuration of an information transmission system according to the sixth embodiment.
  • In FIG. 41, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via a network N so as to be able to communicate with each other.
  • the first transmission terminal 20 includes a game information creation unit 183 and a game information transmission unit 163 instead of the transmission information transmission unit 16 and the transmission information reception unit 18 illustrated in FIG. 1 of the first embodiment.
  • the receiving terminal 2 includes an image composition unit 243 and a game information receiving unit 263 instead of the image superimposing unit 24 and the transmission information receiving unit 26 illustrated in FIG. 1 of the first embodiment.
  • the game information creation unit 183 in the second transmission terminal 30 receives game setting input related to the captured image displayed by the capture image display unit 19 in the second transmission terminal 30 and generates game information.
  • the game information transmission unit 163 in the second transmission terminal 30 transmits the game information generated by the game information creation unit 183 in the second transmission terminal 30 to the reception terminal 2.
  • The game information receiving unit 263 in the receiving terminal 2 receives, from the second transmission terminal 30, the game information generated by the game information creation unit 183 in the second transmission terminal 30 in response to the setting input.
  • The image combining unit 243 in the receiving terminal 2 combines the game information received by the game information receiving unit 263 in the receiving terminal 2 and the captured image acquired by the receiving-side capture image acquisition unit 25 in the receiving terminal 2 to generate a combined image.
  • The image display unit 22 in the receiving terminal 2 displays the combined image generated by the image combining unit 243 in the receiving terminal 2 so that the user of the receiving terminal 2 can visually recognize the combined image.
  • the game information transmission unit 163, the game information creation unit 183, the image synthesis unit 243, and the game information reception unit 263 shown in FIG. 41 each include a CPU function realized by a program.
  • FIG. 42 and FIG. 44 are diagrams each showing an example of a flowchart of processing in the case where the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits game information as transmission information to the reception terminal 2 in the information transmission system of the present embodiment.
  • FIG. 43 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 45 is a diagram illustrating an example when a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 46 is a diagram illustrating an example when a game selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • FIG. 47 is a diagram illustrating an example when a game screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • In FIG. 42, each process other than steps S511c, S514c, S516c, S517c, S534c, S535c, and S536c is the same as the corresponding process described in the above embodiments. In the following, the description focuses on the differences.
  • The CPU 302 of the second transmission terminal 30 determines whether or not the game creation button has been pressed by the user (step S511c). For example, when the CPU 302 detects that the game creation button 315 (FIG. 43) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip or a stylus, the CPU 302 determines that the game creation button has been pressed.
  • If it is determined that the game creation button 315 has been pressed (step S511c, Yes determination), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20 as in the first embodiment (step S512).
  • FIG. 44 is a diagram illustrating an example of a flowchart of the game information creation process. In FIG. 44, the CPU 302 of the second transmission terminal 30 displays an image selection screen on the touch panel display 301 (step S641).
  • FIG. 45 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • the image selection screen shown in FIG. 45 displays a partial selection button 2801 for selecting a part of the image and an entire selection button 2802 for selecting the entire image, respectively, as in the fourth embodiment.
  • a selection frame 2811 for selecting a part of the image is displayed on the image as in the fourth embodiment.
  • The CPU 302 of the second transmission terminal 30 determines whether or not selection completion is instructed on the image selection screen (step S642). For example, when the CPU 302 detects that the "next" button 2813 (FIG. 45) displayed on the touch panel display 301 of the second transmission terminal 30 is pressed with the user's fingertip or a stylus, the CPU 302 determines that selection completion is instructed on the image selection screen (step S642, Yes determination).
  • In step S643 of FIG. 44, the CPU 302 of the second transmission terminal 30 displays a game selection screen on the touch panel display 301. FIG. 46 is a diagram illustrating an example when a game selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • the game selection screen shown in FIG. 46 is configured so that the type of game to be created can be selected.
  • For example, the game selection screen is configured such that a game such as "15 puzzle" 4601, "spot the difference" 4602, or "concentration" (memory matching) 4603 can be selected.
  • In step S645 of FIG. 44, the CPU 302 of the second transmission terminal 30 generates a game screen based on the game type ("15 puzzle", "spot the difference", or "concentration") selected on the above-described game selection screen (FIG. 46), and displays the generated game screen on the touch panel display 301.
  • The game screen can be generated using a game generation program (not shown) that automatically generates a game screen from the input game type ("15 puzzle", "spot the difference", or "concentration").
  • FIG. 47 is a diagram illustrating an example when a game screen is displayed on the touch panel display 301 of the second transmission terminal 30.
  • The game screen shown in FIG. 47 shows, for example, a 15-puzzle game in which the captured image is equally divided into 16 blocks, an image 4701 with the lower-right block removed is displayed, and each block is slid to restore the original image.
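For illustration only, dividing the captured image into a 4 x 4 grid and building a shuffled 15-puzzle board could be sketched as follows, using the Pillow imaging library; a real implementation would restrict the shuffle to solvable permutations.

```python
# Sketch: split the captured image into 16 blocks, drop the lower-right one, shuffle the rest.
import random
from PIL import Image

def make_15_puzzle_board(captured_image_path):
    img = Image.open(captured_image_path)
    w, h = img.width // 4, img.height // 4
    blocks = [img.crop((c * w, r * h, (c + 1) * w, (r + 1) * h))
              for r in range(4) for c in range(4)]
    blocks[15] = None                      # the lower-right block becomes the empty slot
    order = list(range(16))
    random.shuffle(order)                  # note: not every permutation is solvable
    board = Image.new(img.mode, (w * 4, h * 4))
    for pos, idx in enumerate(order):
        if blocks[idx] is not None:
            board.paste(blocks[idx], ((pos % 4) * w, (pos // 4) * h))
    return board

board = make_15_puzzle_board("captured_scene.png")
```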
  • The user can correct the content of the game displayed on the game screen by pressing the "correct" button 4712 on the game screen.
  • When the correction button 4712 is pressed, the game screen shifts to the correction mode, and correction input of each character and correction of the position, size, range, or the like of an image or a figure can be performed as appropriate (step S646, Yes determination).
  • If the "return" button 4711 is pressed, the CPU 302 returns to step S643 and displays the previous game selection screen (FIG. 46).
  • data for configuring the game screen is held as game information, for example, in the flash memory 305.
  • The CPU 302 then associates the scene identification information of the captured image captured in step S502 with the game information data and records it, for example, in the transmission information data 305c of the flash memory 305 (step S516c).
  • An example of the transmission information data 305c is basically the same as FIG. 31. However, in this embodiment, instead of the actual data "(Quiz001.html)" of the quiz information 114a, for example, the actual data "(Game001.html)" of the game information 114c is recorded. Note that the data format of the actual data of the game information 114c may be a format other than html (for example, CSS (Cascading Style Sheets) or Flash).
  • Next, the CPU 302 of the second transmission terminal 30 transmits the game information associated with the scene identification information in the above step S516c to the reception terminal 2 (step S517c).
  • Specifically, the CPU 302 transmits the data of the record 1100 of the transmission information data 305c illustrated in FIG. 31. Therefore, the captured image data itself with which the game information is associated is not transmitted to the receiving terminal 2; instead, information such as how the target captured image is to be used to generate the game image is included in the game information and transmitted.
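A minimal sketch of such a game information payload, with hypothetical keys, is shown below; it only tells the receiving terminal how to build the game image from the frame it captures itself.

```python
# Sketch: the game information describes the game, not the pixels of the captured image.
game_information = {
    "scene": {"broadcast_station_id": "B001", "program_id": "P101",
              "reproduction_time": "18:15:03"},
    "game_type": "15_puzzle",
    "grid": [4, 4],
    "empty_block": [3, 3],                          # row, column of the removed block
    "block_order": [5, 1, 2, 3, 0, 6, 7, 4, 8, 9, 10, 11, 12, 13, 14],  # shuffled layout
}
```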
  • In step S534c of FIG. 42, the CPU 402 of the receiving terminal 2 receives the game information from the second transmission terminal 30 and records it.
  • Specifically, the CPU 402 receives the data of the record 1100 of the transmission information data 305c shown in FIG. 31 from the second transmission terminal 30 and records it in the transmission information data 405c of the external connection type hard disk drive 411.
  • The CPU 402 of the receiving terminal 2 generates a game image by combining the captured image recorded in the captured image data 405b in step S533 and the game information received in step S534c (step S535c).
  • the CPU 402 of the receiving terminal 2 displays the generated game image (step S536c). As shown in FIG. 48, for example, the CPU 402 causes the game image 123 to be superimposed and displayed on the content screen 130 that the user of the receiving terminal 2 is viewing.
  • In the above description, an example has been described in which the user creates the game screen by selecting an image (step S641), selecting a game (step S643), and modifying the game screen (step S645).
  • However, the CPU 302 of the second transmission terminal 30 may automatically create the game screen. In this case, the user can create a game simply by pressing the "create game" button 315 in FIG. 43.
  • the CPU 302 of the second transmission terminal 30 may automatically execute at least one of image selection (step S641), game selection (step S643), and game screen correction (step S645).
  • Each process in the receiving terminal 2 can be executed in the background regardless of the program that the user is viewing.
  • For example, when the scene identification information is received from the transmission terminal 1, only the reception-side captured image acquisition unit 25 may be executed, and the image superimposing unit 24 or the image composition unit 241 may be executed when the user views the target content.
  • In addition, the transmission information receiving unit 26, the quiz information receiving unit 261, the advertisement information receiving unit 262, and the game information receiving unit 263 may acquire the transmission information, quiz information, advertisement information, or game information from the transmission terminal only when the scene identification information is received from the transmission terminal 1. This enables efficient processing.
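As an illustrative sketch with hypothetical class and method names, the receiving terminal's event-driven handling described above could look like this:

```python
# Sketch: capture the scene and fetch the transmission information only when
# scene identification information actually arrives from the transmission terminal.
class ReceivingTerminal:
    def __init__(self, capture_unit, info_client):
        self.capture_unit = capture_unit   # captures a frame for a given scene id (background)
        self.info_client = info_client     # fetches quiz/advertisement/game info on demand

    def on_scene_identification(self, scene_id):
        captured = self.capture_unit.capture(scene_id)
        info = self.info_client.fetch(scene_id)
        return self.compose(captured, info)

    def compose(self, captured, info):
        # combine the locally captured image with the received transmission information
        return {"image": captured, "info": info}
```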
  • Each of the transmission terminal 1, the first transmission terminal 20, the second transmission terminal 30, and the reception terminal 2 may be a recorder device, a tablet terminal, or a smartphone.
  • transmission information may be created in advance according to content.
  • For example, when the transmission terminal 1 is a broadcaster's server, the transmission information corresponding to the acquired content can be transmitted, and the scene identification information recorded in the transmission information may be automatically transmitted to the receiving terminal 2.
  • information such as quizzes, advertisements, and games for the user of the receiving terminal can be distributed on the broadcast provider side in accordance with the reproduction of the content, and entertainment in content distribution can be enhanced.
  • the functional blocks shown in FIGS. 1, 14 to 17, 18A to 18E, 19, and 24 are realized by processing of a CPU that executes software. However, some or all of them can be realized by hardware such as a logic circuit. Note that it is possible to cause the OS to further execute a part of the processing of the program.
  • the present application is useful for an information transmission system that transmits information input at a transmission terminal to a reception terminal.


Abstract

According to the present invention, a transmission terminal (1) receives an instruction to capture content. The transmission terminal (1) captures a specific scene of the content and acquires a capture image. The transmission terminal (1) transmits scene identification information for the capture image to a reception terminal (2). The transmission terminal (1) receives the input of message information. The transmission terminal (1) transmits the message information to the reception terminal (2). The reception terminal (2) receives the scene identification information. The reception terminal (2) captures the specific scene on the basis of the scene identification information and acquires the capture image. The reception terminal (2) receives the message information from the transmission terminal (1). The reception terminal (2) composites the message information with the capture image.

Description

Transmission terminal, reception terminal, and information transmission method
 The present invention relates to an information transmission method for transmitting information input at a transmission terminal to a reception terminal.
 In recent years, video sharing services that allow users to communicate with each other via a video, by displaying each user's comments as an overlay on the video or as a list below the video, have attracted attention. However, because such video sharing services are premised on displaying text-based comments in correspondence with the playback time of the video, it is difficult to attach a comment that directly designates a person or object shown in a specific scene of the video.
 If a comment is to be attached by directly designating a person or object shown in a specific scene of the video, it is conceivable that the transmitting-side user terminal captures the specific scene of the video, superimposes characters, figures, or the like entered by freehand handwriting input or text input on the captured image as a comment, and transmits the captured image with the superimposed comment to a server or to the receiving-side user terminal.
 However, transmitting data including a captured image of a specific scene of a video, which is a copyrighted work, from the transmitting-side user terminal to a server or to the receiving-side user terminal may infringe rights included in the copyright of the copyright holder, such as the right of public transmission or the right to make the work transmittable (for example, Article 23, Paragraph 1 and Article 2, Paragraph 1, Item 9-5 of the Japanese Copyright Act). There is also a problem that a load is placed on the communication network when the data amount of the captured image on which characters, figures, and the like are superimposed as a comment is large.
 By the way, for the purpose of solving the problems related to copyright infringement of videos and the restriction of communication processing capability, a video viewing system is known in which the transmitting-side user terminal transmits a comment and the elapsed time at the time of comment input to a server, and the receiving-side user terminal receives the comment and the elapsed time from the server and displays the comment corresponding to the elapsed time on the screen during reproduction of the video (for example, see Patent Document 1).
JP 2009-94653 A (Patent Document 1)
 However, since Patent Document 1 displays a comment in association with the elapsed time in a specific scene of the content on the assumption that the receiving-side user terminal holds the content, the comment from the transmitting-side user cannot be displayed in association with the elapsed time in the specific scene of the content when, for example, the receiving-side user terminal has not recorded the broadcast program. That is, in such a case, the transmitting-side user cannot accurately convey information to the receiving-side user using the comment and the specific scene of the content.
 Also, as in Patent Document 1, it is troublesome for the receiving-side user to record, at the receiving-side user terminal, all the content on which the transmitting-side user comments, and to keep the recorded content.
 Furthermore, since Patent Document 1 assumes that the receiving-side user views the broadcast program after the broadcast has ended, the receiving-side user cannot check the transmitting-side user's comments on the broadcast program in real time during the broadcast.
 The present invention has been made in view of the above points, and an object thereof is to provide an information transmission method that, on the premise of avoiding copyright infringement and reducing the network load, can convey transmission information including a comment from the transmitting-side user to the receiving-side user accurately and in real time, and can eliminate the troublesomeness of recording content at the receiving-side user terminal and of holding the recorded content.
 A transmission terminal disclosed below includes a reproduction unit that reproduces moving image content, a display unit that displays the moving image content reproduced by the reproduction unit, a memory that captures an image indicating one of the scenes of the displayed moving image content, a reception unit that receives input of transmission information related to the captured image, a processor that generates or acquires scene specifying information that is information specifying the image captured in the memory, and a transmission unit that transmits the transmission information and the scene specifying information specifying the image related to the transmission information to a receiving terminal. The receiving terminal obtains the moving image content reproduced by the reproduction unit from another source different from the transmission terminal, captures an image indicating the scene specified by the scene specifying information received from the transmission unit out of the obtained moving image content, and displays the transmission information received from the transmission unit together with the captured image.
 A receiving terminal disclosed below includes a reproduction unit that reproduces moving image content, a display unit that displays the moving image content reproduced by the reproduction unit, a receiving unit that receives, from a transmission terminal, transmission information and scene specifying information specifying an image related to the transmission information, and a memory that captures an image indicating the scene specified by the scene specifying information received from the transmission terminal out of the displayed moving image content. The display unit displays the transmission information received from the transmission terminal together with the captured image. The transmission terminal obtains the moving image content reproduced by the reproduction unit from another source different from the receiving terminal, captures an image indicating one of the scenes of the obtained moving image content, receives input of transmission information related to the captured image, generates or acquires scene specifying information that is information specifying the captured image, and transmits the transmission information and the scene specifying information specifying the image related to the transmission information to the receiving terminal.
 An information transmission system disclosed below includes a transmission terminal owned by a transmitting user and a receiving terminal that can communicate with the transmission terminal and is owned by a receiving user. The transmission terminal includes a scene identification information transmission unit that transmits, to the receiving terminal, scene identification information identifying a specific scene in content, and a transmission information transmission unit that transmits, to the receiving terminal, transmission information to be displayed in association with a captured image captured based on the scene identification information. The receiving terminal includes a receiving-side captured image acquisition unit that captures a specific scene in content viewable by the receiving user based on the scene identification information received from the transmission terminal to acquire a receiving-side captured image, and an image composition unit that combines the transmission information received from the transmission terminal with the receiving-side captured image.
 An information transmission system disclosed below includes a transmission terminal and a receiving terminal that can communicate with the transmission terminal. The transmission terminal includes a capture instruction reception unit that receives, from a user, a capture instruction for real-time content being viewed by the user, a transmission-side captured image acquisition unit that captures at least one specific scene in the real-time content based on the capture instruction to acquire a transmission-side captured image, a scene identification information transmission unit that transmits, to the receiving terminal, scene identification information identifying the specific scene of the transmission-side captured image, a transmission information reception unit that receives, from the user, input of transmission information to be displayed superimposed on the transmission-side captured image, and a transmission information transmission unit that transmits the transmission information received by the transmission information reception unit to the receiving terminal. The receiving terminal includes a scene identification information receiving unit that receives the scene identification information from the transmission terminal, a receiving-side captured image acquisition unit that captures the specific scene in the real-time content based on the received scene identification information to acquire a receiving-side captured image, a transmission information receiving unit that receives the transmission information from the transmission terminal, and an image superimposing unit that superimposes the received transmission information on the receiving-side captured image.
 An information transmission system disclosed below includes a transmission terminal, a receiving terminal, and a management server that can communicate with the transmission terminal and the receiving terminal. The transmission terminal includes a capture instruction reception unit that receives, from a user, a capture instruction for real-time content being viewed by the user, a transmission-side captured image acquisition unit that captures at least one specific scene in the real-time content based on the capture instruction to acquire a transmission-side captured image, a scene identification information transmission unit that transmits, to the management server, scene identification information identifying the specific scene of the transmission-side captured image, a transmission information reception unit that receives, from the user, input of transmission information to be displayed superimposed on the transmission-side captured image, and a transmission information transmission unit that transmits the transmission information received by the transmission information reception unit to the management server. The management server includes a scene identification information distribution unit that distributes the scene identification information received from the transmission terminal to the receiving terminal, and a transmission information distribution unit that distributes the transmission information received from the transmission terminal to the receiving terminal. The receiving terminal includes a scene identification information receiving unit that receives the scene identification information from the management server, a receiving-side captured image acquisition unit that captures the specific scene in the real-time content based on the received scene identification information to acquire a receiving-side captured image, a transmission information receiving unit that receives the transmission information from the management server, and an image superimposing unit that superimposes the received transmission information on the receiving-side captured image.
 According to the disclosure of the present specification, on the premise of avoiding copyright infringement and reducing the network load, transmission information including a comment from the transmitting-side user can be conveyed to the receiving-side user accurately and in real time, and the troublesomeness of recording content at the receiving-side user terminal and of holding the recorded content can be eliminated.
FIG. 1 is a diagram illustrating an example of the system configuration of the information transmission system according to the first embodiment.
FIG. 2 is a diagram illustrating an example of a hardware configuration in which the first transmission terminal 20 illustrated in FIG. 1 is realized using a CPU.
FIG. 3 is a diagram illustrating an example of a hardware configuration in which the second transmission terminal 30 illustrated in FIG. 1 is realized using a CPU.
FIG. 4 is a diagram illustrating an example of a hardware configuration in which the receiving terminal 2 illustrated in FIG. 1 is realized using a CPU.
FIG. 5 is a diagram illustrating an example of a flowchart showing processing when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits the transmission information of a user viewing real-time content to the receiving terminal 2.
FIG. 6A is a diagram illustrating an example when real-time content is displayed on the display 201 of the first transmission terminal 20.
FIG. 6B is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 6C is a diagram illustrating an example when a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 7A is a diagram illustrating an example of captured image data 205b held in the first transmission terminal 20.
FIG. 7B is a diagram illustrating an example of captured image data 205b held in the first transmission terminal 20.
FIG. 8A is a diagram illustrating an example of scene identification information data 205c held in the first transmission terminal 20.
FIG. 8B is a diagram illustrating an example of scene identification information data 205c held in the first transmission terminal 20.
FIG. 9A is a diagram illustrating an example of captured image data 405b held in the receiving terminal 2.
FIG. 9B is a diagram illustrating an example of captured image data 405b held in the receiving terminal 2.
FIG. 10A is a diagram illustrating an example when transmission information related to the captured image displayed on the touch panel display 301 of the second transmission terminal 30 is input.
FIG. 10B is a diagram illustrating an example of transmission information input by the user on the touch panel display 301 of the second transmission terminal 30.
FIG. 11A is a diagram illustrating an example of transmission information data 305c held in the second transmission terminal 30.
FIG. 11B is a diagram illustrating an example of transmission information data 305c held in the second transmission terminal 30.
FIG. 12A is a diagram illustrating an example of transmission information received by the receiving terminal 2 from the second transmission terminal 30.
FIG. 12B is a diagram illustrating an example when a captured image is displayed on the display 401 of the receiving terminal 2.
FIG. 12C is a diagram illustrating an example of a superimposed image generated by superimposing a captured image and transmission information in the receiving terminal 2.
FIG. 13A is a diagram illustrating an example when a superimposed image is displayed on real-time content on the display 401 of the receiving terminal 2.
FIG. 13B is a diagram illustrating an example when a superimposed image is displayed on a tablet terminal included in the receiving terminal 2.
FIG. 14 is a diagram illustrating an example of the system configuration of the information transmission system when the transmission terminal 1 is configured as a single device.
FIG. 15 is a diagram illustrating an example of the system configuration of the information transmission system when the first transmission terminal 20 includes the transmission information transmission unit 16.
FIG. 16 is a diagram illustrating an example of the system configuration of the information transmission system when the second transmission terminal 30 includes the scene identification information transmission unit 14.
FIG. 17 is a diagram illustrating an example of the system configuration of the information transmission system when the transmission terminal 1 and the reception terminal 2 each receive real-time content from the content distribution server 4.
FIG. 18A is a diagram illustrating an example of the configuration of the receiving terminal when the receiving terminal 2 is configured with two devices.
FIG. 18B is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26.
FIG. 18C is a diagram illustrating an example of the configuration of the receiving terminal when the second receiving terminal 50 includes the scene identification information receiving unit 27.
FIG. 18D is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26 and the second receiving terminal 50 includes the scene identification information receiving unit 27.
FIG. 18E is a diagram illustrating an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the image superimposing unit 24.
FIG. 19 is a diagram illustrating an example of the system configuration of the information transmission system according to the second embodiment.
FIG. 20 is a diagram illustrating an example of a hardware configuration in which the management server 5 illustrated in FIG. 19 is realized using a CPU.
FIG. 21A is a diagram illustrating an example of a flowchart showing processing when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits the transmission information of a user viewing real-time content to the receiving terminal 2 via the management server.
FIG. 21B is a diagram illustrating an example of a flowchart showing processing when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits the transmission information of a user viewing real-time content to the receiving terminal 2 via the management server.
FIG. 22 is a diagram illustrating an example when a superimposed image is displayed on real-time content on the display 401 of the receiving terminal 2.
FIG. 23A is a diagram schematically illustrating the relationship of main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
FIG. 23B is a diagram schematically illustrating the relationship of main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
FIG. 23C is a diagram schematically illustrating the relationship of main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
FIG. 23D is a diagram schematically illustrating the relationship of main data transmitted and received between the transmission terminal 1 and the reception terminal 2.
FIG. 23E is a diagram schematically illustrating the relationship of main data transmitted and received among the transmission terminal 1, the management server 5, and the reception terminal 2.
FIG. 24 is a diagram illustrating an example of the system configuration of the information transmission system according to the fourth embodiment.
FIG. 25 is a diagram illustrating an example of a flowchart showing processing when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits quiz information as transmission information to the receiving terminal 2 in the information transmission system of the fourth embodiment.
FIG. 26 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 27 is a diagram illustrating an example of a flowchart of the quiz information creation process executed by the second transmission terminal 30 in the information transmission system of the fourth embodiment.
FIG. 28 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 29 is a diagram illustrating an example when a quiz setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 30 is a diagram illustrating an example when a quiz screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 31 is a diagram illustrating an example of transmission information data 305c held as quiz information in the second transmission terminal 30.
FIG. 32 is a diagram illustrating an example when the quiz image 121 is superimposed and displayed on real-time content.
FIG. 33 is a diagram illustrating an example of the system configuration of the information transmission system according to the fifth embodiment.
FIG. 34 is a diagram illustrating an example of a flowchart showing processing when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits advertisement information as transmission information to the receiving terminal 2 in the information transmission system of the fifth embodiment.
FIG. 35 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 36 is a diagram illustrating an example of a flowchart of the advertisement information creation process executed by the second transmission terminal 30 in the information transmission system of the fifth embodiment.
FIG. 37 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 38 is a diagram illustrating an example when an advertisement setting screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 39 is a diagram illustrating an example when an advertisement screen is displayed on the touch panel display 301 of the second transmission terminal 30.
FIG. 40 is a diagram illustrating an example when the advertisement image 122 is superimposed and displayed on real-time content.
FIG. 41 is a diagram illustrating an example of the system configuration of the information transmission system according to the sixth embodiment.
41 is a diagram illustrating an example of a system configuration of an information transmission system according to the sixth embodiment. 図42は、第6の実施形態の情報伝達システムにおいて、送信端末1(第1送信端末20および第2送信端末30)が、受信端末2に伝達情報としてのゲーム情報を送信する場合の処理を示すフローチャートの一例を示す図である。FIG. 42 shows processing when the transmitting terminal 1 (the first transmitting terminal 20 and the second transmitting terminal 30) transmits game information as transmitted information to the receiving terminal 2 in the information transmitting system of the sixth embodiment. It is a figure which shows an example of the flowchart shown. 図43は、第2送信端末30のタッチパネルディスプレイ301に表示画面を表示させた場合の一例を示す図である。FIG. 43 is a diagram illustrating an example when a display screen is displayed on the touch panel display 301 of the second transmission terminal 30. 図44は、第5の実施形態の情報伝達システムにおいて、第2送信端末30が実行するゲーム情報作成処理のフローチャートの一例を示す図である。FIG. 44 is a diagram illustrating an example of a flowchart of game information creation processing executed by the second transmission terminal 30 in the information transmission system of the fifth embodiment. 図45は、第2送信端末30のタッチパネルディスプレイ301に画像選択画面を表示させた場合の一例を示す図である。FIG. 45 is a diagram illustrating an example when an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30. 図46は、第2送信端末30のタッチパネルディスプレイ301にゲーム設定画面を表示させた場合の一例を示す図である。FIG. 46 is a diagram illustrating an example when a game setting screen is displayed on the touch panel display 301 of the second transmission terminal 30. 図47は、第2送信端末30のタッチパネルディスプレイ301にゲーム画面を表示させた場合の一例を示す図である。FIG. 47 is a diagram illustrating an example when a game screen is displayed on the touch panel display 301 of the second transmission terminal 30. 図48は、リアルタイムコンテンツ上にゲーム画像123を重畳表示させた場合の一例を示す図である。FIG. 48 is a diagram showing an example when the game image 123 is superimposed and displayed on the real-time content.
Hereinafter, embodiments of the present invention will be specifically described with reference to the drawings.
[1. First Embodiment]
Hereinafter, an information transmission system using a television receiver will be described as an example. A television receiver is a device that receives television broadcast waves and allows a user to view real-time content. In the present embodiment, real-time content refers to video content that is distributed to everyone at once, such as a television broadcast program, and that each viewing user can watch at the same timing. Real-time content is therefore a concept that includes both live and prerecorded broadcast programs.
[1-1. System configuration]
FIG. 1 is a diagram illustrating an example of the system configuration of the information transmission system according to the first embodiment. In this system configuration, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via a network N so that they can communicate with one another. The network N can be realized by, for example, the Internet, a LAN (Local Area Network), or a WAN (Wide Area Network).
The transmission terminal 1 includes a first transmission terminal 20 and a second transmission terminal 30. The first transmission terminal 20 and the reception terminal 2 each receive broadcast waves (for example, terrestrial digital broadcast waves) from the broadcast station 3 and allow their users to view real-time content, that is, broadcast programs.
For example, the first transmission terminal 20 and the reception terminal 2 are television receivers. The first transmission terminal 20 or the reception terminal 2 may also be a device that incorporates a tuner capable of receiving television broadcasts (for example, a disc recording/playback device).
For example, the second transmission terminal 30 is a tablet terminal that can communicate with the first transmission terminal 20. The second transmission terminal 30 may also be a smartphone equipped with a touch panel display or the like.
The first transmission terminal 20 or the reception terminal 2 may also be a device that incorporates a tuner capable of receiving television broadcasts via a network such as cable television (for example, a set-top box or a disc recording/playback device).
Although only one reception terminal 2 that can communicate with the transmission terminal 1 is shown in FIG. 1, a plurality of reception terminals may exist. Likewise, although only one combination of the transmission terminal 1 and the reception terminal 2 is shown in FIG. 1, a plurality of such combinations may exist.
[1-2. Outline of processing]
Hereinafter, an outline of the processing of the information transmission system according to the first embodiment will be described with reference to a functional block diagram. In FIG. 1, the functional units included in the first transmission terminal 20, the second transmission terminal 30, and the reception terminal 2 are shown as functional blocks inside each device.
The first transmission terminal 20 includes a real-time content reception unit 11 (a reproduction unit that reproduces video content), an image display unit 12 (a display unit that displays the video content), a transmission-side buffer unit 13 (a buffer unit that temporarily stores part or all of the video content), a scene identification information transmission unit 14 (a transmission unit that transmits scene specifying information), and a transmission-side capture image acquisition unit 15. The second transmission terminal 30 includes a transmission information transmission unit 16, a capture instruction reception unit 17, a transmission information reception unit 18 (a reception unit that accepts input of transmission information), and a captured image display unit 19. The reception terminal 2 includes a real-time content reception unit 21 (a reproduction unit that reproduces the video content), an image display unit 22 (a display unit that displays the video content), a reception-side buffer unit 23 (a buffer unit that temporarily stores part or all of the video content), an image superimposing unit 24, a reception-side capture image acquisition unit 25, a transmission information reception unit 26, and a scene identification information reception unit 27 (a reception unit that receives the scene specifying information).
The real-time content reception unit 11 of the first transmission terminal 20 receives and reproduces a broadcast program broadcast by the broadcast station 3 as real-time content. The image display unit 12 of the first transmission terminal 20 displays the images (for example, a moving picture) of the content received and reproduced by the real-time content reception unit 11, allowing the user of the transmission terminal 1 to view the content. The transmission-side buffer unit 13 of the first transmission terminal 20 temporarily stores part or all of the content images displayed by the image display unit 12.
In the above, the transmission-side buffer unit 13 temporarily stores part or all of the content images displayed by the image display unit 12, but it may instead temporarily store part or all of the content images output by the real-time content reception unit 11. Alternatively, the transmission-side buffer unit 13 may be arranged between the real-time content reception unit 11 and the image display unit 12 so that the image display unit 12 displays the content images output from the transmission-side buffer unit 13.
The capture instruction reception unit 17 of the second transmission terminal 30 accepts, from a user who is viewing the content, a capture instruction for capturing, at a specific timing, an image of the content displayed by the image display unit 12 of the first transmission terminal 20. Based on the capture instruction accepted by the capture instruction reception unit 17, the transmission-side capture image acquisition unit 15 of the first transmission terminal 20 acquires one or more captured images, for example in a memory, by capturing at least one specific scene in the real-time content. The scene identification information transmission unit 14 of the first transmission terminal 20 transmits to the reception terminal 2 scene identification information, generated or acquired by a processor for example, for identifying the specific scene of the captured image acquired by the transmission-side capture image acquisition unit 15.
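Purely as an illustration, and not part of the original disclosure, the data exchanged among these units can be pictured as three small messages: a capture instruction from the second transmission terminal 30 to the first transmission terminal 20, scene identification information from the first transmission terminal 20 to the reception terminal 2, and transmission information from the second transmission terminal 30 to the reception terminal 2. The Python sketch below models them as plain data classes; every class and field name is an assumption made for this sketch.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CaptureInstruction:
        # Sent from the second transmission terminal to the first when the
        # user requests a capture of the scene currently being displayed.
        pressed_at: str          # wall-clock time of the button press (assumed field)

    @dataclass
    class SceneIdentification:
        # Sent from the first transmission terminal to the reception terminal;
        # identifies one captured frame without sending the frame itself.
        station_id: str          # broadcast station ID, e.g. "B001"
        program_id: str          # program ID, e.g. "P101"
        playback_time: str       # playback time of the frame, e.g. "18:15:03"

    @dataclass
    class TransmissionInfo:
        # Sent from the second transmission terminal to the reception terminal;
        # the user's handwritten strokes or text to overlay on the chosen frame.
        scene: SceneIdentification
        strokes: List[tuple]     # e.g. [(x0, y0, x1, y1), ...] drawing segments
        text: str = ""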
The captured image display unit 19 of the second transmission terminal 30 displays the captured image (for example, a still image) acquired by the transmission-side capture image acquisition unit 15 of the first transmission terminal 20. The transmission information reception unit 18 of the second transmission terminal 30 accepts, from the user, input of transmission information to be displayed superimposed on the captured image displayed by the captured image display unit 19. The transmission information transmission unit 16 of the second transmission terminal 30 transmits the transmission information accepted from the user by the transmission information reception unit 18 to the reception terminal 2.
Like the real-time content reception unit 11 of the first transmission terminal 20, the real-time content reception unit 21 of the reception terminal 2 receives and reproduces a broadcast program broadcast by the broadcast station 3 as real-time content. Like the image display unit 12 of the first transmission terminal 20, the image display unit 22 of the reception terminal 2 displays the images (for example, a moving picture) of the content received and reproduced by the real-time content reception unit 21, allowing the user of the reception terminal 2 to view the content. The reception-side buffer unit 23 of the reception terminal 2 temporarily stores part or all of the content images displayed by the image display unit 22.
In the above, the reception-side buffer unit 23 temporarily stores part or all of the content images displayed by the image display unit 22, but it may instead temporarily store part or all of the content images received by the real-time content reception unit 21. Alternatively, the reception-side buffer unit 23 may be arranged between the real-time content reception unit 21 and the image display unit 22 so that the image display unit 22 displays the content images stored in the reception-side buffer unit 23.
The scene identification information reception unit 27 of the reception terminal 2 receives, from the first transmission terminal 20, the scene identification information for identifying the specific scene of the captured image acquired by the transmission-side capture image acquisition unit 15 of the first transmission terminal 20. Based on the scene identification information received from the first transmission terminal 20, the reception-side capture image acquisition unit 25 of the reception terminal 2 acquires one or more captured images, for example in a memory, by capturing the specific scene in the real-time content.
The transmission information reception unit 26 of the reception terminal 2 receives, from the second transmission terminal 30, the transmission information whose input the transmission information reception unit 18 of the second transmission terminal 30 accepted from the user. The image superimposing unit 24 of the reception terminal 2 generates a superimposed image by superimposing the transmission information received by the transmission information reception unit 26 on the captured image acquired by the reception-side capture image acquisition unit 25. The image display unit 22 of the reception terminal 2 displays the superimposed image generated by the image superimposing unit 24 so that the user of the reception terminal 2 can view it.
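As a minimal sketch of the superimposition step alone, assuming that the transmission information consists of stroke coordinates and optional text and that an ordinary image library such as Pillow is available (none of which is specified by the original text), the image superimposing unit could be approximated as follows. The file names and coordinates are hypothetical.

    from PIL import Image, ImageDraw  # Pillow, assumed to be available

    def superimpose(capture_path: str, strokes, text: str, out_path: str) -> None:
        """Overlay received transmission information on a locally captured frame."""
        frame = Image.open(capture_path).convert("RGB")
        draw = ImageDraw.Draw(frame)
        # Draw the handwritten strokes received from the second transmission terminal.
        for (x0, y0, x1, y1) in strokes:
            draw.line((x0, y0, x1, y1), fill=(255, 0, 0), width=3)
        # Draw any accompanying text near the top-left corner.
        if text:
            draw.text((10, 10), text, fill=(255, 255, 0))
        frame.save(out_path)

    # Hypothetical usage: overlay a short note on the frame captured at 18:15:03.
    # superimpose("R_Cap003.bmp", [(100, 120, 180, 160)], "Look at this!", "overlay.png")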
Each functional unit shown in FIG. 1 (the real-time content reception unit 11, image display unit 12, scene identification information transmission unit 14, transmission-side capture image acquisition unit 15, transmission information transmission unit 16, capture instruction reception unit 17, transmission information reception unit 18, captured image display unit 19, real-time content reception unit 21, image display unit 22, image superimposing unit 24, reception-side capture image acquisition unit 25, transmission information reception unit 26, and scene identification information reception unit 27) includes a CPU function realized by a program. In this specification, "program" includes not only a program that can be executed directly by the CPU but also a source-format program, a compressed program, an encrypted program, and the like.
[1-3. Hardware configuration]
[1-3-1. First transmission terminal 20]
FIG. 2 is a diagram illustrating an example of a hardware configuration in which the first transmission terminal 20 illustrated in FIG. 1 is realized using a CPU. The first transmission terminal 20 includes a display 201, a CPU (Central Processing Unit) 202, a RAM (Random Access Memory) 203, operation buttons 204, a flash memory 205, a ROM (Read Only Memory) 206, a tuner circuit 207, a first communication circuit 208, a second communication circuit 209, and an external connection port 210. These components are connected to one another via an internal bus 200.
The display 201 can display images of real-time content output in accordance with instructions from the CPU 202. The CPU 202 can execute processing based on an OS (Operating System) 206a and a first transmission terminal control program 205a. The RAM 203 can provide an address space to the CPU 202. The operation buttons 204 can accept basic operations of the first transmission terminal 20 (for example, power on/off operations) from the user. The flash memory 205 can hold the first transmission terminal control program 205a.
An externally connected hard disk drive 211 can be connected to the external connection port 210. The external connection port 210 can accept external devices that connect based on, for example, the USB (Universal Serial Bus) standard or the IEEE 1394 standard. Part of the externally connected hard disk drive 211 can function as a ring buffer 203a. The externally connected hard disk drive 211 can also hold the captured image data 205b, the scene identification information data 205c, and the like.
Any or all of the ring buffer 203a, the captured image data 205b, and the scene identification information data 205c may instead be held in the RAM 203, the flash memory 205, an internally connected hard disk drive (not shown), a network-connected hard disk drive (not shown), or a network storage (not shown). For example, an internally connected hard disk drive (not shown) is a hard disk drive connected directly to the internal bus 200 without going through the external connection port 210. For example, a network-connected hard disk drive (not shown) is a hard disk drive connected via a network N such as the Internet. For example, a network storage (not shown) is a file server (not shown) connected via a network N such as the Internet.
The ROM 206 can hold the OS 206a. The tuner circuit 207 can receive the radio waves of a plurality of broadcast programs broadcast by the broadcast station 3.
The first communication circuit 208 can communicate with the reception terminal 2 via the network N (FIG. 1), for example using TCP/IP (Transmission Control Protocol/Internet Protocol). The second communication circuit 209 can communicate with the second transmission terminal 30. The second communication circuit 209 can perform wireless communication based on standards such as IrDA (trademark), Bluetooth (trademark), Wireless USB, or Wi-Fi (trademark), or wired communication based on standards such as USB (Universal Serial Bus) or HDMI (High-Definition Multimedia Interface).
The real-time content reception unit 11, the image display unit 12, the scene identification information transmission unit 14, and the transmission-side capture image acquisition unit 15 of the first transmission terminal 20 shown in FIG. 1 are realized, for example, by executing the first transmission terminal control program 205a (FIG. 2) on the CPU 202. The real-time content received by the real-time content reception unit 11 shown in FIG. 1, or the real-time content displayed by the image display unit 12 shown in FIG. 1, is held, for example, in the ring buffer 203a (FIG. 2) of the externally connected hard disk drive 211. The captured images acquired by the transmission-side capture image acquisition unit 15 shown in FIG. 1 are held, for example, in the captured image data 205b (FIG. 2) of the externally connected hard disk drive 211. The scene identification information identifying the specific scenes of those captured images is held, for example, in the scene identification information data 205c (FIG. 2) of the externally connected hard disk drive 211.
[1-3-2. Second transmission terminal 30]
FIG. 3 is a diagram illustrating an example of a hardware configuration in which the second transmission terminal 30 illustrated in FIG. 1 is realized using a CPU. The second transmission terminal 30 includes a touch panel display 301, a CPU 302, a RAM 303, operation buttons 304, a flash memory 305, a ROM 306, a first communication circuit 307, and a second communication circuit 308.
The touch panel display 301 can display, on its screen, a captured image output in accordance with instructions from the CPU 302. The touch panel display 301 can also accept from the user instruction operations that designate positions on the screen (for example, touch operations with the user's fingertip or a stylus).
The CPU 302 can execute processing based on an OS 306a and a second transmission terminal control program 305a. The RAM 303 can provide an address space to the CPU 302. The operation buttons 304 can accept basic operations of the second transmission terminal 30 (for example, power on/off operations) from the user.
The flash memory 305 can hold the second transmission terminal control program 305a, the captured image data 305b, the transmission information data 305c, and the like. The ROM 306 can hold the OS 306a. The captured image data 305b and the transmission information data 305c may instead be held in the RAM 303.
Like the corresponding circuit of the first transmission terminal 20, the first communication circuit 307 can communicate with the reception terminal 2 via the network N (FIG. 1), for example using TCP/IP. The second communication circuit 308 can communicate with the first transmission terminal 20. Like the corresponding circuit of the first transmission terminal 20, the second communication circuit 308 can perform wireless communication based on standards such as IrDA (trademark), Bluetooth (trademark), Wireless USB, or Wi-Fi (trademark), or wired communication based on standards such as USB or HDMI (trademark).
The transmission information transmission unit 16, the capture instruction reception unit 17, the transmission information reception unit 18, and the captured image display unit 19 of the second transmission terminal 30 shown in FIG. 1 are realized, for example, by executing the second transmission terminal control program 305a (FIG. 3) on the CPU 302. The captured images acquired by the transmission-side capture image acquisition unit 15 shown in FIG. 1 are held, for example, in the captured image data 305b (FIG. 3) of the flash memory 305. The transmission information whose input the transmission information reception unit 18 shown in FIG. 1 accepted from the user is held, for example, in the transmission information data 305c (FIG. 3) of the flash memory 305.
[1-3-3. Reception terminal 2]
FIG. 4 is a diagram illustrating an example of a hardware configuration in which the reception terminal 2 illustrated in FIG. 1 is realized using a CPU. The reception terminal 2 includes a display 401, a CPU 402, a RAM 403, operation buttons 404, a flash memory 405, a ROM 406, a tuner circuit 407, a communication circuit 408, and an external connection port 410. These components are connected to one another via an internal bus 400.
The display 401 can display images of real-time content output in accordance with instructions from the CPU 402. The CPU 402 can execute processing based on an OS 406a and a reception terminal control program 405a. The RAM 403 can provide an address space to the CPU 402. The operation buttons 404 can accept basic operations of the reception terminal 2 (for example, power on/off operations) from the user. The flash memory 405 can hold the reception terminal control program 405a.
An externally connected hard disk drive 411 can be connected to the external connection port 410. The external connection port 410 can accept external devices that connect based on, for example, the USB standard or the IEEE 1394 standard. Part of the externally connected hard disk drive 411 can function as a ring buffer 403a. The externally connected hard disk drive 411 can also hold the captured image data 405b, the transmission information data 405c, and the like.
Any or all of the ring buffer 403a, the captured image data 405b, and the transmission information data 405c may instead be held in the RAM 403, the flash memory 405, an internally connected hard disk drive (not shown), a network-connected hard disk drive (not shown), or a network storage (not shown). For example, an internally connected hard disk drive (not shown) is a hard disk drive connected directly to the internal bus 400 without going through the external connection port 410. For example, a network-connected hard disk drive (not shown) is a hard disk drive connected via a network N such as the Internet. For example, a network storage (not shown) is a file server (not shown) connected via a network N such as the Internet.
The ROM 406 can hold the OS 406a. Like the tuner circuit of the first transmission terminal 20, the tuner circuit 407 can receive the radio waves of a plurality of broadcast programs broadcast by the broadcast station 3. The communication circuit 408 can communicate with the first transmission terminal 20 and the second transmission terminal 30 via the network N (FIG. 1), for example using TCP/IP, as in the case of the first transmission terminal 20 and the second transmission terminal 30.
The real-time content reception unit 21, the image display unit 22, the image superimposing unit 24, the reception-side capture image acquisition unit 25, the transmission information reception unit 26, and the scene identification information reception unit 27 of the reception terminal 2 shown in FIG. 1 are realized, for example, by executing the reception terminal control program 405a (FIG. 4) on the CPU 402. The real-time content received by the real-time content reception unit 21 shown in FIG. 1, or the real-time content displayed by the image display unit 22 shown in FIG. 1, is held, for example, in the ring buffer 403a (FIG. 4) of the externally connected hard disk drive 411. The captured images acquired by the reception-side capture image acquisition unit 25 shown in FIG. 1 are held, for example, in the captured image data 405b (FIG. 4) of the externally connected hard disk drive 411. The transmission information that the transmission information reception unit 26 shown in FIG. 1 received from the second transmission terminal 30 is held, for example, in the transmission information data 405c (FIG. 4) of the externally connected hard disk drive 411.
[1-4. Processing details]
Details of the processing in the present embodiment will be described with reference to FIGS. 5 to 13B. FIG. 5 is a diagram illustrating an example of a flowchart of processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits, to the reception terminal 2, the transmission information of a user viewing real-time content in the information transmission system of the present embodiment.
FIG. 6A is a diagram illustrating an example in which real-time content is displayed on the display 201 of the first transmission terminal 20. FIG. 6B is a diagram illustrating an example in which a display screen is displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 6C is a diagram illustrating an example in which a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30.
FIGS. 7A and 7B are diagrams illustrating examples of the captured image data 205b held in the first transmission terminal 20. FIGS. 8A and 8B are diagrams illustrating examples of the scene identification information data 205c held in the first transmission terminal 20. FIGS. 9A and 9B are diagrams illustrating examples of the captured image data 405b held in the reception terminal 2.
FIG. 10A is a diagram illustrating an example in which transmission information regarding a captured image displayed on the touch panel display 301 of the second transmission terminal 30 is input. FIG. 10B is a diagram illustrating an example of transmission information input by the user on the touch panel display 301 of the second transmission terminal 30. FIGS. 11A and 11B are diagrams illustrating examples of the transmission information data 305c held in the second transmission terminal 30.
FIG. 12A is a diagram illustrating an example of transmission information received by the reception terminal 2 from the second transmission terminal 30. FIG. 12B is a diagram illustrating an example in which a captured image is displayed on the display 401 of the reception terminal 2. FIG. 12C is a diagram illustrating an example of a superimposed image generated by the reception terminal 2 by superimposing transmission information on a captured image.
FIG. 13A is a diagram illustrating an example in which a superimposed image is displayed over real-time content on the display 401 of the reception terminal 2. FIG. 13B is a diagram illustrating an example in which a superimposed image is displayed on a tablet terminal included in the reception terminal 2.
[1-4-1. Capture instruction at the first transmission terminal]
The CPU 202 of the first transmission terminal 20, which is a television receiver, receives a broadcast program broadcast by the broadcast station 3 as real-time content and displays the received real-time content on the display 201. This allows the user of the first transmission terminal 20 to view the real-time content (see, for example, FIG. 6A).
As shown in the flowchart of FIG. 5, the CPU 202 of the first transmission terminal 20 records the real-time content displayed on the display 201 into the ring buffer 203a (FIG. 2) (step S501). The ring buffer 203a can hold, for a predetermined length of time, the images of content displayed or received in the past. In the present embodiment, the ring buffer 203a holds one frame of the content video per predetermined interval (for example, one frame per second) for a predetermined length of time, but it may instead hold every frame of the content video (for example, 30 frames per second) for that length of time.
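A minimal sketch of such a ring buffer, assuming one frame per second is kept for a fixed window; the class name, window length, and simplified time handling are illustrative choices, not the terminal's actual implementation:

    from collections import deque

    class FrameRingBuffer:
        """Keeps the most recent frames, one per second, discarding the oldest."""

        def __init__(self, seconds: int = 60):
            self.frames = deque(maxlen=seconds)  # old entries fall off automatically

        def record(self, playback_time: str, frame_bytes: bytes) -> None:
            # Called roughly once per second with the frame currently on screen.
            self.frames.append((playback_time, frame_bytes))

        def around(self, playback_time: str, span: int = 2):
            # Return the stored frames whose playback time lies within +/- span
            # seconds of the requested time (string playback times kept simple here).
            times = [t for t, _ in self.frames]
            if playback_time not in times:
                return []
            i = times.index(playback_time)
            lo, hi = max(0, i - span), min(len(self.frames), i + span + 1)
            return list(self.frames)[lo:hi]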
The user can use the second transmission terminal 30, which is a tablet terminal, while viewing real-time content on the first transmission terminal 20, which is a television receiver. In the present embodiment, it is assumed that the second transmission terminal 30 has been connected to the first transmission terminal 20 in advance so that the two can communicate.
The CPU 302 of the second transmission terminal 30 determines whether the capture instruction button has been pressed by the user (step S511). For example, when the CPU 302 detects that the capture instruction button 311 (FIG. 6B) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that the capture instruction button has been pressed.
When it determines that the capture instruction button 311 has been pressed (Yes in step S511), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20 (step S512).
Upon receiving the capture instruction data, the CPU 202 of the first transmission terminal 20 acquires a plurality of captured images by capturing a plurality of frames that include the specific scene of the real-time content being displayed on the display 201 at that moment, and records the acquired captured images in the captured image data 205b (step S502). For example, referring to the previously displayed frames recorded in the ring buffer 203a, the CPU 202 records five captured images, shown as records 701 to 705 in FIG. 7A, whose playback times 73 differ by one second each.
As shown in FIG. 7A, the CPU 202 records, for example, the actual data of the captured image 74 in the captured image data 205b in association with the broadcast station ID 71, the program ID 72, and the playback time 73. For example, record 703 in FIG. 7A contains data such as the broadcast station ID 71 "B001", the program ID 72 "P101", the playback time 73 "18:15:03", and the captured image 74 "(S_Cap003.bmp)". In FIG. 7A, the actual data of the captured image 74 is represented by the file name of the image file, such as "(S_Cap001.bmp)".
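To make the layout of the captured image data concrete, the following sketch builds five records like those of FIG. 7A, one per second around a center playback time. The helper function, dictionary keys, and file-naming scheme are assumptions made for this illustration only.

    from datetime import datetime, timedelta

    def build_capture_records(station_id: str, program_id: str,
                              center_time: str, span: int = 2):
        """Create one record per second for center_time +/- span seconds."""
        center = datetime.strptime(center_time, "%H:%M:%S")
        records = []
        for offset in range(-span, span + 1):
            t = (center + timedelta(seconds=offset)).strftime("%H:%M:%S")
            records.append({
                "station_id": station_id,       # broadcast station ID 71
                "program_id": program_id,       # program ID 72
                "playback_time": t,             # playback time 73
                "capture_file": f"S_Cap{offset + span + 1:03d}.bmp",  # captured image 74
            })
        return records

    # Example corresponding to records 701 to 705 of FIG. 7A:
    # build_capture_records("B001", "P101", "18:15:03")
    # -> playback times 18:15:01 ... 18:15:05, files S_Cap001.bmp ... S_Cap005.bmp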
When a transmission standard based on MPEG-2 Systems (Moving Picture Experts Group 2 Systems) is used for the television broadcast, the playback time 73 can be the playback time contained in the header information of a PES (Packetized Elementary Stream) packet.
As shown in FIG. 7A, the CPU 202 records, for example, the captured images of the frames two seconds before and after record 703 as records 701, 702, 704, and 705. By including in the captured images frames from around the time at which the capture instruction data was received from the second transmission terminal 30, and letting the user select any of them, the time lag between the time at which the user pressed the capture instruction button 311 of the second transmission terminal 30 and the time at which the first transmission terminal 20 actually captures the content can be absorbed, as described later. In addition, since the user can select any captured image from the plurality of candidates, the user can reselect the intended image even when the image the user tried to capture differs from the image that was actually captured.
As shown in FIG. 7B, the CPU 202 can issue unique ID numbers (identification numbers) corresponding to the records (records 701 to 705) of the captured image data 205b shown in FIG. 7A and add the issued ID numbers 75 "ID001" to "ID005" to the respective records. In this case, as described later, the reception terminal 2 can identify each captured image based on its ID number.
Next, the CPU 202 of the first transmission terminal 20 transmits the scene identification information of the captured images to the reception terminal 2 (step S503). The scene identification information is information for identifying the specific scenes of the captured images. Specifically, the combination of the broadcast station ID 71, the program ID 72, and the playback time 73 recorded in the captured image data 205b shown in FIG. 7A corresponds to the scene identification information. For example, based on the broadcast station ID 71, the program ID 72, and the playback time 73 in FIG. 7A, the CPU 202 creates the scene identification information data 205c (records 801 to 805) shown in FIG. 8A and transmits it to the reception terminal 2.
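A minimal sketch of step S503 under the same assumptions as the previous sketch: the scene identification information is obtained by stripping the image data from each captured image record and keeping only the identifying fields, after which the records are handed to whatever channel reaches the reception terminal 2 (represented here by a placeholder send function).

    def build_scene_identification(capture_records):
        """Strip the image data, keeping only what identifies each captured scene."""
        return [
            {
                "station_id": r["station_id"],
                "program_id": r["program_id"],
                "playback_time": r["playback_time"],
                # An optional unique ID, as in FIG. 8B, lets the receiver refer
                # back to a specific candidate frame later.
                "id": r.get("id"),
            }
            for r in capture_records
        ]

    def send_scene_identification(capture_records, send):
        # 'send' stands in for whatever carries data to the reception terminal
        # (for example a socket or HTTP client); it is not defined here.
        for record in build_scene_identification(capture_records):
            send(record)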
As shown in FIG. 7B, when the ID numbers 75 have been added to the records (records 701 to 705) of the captured image data 205b, the CPU 202 creates the scene identification information by adding the ID numbers 84 "ID001" to "ID005" to the respective records (records 801 to 805) of the scene identification information data 205c, as shown in FIG. 8B.
Like the first transmission terminal 20, the CPU 402 of the reception terminal 2, which is a television receiver, receives a broadcast program broadcast by the broadcast station 3 as real-time content and displays the received real-time content on the display 401. This allows the user of the reception terminal 2 to view the real-time content.
The CPU 402 of the reception terminal 2 records the real-time content displayed on the display 401 into the ring buffer 403a (FIG. 4), as in the case of the first transmission terminal 20 (step S531). Like the ring buffer 203a of the first transmission terminal 20, the ring buffer 403a can hold one frame of the content video per predetermined interval (for example, one frame per second) for a predetermined length of time.
The CPU 402 of the reception terminal 2 receives, from the first transmission terminal 20, the scene identification information for identifying the specific scenes of the captured images (step S532). For example, the CPU 402 receives all of the records 801 to 805 of the scene identification information data 205c shown in FIG. 8A or FIG. 8B.
Based on the scene identification information received from the first transmission terminal 20, the CPU 402 of the reception terminal 2 acquires a plurality of captured images by capturing a plurality of frames that include the specific scenes indicated by the scene identification information, and records the acquired captured images in the captured image data 405b (step S533). For example, referring to the previously displayed frames recorded in the ring buffer 403a (FIG. 4), the CPU 402 records five captured images, shown as records 901 to 905 in FIG. 9A, whose playback times 93 differ by one second each.
As shown in FIG. 9A, the CPU 402 records, for example, the actual data of the captured image 94 in the captured image data 405b in association with the broadcast station ID 91, the program ID 92, and the playback time 93. For example, record 903 in FIG. 9A contains data such as the broadcast station ID 91 "B001", the program ID 92 "P101", the playback time 93 "18:15:03", and the captured image 94 "(R_Cap003.bmp)". In FIG. 9A, the actual data of the captured image 94 is represented by the file name of the image file, such as "(R_Cap001.bmp)".
Here, as shown in FIG. 9A, the CPU 402 of the reception terminal 2 records the records 901 to 905 indicating five captured images whose playback times 93 differ by one second each. For example, the CPU 402 records, as record 903, the captured image of the frame that was being displayed when the first transmission terminal 20 received the capture instruction data from the second transmission terminal 30. As in the case of the first transmission terminal 20, the CPU 402 also records the captured images of the frames two seconds before and after record 903 as records 901, 902, 904, and 905.
When ID numbers have been added to the records (records 801 to 805) of the scene identification information data 205c received from the first transmission terminal 20, the CPU 402 records the ID numbers assigned to those records (for example, the ID numbers 84 "ID001" to "ID005" shown in FIG. 8B) as the ID numbers 95 "ID001" to "ID005" in association with the respective records (records 901 to 905), as shown in FIG. 9B.
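On the receiving side (steps S532 and S533), each received scene identification record can be matched against the frames held in the reception terminal's own ring buffer. The sketch below reuses the illustrative FrameRingBuffer from the earlier sketch and is likewise only an assumption-based approximation of the processing described above.

    def capture_from_scene_identification(scene_records, ring_buffer: "FrameRingBuffer"):
        """Recreate the sender's captured images from the receiver's own buffer."""
        captured = []
        for rec in scene_records:
            # Look up the locally buffered frame shown at the indicated playback time.
            matches = ring_buffer.around(rec["playback_time"], span=0)
            if not matches:
                continue  # frame already expired from the buffer or never shown
            playback_time, frame_bytes = matches[0]
            captured.append({
                "station_id": rec["station_id"],    # broadcast station ID 91
                "program_id": rec["program_id"],    # program ID 92
                "playback_time": playback_time,     # playback time 93
                "frame": frame_bytes,               # captured image 94 (local copy)
                "id": rec.get("id"),                # ID number 95, if present
            })
        return captured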
 [1-4-2. Input of the transmission information at the second transmission terminal]
 In step S504 of FIG. 5, the CPU 202 of the first transmission terminal 20 transmits the plurality of captured images acquired in step S502 to the second transmission terminal 30. For example, the CPU 202 transmits the captured image data based on the records 701 to 705 shown in FIG. 7A or FIG. 7B to the second transmission terminal 30.
 The CPU 302 of the second transmission terminal 30 records the captured images based on the data acquired from the first transmission terminal 20 in the captured image data 305b of the flash memory 305, and displays them on the touch panel display 301 (step S513). An example of the captured image data 305b of the second transmission terminal 30 is the same as that shown for the first transmission terminal 20 in FIG. 7A or FIG. 7B.
 As shown in FIG. 6C, the CPU 302 displays one of the captured images based on the data acquired from the first transmission terminal 20 in a captured image display area 300 that includes the center of the screen. For example, the captured image displayed by default in the captured image display area 300 can be the captured image corresponding to the record 703 in the captured image data 205b shown in FIG. 7A or FIG. 7B.
 The CPU 302 also displays five thumbnail images 321 to 325 below the captured image display area 300. The thumbnail images 321 to 325 are reduced versions of the captured images corresponding to the records 701 to 705 in the captured image data 205b shown in FIG. 7A or FIG. 7B. When the user selects one of the thumbnail images 321 to 325, the CPU 302 can enlarge the captured image corresponding to the selected thumbnail image and display it in the captured image display area 300.
 By displaying the five thumbnail images 321 to 325 in this way, even if there is a time lag between the time when the user of the transmission terminal 1 presses the capture instruction button 311 of the second transmission terminal 30 and the time when the content is actually captured at the first transmission terminal 20, the user of the transmission terminal 1 can select the captured image of the desired timing. Furthermore, since the user can select an arbitrary captured image from the five thumbnail images 321 to 325, even when the image the user intended to capture differs from the image actually captured at the first transmission terminal 20, the user can reselect the intended image.
 By further increasing the number of displayed thumbnail images and allowing the user to select a desired captured image by a slide operation or the like, the information transmission system of the present embodiment can further improve the operability for the user.
 The CPU 302 of the second transmission terminal 30 accepts, from the user, input of transmission information to be superimposed on the captured image displayed in the captured image display area 300 (step S514). For example, the user can input the transmission information by directly writing on the captured image displayed in the captured image display area 300 with a fingertip, a stylus or the like, or by attaching a selected image to the captured image.
 As shown in FIG. 6C, the CPU 302 of the second transmission terminal 30 displays, for example, a plurality of input operation icons near the captured image display area 300 on the touch panel display 301. For example, the CPU 302 displays a plurality of image selection icons 326 adjacent to the right side of the captured image display area 300. By dragging and dropping one of the image selection icons 326 into the captured image display area 300, the user can input the image of that image selection icon 326 (for example, a hat) as transmission information.
 When an image of an image selection icon 326 is selected, the identification code of the object representing the icon can be included in the transmission information. Information such as the display position, display size, display color or visual effect of the object representing the icon can also be included in the transmission information. In this case, the receiving terminal 2 acquires the icon image based on the identification code of the object representing the icon, so that the icon image data itself need not be transmitted or received.
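 To illustrate the point above, the following is a minimal sketch, assuming a hypothetical IconObject payload, of how an icon can be described by an identification code plus display attributes rather than by its bitmap, so that the receiving side resolves the code against its own local icon assets.

```python
from dataclasses import dataclass

@dataclass
class IconObject:
    """Hypothetical icon entry carried inside the transmission information."""
    object_code: str      # identification code of the icon object, e.g. "HAT_01"
    x: int                # display position (pixels) on the captured image
    y: int
    scale: float = 1.0    # display size factor
    color: str = ""       # optional display color / visual effect hint

# Receiving side: resolve the code against icon images held locally,
# so no bitmap needs to cross the network.
LOCAL_ICON_FILES = {"HAT_01": "icons/hat.png"}  # assumed local asset table

def resolve_icon(icon: IconObject) -> str:
    return LOCAL_ICON_FILES[icon.object_code]

hat = IconObject(object_code="HAT_01", x=120, y=40, scale=1.2)
print(resolve_icon(hat))  # -> "icons/hat.png"
```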
 As shown in FIG. 6C, the CPU 302 also displays tool selection icons 331 to 339, for example adjacent to the right side of the image selection icons 326. By tapping one of the tool selection icons 331 to 339, the user can perform the input operation corresponding to the tapped tool selection icon.
 For example, a user who selects the tool selection icon 331 indicating a pencil tool can perform input with a free-hand curve. A user who selects the tool selection icon 332 indicating a line color/line type change tool can change the color and type of the free-hand curve. A user who selects the tool selection icon 333 indicating an eraser tool can erase input characters, figures and the like.
 A user who selects the tool selection icon 334 indicating a text input tool can input text characters. In this case, the user can enter the text characters using a software keyboard displayed on the touch panel display 301. When the user inputs text characters onto the captured image with the text input tool, it is preferable to include the position information of the input text characters on the captured image in the transmission information. The receiving terminal 2 can then display the text characters at the position based on that position information, so that the transmission information is conveyed accurately.
 A user who selects the tool selection icon 335 indicating a rectangle input tool can input a rectangle. A user who selects the tool selection icon 336 indicating a range selection tool can select an arbitrary range of the captured image.
 A user who selects the tool selection icon 337 indicating a magnifying glass tool can enlarge or reduce the captured image. A user who selects the tool selection icon 338 indicating an ellipse tool can input an ellipse. A user who selects the tool selection icon 339 indicating a color selection tool can apply color to an arbitrary range of the captured image.
 Using the icons and tools described above, the user can input arbitrary transmission information on the captured image display area 300, as shown in FIG. 10A. In this case, the transmission information is displayed superimposed on the captured image in the captured image display area 300.
 For example, as shown in FIG. 10A, the user can input, as transmission information, an enclosing line 101 surrounding the dumbbell 330 that is a display object in the captured image. The user can also input, as transmission information, a free-hand curve 102 reading "I want this!" above the dumbbell 330 in the captured image.
 In step S515 of FIG. 5, the CPU 302 determines whether the user has finished inputting the transmission information. For example, the CPU 302 determines that the input of the transmission information is complete when it detects that the input completion button 312 (FIG. 10A) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus or the like.
 When it determines that the input of the transmission information is complete (step S515, Yes), the CPU 302 associates the input transmission information data with the scene identification information of the captured image on which the transmission information is superimposed, and records them in the transmission information data 305c of the flash memory 305 (step S516).
 For example, the CPU 302 generates the scene identification information by combining the broadcast station ID 71, the program ID 72 and the reproduction time 73 of the record 703 (FIG. 7A) held in the captured image data 305b, and associates the generated scene identification information with the transmission information. Specifically, as shown in FIG. 11A, the CPU 302 records the broadcast station ID 111 "B001", the program ID 112 "P101" and the reproduction time 113 "18:15:03" in the transmission information data 305c in association with the actual data "(Com001.bmp)" of the transmission information 114. Note that the data of the captured image on which the transmission information is superimposed is not itself recorded in the transmission information data 305c.
 As shown in FIG. 7B, when an ID number is attached to each record (records 701 to 705) of the captured image data 305b, the CPU 302 can generate scene identification information including the ID number and associate the generated scene identification information with the transmission information. Specifically, as shown in FIG. 11B, instead of the broadcast station ID 111 "B001", the program ID 112 "P101" and the reproduction time 113 "18:15:03" shown in FIG. 11A, the CPU 302 records the data of the ID number 115 "ID003" in the transmission information data 305c in association with the actual data "(Com001.bmp)" of the transmission information 114.
 The transmission information is still image data, and can be, for example, bitmap image data in a raster format. Alternatively, the still image data of the transmission information can be in a vector format in which the thickness, color and so on of each line can be changed individually.
 Subsequently, the CPU 302 of the second transmission terminal 30 transmits the transmission information associated with the scene identification information in step S516 to the receiving terminal 2 (step S517). For example, the CPU 302 transmits the data of the record 1100 of the transmission information data 305c shown in FIG. 11A or FIG. 11B to the receiving terminal 2. Accordingly, the data of the captured image on which the transmission information is superimposed is not itself transmitted to the receiving terminal 2.
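 As a rough sketch of what is and is not carried in step S517, the following builds a record-1100-style payload in Python: only the annotation still image and a scene reference (the composite key of FIG. 11A or the ID number of FIG. 11B) are sent, never the captured frame itself. The message layout and field names are assumptions for illustration only.

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class TransmissionInfo:
    """Record 1100 style payload (sketch): annotation plus scene reference only."""
    annotation_file: str                           # e.g. "Com001.bmp" (handwriting still image)
    broadcast_station_id: Optional[str] = None     # FIG. 11A style composite key ...
    program_id: Optional[str] = None
    reproduction_time: Optional[str] = None
    id_number: Optional[str] = None                # ... or FIG. 11B style ID number

# FIG. 11A variant: keyed by broadcast station ID, program ID and reproduction time.
rec_a = TransmissionInfo("Com001.bmp", "B001", "P101", "18:15:03")
# FIG. 11B variant: keyed only by the ID number shared with the receiving terminal.
rec_b = TransmissionInfo("Com001.bmp", id_number="ID003")

# The serialized message never contains the captured image R_Cap003.bmp.
payload = json.dumps({k: v for k, v in asdict(rec_b).items() if v is not None})
print(payload)  # {"annotation_file": "Com001.bmp", "id_number": "ID003"}
```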
 [1-4-3. Display of the superimposed image at the receiving terminal 2]
 In step S534 of FIG. 5, the CPU 402 of the receiving terminal 2 receives the transmission information from the second transmission terminal 30 and records it. For example, the CPU 402 receives the data of the record 1100 of the transmission information data 305c shown in FIG. 11A or FIG. 11B from the second transmission terminal 30, and records it in the transmission information data 405c of the externally connected hard disk drive 411. An example of the transmission information data 405c of the receiving terminal 2 is the same as that shown for the second transmission terminal 30 in FIG. 11A or FIG. 11B.
 Subsequently, the CPU 402 of the receiving terminal 2 generates a superimposed image by superimposing the transmission information received in step S534 on the captured image recorded in the captured image data 405b in step S533 (step S535).
 For example, the CPU 402 acquires, from the transmission information data 405c, the still image data of the transmission information (for example, FIG. 12A) representing the enclosing line 101 and the free-hand curve 102 reading "I want this!".
 As shown in FIG. 11A, the actual data "(Com001.bmp)" of the transmission information 114 held in the transmission information data 405c is associated with the broadcast station ID 111 "B001", the program ID 112 "P101" and the reproduction time 113 "18:15:03". The CPU 402 therefore acquires, from the captured image data 405b, the actual data of the captured image associated with the same values as the broadcast station ID 111 "B001", the program ID 112 "P101" and the reproduction time 113 "18:15:03". For example, from the record 903 of FIG. 9A, the CPU 402 acquires the actual data "(R_Cap003.bmp)" of the captured image 94 associated with the broadcast station ID 91 "B001", the program ID 92 "P101" and the reproduction time 93 "18:15:03".
 When the CPU 402 of the receiving terminal 2 receives transmission information including an ID number from the second transmission terminal 30, it acquires the actual data of the captured image from the captured image data 405b based on the ID number included in the transmission information. Specifically, as shown in FIG. 11B, when the transmission information data 405c including the data of the ID number 115 "ID003" is received, the CPU 402 acquires the actual data "(R_Cap003.bmp)" of the captured image associated with the same value as the ID number 115 "ID003" from the record 903 of the captured image data 405b shown in FIG. 9B. In this case, the receiving terminal 2 can identify the captured image based on the ID number alone.
 In this way, the CPU 402 of the receiving terminal 2 can generate the superimposed image 120 shown in FIG. 12C by superimposing, for example, the transmission information (Com001.bmp) shown in FIG. 12A on the captured image (R_Cap003.bmp) shown in FIG. 12B. The superimposed image 120 shown in FIG. 12C matches the image displayed in the captured image display area 300 in FIG. 10A.
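 A minimal sketch of step S535, assuming the Pillow imaging library is available on the receiving side: the captured frame found by the shared scene key (or ID number) is alpha-composited with the annotation still image. Real image files are replaced here by blank in-memory images so that the snippet runs on its own; the function name is hypothetical.

```python
from PIL import Image  # Pillow, assumed available on the receiving terminal side

def generate_superimposed_image(captured: Image.Image, annotation: Image.Image) -> Image.Image:
    """Overlay the transmission information (annotation) on the captured frame."""
    base = captured.convert("RGBA")
    overlay = annotation.convert("RGBA").resize(base.size)
    return Image.alpha_composite(base, overlay)

# Stand-ins for R_Cap003.bmp and Com001.bmp (a transparent layer carrying the drawings).
captured_frame = Image.new("RGBA", (640, 360), (40, 40, 40, 255))
annotation_layer = Image.new("RGBA", (640, 360), (0, 0, 0, 0))

superimposed = generate_superimposed_image(captured_frame, annotation_layer)
# Displaying `superimposed` corresponds to step S536 (e.g. superimposed.show()).
```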
 The CPU 402 of the receiving terminal 2 displays the generated superimposed image (step S536). The CPU 402 causes the superimposed image 120 shown in FIG. 12C to be displayed on the display 401 of the receiving terminal 2. As shown in FIG. 13A, the CPU 402 superimposes the superimposed image 120 on, for example, the content screen 130 that the user of the receiving terminal 2 is viewing.
 When the receiving terminal 2 includes a tablet terminal 40, the superimposed image 120 can be displayed on the touch panel display of the tablet terminal 40, as shown in FIG. 13B. Furthermore, when the tablet terminal 40 of the receiving terminal 2 has the same configuration as the second transmission terminal 30 (FIG. 3), the user of the receiving terminal 2 can send transmission information to the user of the transmission terminal 1. In this case, the user of the transmission terminal 1 and the user of the receiving terminal 2 can exchange transmission information about captured images of the real-time content they are viewing, and can thereby communicate with each other in both directions.
 After executing step S504, the CPU 202 of the first transmission terminal 20 returns to step S501 and repeats the above processing (steps S501 to S504). After executing step S517, the CPU 302 of the second transmission terminal 30 returns to step S511 and repeats the above processing (steps S511 to S517). After executing step S536, the CPU 402 of the receiving terminal 2 returns to step S531 and repeats the above processing (steps S531 to S536).
 [1-5. Modifications]
 [1-5-1. First modification]
 In the information transmission system described above, the transmission terminal 1 includes two devices, namely the first transmission terminal 20 and the second transmission terminal 30; however, the transmission terminal 1 can also be configured as a single device.
 FIG. 14 is a diagram showing an example of the system configuration of the information transmission system when the transmission terminal 1 is configured as a single device. As shown in FIG. 14, the transmission terminal 1 includes the real-time content receiving unit 11, the image display unit 12, the transmission-side buffer unit 13, the scene identification information transmitting unit 14, the transmission-side captured image acquisition unit 15, the transmission information transmitting unit 16, the capture instruction accepting unit 17, the transmission information accepting unit 18 and the captured image display unit 19.
 For example, the transmission information accepting unit 18 can accept transmission information input with a pointing device connected to the transmission terminal 1, such as a pen tablet, a liquid crystal pen tablet, a touch pad, a touch panel, a mouse, an air mouse (a mouse whose attitude in space can be recognized) or a pointer (a pointer capable of designating a position on the display). The functional units of the transmission terminal 1 and the receiving terminal 2 in this modification are otherwise the same as those in FIG. 1 and elsewhere.
 [1-5-2. Second modification]
 In the information transmission system described above, the second transmission terminal 30 includes the transmission information transmitting unit 16; however, the first transmission terminal 20 can instead include the transmission information transmitting unit 16.
 FIG. 15 is a diagram showing an example of the system configuration of the information transmission system when the first transmission terminal 20 includes the transmission information transmitting unit 16. As shown in FIG. 15, the first transmission terminal 20 includes the real-time content receiving unit 11, the image display unit 12, the transmission-side buffer unit 13, the scene identification information transmitting unit 14, the transmission-side captured image acquisition unit 15 and the transmission information transmitting unit 16. The second transmission terminal 30 includes the capture instruction accepting unit 17, the transmission information accepting unit 18 and the captured image display unit 19. The functional units of the first transmission terminal 20, the second transmission terminal 30 and the receiving terminal 2 in this modification are the same as those in FIG. 1 and elsewhere.
 [1-5-3. Third modification]
 In the information transmission system described above, the first transmission terminal 20 includes the scene identification information transmitting unit 14; however, the second transmission terminal 30 can instead include the scene identification information transmitting unit 14.
 FIG. 16 is a diagram showing an example of the system configuration of the information transmission system when the second transmission terminal 30 includes the scene identification information transmitting unit 14. As shown in FIG. 16, the first transmission terminal 20 includes the real-time content receiving unit 11, the image display unit 12, the transmission-side buffer unit 13 and the transmission-side captured image acquisition unit 15. The second transmission terminal 30 includes the scene identification information transmitting unit 14, the transmission information transmitting unit 16, the capture instruction accepting unit 17, the transmission information accepting unit 18 and the captured image display unit 19. The functional units of the first transmission terminal 20, the second transmission terminal 30 and the receiving terminal 2 in this modification are the same as those in FIG. 1 and elsewhere.
 [1-5-4. Fourth modification]
 In the information transmission system described above, each of the transmission terminal 1 and the receiving terminal 2 receives the real-time content from the broadcast station 3; however, each of the transmission terminal 1 and the receiving terminal 2 can instead receive the real-time content from a content distribution server capable of distributing real-time content via the Internet.
 FIG. 17 is a diagram showing an example of the system configuration of the information transmission system when the real-time content is received from a content distribution server. As shown in FIG. 17, the real-time content receiving unit 11 of the first transmission terminal 20 and the real-time content receiving unit 21 of the receiving terminal 2 can receive real-time content from the content distribution server 4. In this case, the real-time content is, for example, content distributed in a streaming format that the users of a plurality of terminals can view simultaneously. The other functional units of the first transmission terminal 20, the second transmission terminal 30 and the receiving terminal 2 in this modification are the same as those in FIG. 1 and elsewhere.
 [1-5-5. Fifth modification]
 In the information transmission system described above, each of the transmission terminal 1 and the receiving terminal 2 refers to the previously displayed frames held in its ring buffer (203a or 403a) and records five captured images whose reproduction times differ by one second. However, either or both of the transmission terminal 1 and the receiving terminal 2 can be configured not to use a ring buffer.
 When the transmission terminal 1 does not use the ring buffer 203a, the scene identification information is created based on the single captured image captured at the time the capture instruction is received from the user. This configuration is preferably adopted when, for example, the time lags among the time at which the user wishes to capture, the time at which the user presses the capture instruction button 311 of the second transmission terminal 30, and the time at which the content is actually captured at the first transmission terminal 20 are all small.
 When the receiving terminal 2 does not use the ring buffer 403a, the superimposed image is generated based on the single captured image captured at the time the scene identification information is received from the transmission terminal 1. This configuration is preferably adopted when, for example, the communication speed between the transmission terminal 1 and the receiving terminal 2 is high.
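 For reference, the role of the ring buffers 203a/403a can be sketched as follows: a fixed-length buffer of recently displayed frames lets a terminal still capture a scene a couple of seconds in the past, even though the capture instruction (or the scene identification information) arrives slightly late. The sketch below is a minimal Python illustration, assuming one frame per second and hypothetical names.

```python
from collections import deque

class FrameRingBuffer:
    """Keeps the last `capacity` displayed frames together with their reproduction times."""
    def __init__(self, capacity: int = 30) -> None:
        self._frames = deque(maxlen=capacity)  # oldest frames are discarded automatically

    def push(self, reproduction_time: str, frame: bytes) -> None:
        self._frames.append((reproduction_time, frame))

    def capture_around(self, reproduction_time: str, span: int = 2):
        """Return the frames within `span` entries before and after the requested time."""
        times = [t for t, _ in self._frames]
        if reproduction_time not in times:
            return []
        i = times.index(reproduction_time)
        return list(self._frames)[max(0, i - span): i + span + 1]

buf = FrameRingBuffer()
for s in range(10):
    buf.push(f"18:15:0{s}", b"")  # placeholder frame data
print([t for t, _ in buf.capture_around("18:15:03")])
# -> ['18:15:01', '18:15:02', '18:15:03', '18:15:04', '18:15:05']
```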
 [1-5-6. Sixth modification]
 In the information transmission system described above, the receiving terminal 2 is configured as a single device; however, the receiving terminal 2 can also be configured to include two devices. FIG. 18A is a diagram showing an example of the configuration of the receiving terminal when the receiving terminal 2 is configured as two devices. As shown in FIG. 18A, the receiving terminal 2 includes a first receiving terminal 40 and a second receiving terminal 50. The transmission terminal may have any of the configurations described above.
 For example, the first receiving terminal 40 is a television receiver having the same hardware configuration as that of FIG. 4. The first receiving terminal 40 may also be a device incorporating a tuner capable of receiving television broadcasts (for example, a disc recording/playback device), or a device incorporating a tuner capable of receiving television broadcasts via a network such as cable television (for example, a set-top box or a disc recording/playback device). The first receiving terminal 40 can communicate with the transmission terminal 1 and the second receiving terminal 50.
 For example, the second receiving terminal 50 is a tablet terminal having the same hardware configuration as that of FIG. 3. The second receiving terminal 50 may also be a smartphone provided with a touch panel display or the like. The second receiving terminal 50 can communicate with the transmission terminal 1 and the first receiving terminal 40.
 As shown in FIG. 18A, the first receiving terminal 40 includes the real-time content receiving unit 21, the image display unit 22, the reception-side buffer unit 23, the reception-side captured image acquisition unit 25 and the scene identification information receiving unit 27. The second receiving terminal 50 includes the image superimposing unit 24, the image display unit (superimposed image display unit) 28 and the transmission information receiving unit 26.
 The image display unit 28 of the second receiving terminal 50 can display the superimposed image generated by the image superimposing unit 24. The image display unit 28 can also display images (for example, moving images) of the content received by the real-time content receiving unit 21, allowing the user of the receiving terminal 2 to view the content. The other functional units of the transmission terminal 1 and the receiving terminal 2 in this modification are the same as those in FIG. 1 and elsewhere.
 [1-5-7. Seventh modification]
 In the information transmission system of the sixth modification described above, the second receiving terminal 50 includes the transmission information receiving unit 26; however, the first receiving terminal 40 can instead include the transmission information receiving unit 26.
 FIG. 18B is a diagram showing an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26. As shown in FIG. 18B, the first receiving terminal 40 includes the real-time content receiving unit 21, the image display unit 22, the reception-side buffer unit 23, the reception-side captured image acquisition unit 25, the scene identification information receiving unit 27 and the transmission information receiving unit 26. The second receiving terminal 50 includes the image superimposing unit 24 and the image display unit 28. The functional units of the transmission terminal 1, the first receiving terminal 40 and the second receiving terminal 50 in this modification are the same as those in FIG. 1, FIG. 18A and elsewhere.
 [1-5-8. Eighth modification]
 In the information transmission system of the sixth modification described above, the first receiving terminal 40 includes the scene identification information receiving unit 27; however, the second receiving terminal 50 can instead include the scene identification information receiving unit 27.
 FIG. 18C is a diagram showing an example of the configuration of the receiving terminal when the second receiving terminal 50 includes the scene identification information receiving unit 27. As shown in FIG. 18C, the first receiving terminal 40 includes the real-time content receiving unit 21, the image display unit 22, the reception-side buffer unit 23 and the reception-side captured image acquisition unit 25. The second receiving terminal 50 includes the scene identification information receiving unit 27, the image superimposing unit 24, the image display unit 28 and the transmission information receiving unit 26. The functional units of the transmission terminal 1, the first receiving terminal 40 and the second receiving terminal 50 in this modification are the same as those in FIG. 1, FIG. 18A and elsewhere.
 [1-5-9. Ninth modification]
 In the information transmission system of the sixth modification described above, the first receiving terminal 40 includes the scene identification information receiving unit 27 and the second receiving terminal 50 includes the transmission information receiving unit 26; conversely, the first receiving terminal 40 can include the transmission information receiving unit 26 and the second receiving terminal 50 can include the scene identification information receiving unit 27.
 FIG. 18D is a diagram showing an example of the configuration of the receiving terminal when the first receiving terminal 40 includes the transmission information receiving unit 26 and the second receiving terminal 50 includes the scene identification information receiving unit 27. As shown in FIG. 18D, the first receiving terminal 40 includes the real-time content receiving unit 21, the image display unit 22, the reception-side buffer unit 23, the reception-side captured image acquisition unit 25 and the transmission information receiving unit 26. The second receiving terminal 50 includes the scene identification information receiving unit 27, the image superimposing unit 24 and the image display unit 28. The functional units of the transmission terminal 1, the first receiving terminal 40 and the second receiving terminal 50 in this modification are the same as those in FIG. 1, FIG. 18A and elsewhere.
 [1-5-10. Tenth modification]
 In the information transmission systems of the sixth to ninth modifications described above, the second receiving terminal 50 includes the image superimposing unit 24; however, the first receiving terminal 40 can instead include the image superimposing unit 24.
 FIG. 18E is a diagram showing an example of the configuration of the receiving terminal when the first receiving terminal 40 of the configuration of FIG. 18B (seventh modification) includes the image superimposing unit 24. As shown in FIG. 18E, the first receiving terminal 40 includes the real-time content receiving unit 21, the image display unit 22, the reception-side buffer unit 23, the reception-side captured image acquisition unit 25, the scene identification information receiving unit 27, the transmission information receiving unit 26 and the image superimposing unit 24. The second receiving terminal 50 includes the image display unit 28. In each of the configurations shown in FIG. 18A, FIG. 18C and FIG. 18D, the first receiving terminal 40 can likewise include the image superimposing unit 24. The functional units of the transmission terminal 1, the first receiving terminal 40 and the second receiving terminal 50 in this modification are the same as those in FIG. 1, FIG. 18A and elsewhere.
 [1-6. Summary]
 As described above, according to the information transmission system disclosed in this embodiment, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) can cause the receiving terminal 2 to display a captured image on which the transmission information is superimposed, without transmitting the captured image itself. Since the transmission terminal 1 need not transmit the captured image to the receiving terminal 2, the amount of transmitted data is suppressed and the network load is reduced.
 Furthermore, according to the information transmission system disclosed in this embodiment, when the transmission terminal 1 transmits the scene identification information of the captured image to the receiving terminal 2 (step S503), the receiving terminal 2 captures the specific scene based on the received scene identification information and records the captured image (step S533). The user of the receiving terminal 2 therefore does not need to record the broadcast program on the receiving terminal 2 in advance, which eliminates the burden on the user of the receiving terminal 2 of recording the content and retaining the recorded content.
 Furthermore, according to the information transmission system disclosed in this embodiment, when the transmission terminal 1 transmits the transmission information associated with the scene identification information to the receiving terminal 2 (step S517), the receiving terminal 2 superimposes the received transmission information on the captured image based on the received scene identification information to generate the superimposed image (step S535). The transmission information from the user of the transmission terminal 1 can therefore be conveyed to the user of the receiving terminal 2 accurately and in real time.
 In the present embodiment, the real-time content receiving unit 11 and the image display unit 12 of the first transmission terminal 20 include, as an example, the processing function of step S501 in FIG. 5. The transmission-side captured image acquisition unit 15 of the first transmission terminal 20 includes, as an example, the processing function of step S502 in FIG. 5. The scene identification information transmitting unit 14 of the first transmission terminal 20 includes, as an example, the processing function of step S503 in FIG. 5.
 In the present embodiment, the capture instruction accepting unit 17 of the second transmission terminal 30 includes, as an example, the processing function of step S511 in FIG. 5. The captured image display unit 19 of the second transmission terminal 30 includes, as an example, the processing function of step S513 in FIG. 5. The transmission information accepting unit 18 of the second transmission terminal 30 includes, as an example, the processing function of step S514 in FIG. 5. The transmission information transmitting unit 16 of the second transmission terminal 30 includes, as an example, the processing function of step S517 in FIG. 5.
 In the present embodiment, the real-time content receiving unit 21 and the image display unit 22 of the receiving terminal 2 include, as an example, the processing function of step S531 in FIG. 5. The scene identification information receiving unit 27 of the receiving terminal 2 includes, as an example, the processing function of step S532 in FIG. 5. The reception-side captured image acquisition unit 25 of the receiving terminal 2 includes, as an example, the processing function of step S533 in FIG. 5. The transmission information receiving unit 26 of the receiving terminal 2 includes, as an example, the processing function of step S534 in FIG. 5. The image superimposing unit 24 of the receiving terminal 2 includes, as an example, the processing function of step S535 in FIG. 5.
 In the present embodiment, the case where the present invention is applied to an information transmission system has been described as an example, but the scope of application of the present invention is not limited to this. For example, the information transmission system of the present invention can be applied to a web system. In this case, the functional units of the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the receiving terminal 2 can be realized by a computer device having a browser function or the like, a tablet PC, a tablet, a smartphone, a mobile phone or the like.
 [2. Second embodiment]
 In the above embodiment, the information transmission system is configured such that the transmission terminal 1 and the receiving terminal 2 communicate with each other via the network N; alternatively, a management server connectable to both the transmission terminal 1 and the receiving terminal 2 via the network N can be provided. In the present embodiment, functional units and elements common to the first embodiment are given the same reference numerals, and duplicate descriptions are omitted.
 [2-1. System configuration]
 FIG. 19 is a diagram showing an example of the system configuration of the information transmission system according to the second embodiment. In this system configuration, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30), the receiving terminal 2a, the receiving terminal 2b, the receiving terminal 2c and the management server 5 are connected via the network N so that they can communicate with one another. The management server 5 is, for example, a computer device.
 Although three receiving terminals, namely the receiving terminal 2a, the receiving terminal 2b and the receiving terminal 2c, are shown in FIG. 19, there may be one receiving terminal or any number of receiving terminals. Likewise, although only one transmission terminal 1 and one management server 5 are shown in FIG. 19, there may be a plurality of each.
 [2-2. Processing overview]
 An overview of the processing of the information transmission system according to the second embodiment is given below with reference to the functional block diagram. The management server 5 includes a scene identification information distribution unit 51 and a transmission information distribution unit 52. The scene identification information distribution unit 51 of the management server 5 distributes the scene identification information received from the transmission terminal 1 to each of the receiving terminal 2a, the receiving terminal 2b and the receiving terminal 2c. The transmission information distribution unit 52 of the management server 5 distributes the transmission information received from the transmission terminal 1 to each of the receiving terminal 2a, the receiving terminal 2b and the receiving terminal 2c.
 In this way, the management server 5 distributes the scene identification information and the transmission information received from the transmission terminal 1 to each of the receiving terminals 2a, 2b and 2c, so that the captured images based on the scene identification information and the transmission information can be shared among the transmission terminal 1, the receiving terminal 2a, the receiving terminal 2b and the receiving terminal 2c. For example, when the management server 5 is an SNS (Social Network Service) server or a chat server, a plurality of users can communicate with one another via the transmission information sent from the transmission terminal.
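 A minimal sketch of the distribution role of the management server 5, assuming hypothetical register/distribute calls and treating both the scene identification information and the transmission information as opaque messages that are fanned out to every registered receiving terminal:

```python
from typing import Callable, Dict, List

Message = Dict[str, str]             # scene identification info or transmission info
Deliver = Callable[[Message], None]  # stand-in for the network send to one terminal

class ManagementServer:
    """Sketch of the scene identification / transmission information distribution units."""
    def __init__(self) -> None:
        self._receivers: List[Deliver] = []
        self.scene_log: List[Message] = []         # corresponds to scene identification data 605b
        self.transmission_log: List[Message] = []  # corresponds to transmission information data 605c

    def register_receiver(self, deliver: Deliver) -> None:
        self._receivers.append(deliver)

    def distribute_scene_info(self, info: Message) -> None:        # steps S521/S522
        self.scene_log.append(info)
        for deliver in self._receivers:
            deliver(info)

    def distribute_transmission_info(self, info: Message) -> None:  # steps S523/S524
        self.transmission_log.append(info)
        for deliver in self._receivers:
            deliver(info)

server = ManagementServer()
for name in ("2a", "2b", "2c"):
    server.register_receiver(lambda msg, n=name: print(f"receiving terminal {n}: {msg}"))
server.distribute_scene_info({"broadcast_station_id": "B001", "program_id": "P101",
                              "reproduction_time": "18:15:03"})
```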
 [2-3. Hardware configuration of the management server 5]
 FIG. 20 is a diagram showing an example of a hardware configuration in which the management server 5 shown in FIG. 19 is realized using a CPU. The management server 5 includes a display 601, a CPU 602, a RAM 603, a keyboard/mouse 604, a hard disk drive 605 and a communication circuit 606.
 The display 601 can display images of various data output in accordance with instructions from the CPU 602. The CPU 602 can execute processing based on an OS (not shown) and a management server control program 605a. The RAM 603 can provide an address space to the CPU 602. The keyboard/mouse 604 can accept operations of the management server 5 from the user.
 The hard disk drive 605 can hold the management server control program 605a, scene identification information data 605b, transmission information data 605c and the like. The management server control program 605a can be, for example, a communication program capable of providing an SNS or a chat service.
 The scene identification information data 605b or the transmission information data 605c may instead be held in the RAM 603. The communication circuit 606 can communicate with the transmission terminal 1, the receiving terminal 2a, the receiving terminal 2b, the receiving terminal 2c and the like via the network N (FIG. 19).
 The scene identification information distribution unit 51 and the transmission information distribution unit 52 constituting the management server 5 shown in FIG. 19 are realized, as an example, by executing the management server control program 605a on the CPU 602. The scene identification information that the scene identification information distribution unit 51 receives from the transmission terminal 1 is held, as an example, in the scene identification information data 605b of the hard disk drive 605. The transmission information that the transmission information distribution unit 52 receives from the transmission terminal 1 is held, as an example, in the transmission information data 605c of the hard disk drive 605.
 [2-4. Processing details]
 Details of the processing in the present embodiment are described with reference to FIG. 21A, FIG. 21B and FIG. 22. FIG. 21A and FIG. 21B show an example of a flowchart of the processing in which, in the information transmission system of the present embodiment, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) conveys the transmission information of a user viewing real-time content to the receiving terminal 2a, the receiving terminal 2b and the receiving terminal 2c (hereinafter simply referred to as the "receiving terminal 2") via the management server. The processing of the flowcharts in FIG. 21A and FIG. 21B is the same as the processing shown in FIG. 5, except that the management server 5 is interposed between the first and second transmission terminals 20 and 30 and the receiving terminal 2. The differences from FIG. 5 are described below.
 The CPU 202 of the first transmission terminal 20 transmits the scene identification information of the captured images to the management server 5 (step S503). The CPU 602 of the management server 5 receives the scene identification information from the first transmission terminal 20 and records it in the scene identification information data 605b of the hard disk drive 605 (step S521). An example of the scene identification information data 605b is the same as FIG. 8A or FIG. 8B.
 The CPU 602 of the management server 5 distributes the scene identification information received from the first transmission terminal 20 to the receiving terminal 2 (step S522). The CPU 402 of the receiving terminal 2 receives the scene identification information for identifying the specific scene of the captured image from the management server 5 (step S532a).
 In step S517a, the CPU 302 of the second transmission terminal 30 transmits the transmission information associated with the scene identification information in step S516 to the management server 5. The CPU 602 of the management server 5 receives the transmission information from the second transmission terminal 30 and records it in the transmission information data 605c of the hard disk drive 605 (step S523). An example of the transmission information data 605c is the same as FIG. 11A or FIG. 11B.
 The CPU 602 of the management server 5 transmits the transmission information received from the second transmission terminal 30 to the receiving terminal 2 (step S524). The CPU 402 of the receiving terminal 2 receives the transmission information from the management server 5 and records it (step S534a).
 In step S536, the CPU 402 of the receiving terminal 2 can display the generated superimposed image in, for example, a chat format. As shown in FIG. 22, the CPU 402 displays, for example, the superimposed image 120 shown in FIG. 12C on the content screen 130 that the user of the receiving terminal 2 is viewing. At this time, the CPU 402 displays the superimposed image 120 in a chat format in association with, for example, the icon 140 indicating "Mr. C".
 Similarly, the CPU 402 displays, in a chat format in association with the icon 141 indicating, for example, "Mr. B", a captured image 142 on which the icon image 103 (hat) selected with the image selection icon 326 (FIG. 6C) is superimposed. When only text characters are input without designating a captured image, the CPU 402 displays the input text characters 144, for example "This movie is interesting!", in a chat format in association with the icon 143 indicating "Mr. A".
 なお、管理サーバ5のCPU602は、上記ステップS524の実行後、上記ステップS521に戻り、上記の処理(ステップS521~S524)を繰り返す。 The CPU 602 of the management server 5 returns to step S521 after executing step S524, and repeats the above processing (steps S521 to S524).
 本実施形態において、管理サーバ5におけるシーン識別情報配信部51は、一例として図21AのステップS521およびS522の処理機能を含む。管理サーバ5における伝達情報配信部52は、一例として図21BのステップS523およびS524の処理機能を含む。 In this embodiment, the scene identification information distribution unit 51 in the management server 5 includes the processing functions of steps S521 and S522 of FIG. 21A as an example. The transmission information distribution unit 52 in the management server 5 includes the processing functions of steps S523 and S524 in FIG. 21B as an example.
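The relay role of the management server 5 in steps S521 to S524 can be pictured with the following minimal sketch. It is not the patented implementation: the class names, the send callback, and the field layout are assumptions introduced only for illustration, and the station and program identifiers are placeholders in the style of the examples used later in this description.

```python
# Minimal sketch of the management server's relay behaviour (steps S521-S524):
# record scene identification information and transmission information as they
# arrive, and forward each item to every reception terminal.

from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class SceneId:
    station_id: str     # e.g. "B001" (placeholder)
    program_id: str     # e.g. "P101" (placeholder)
    playback_time: str  # e.g. "18:15:03" (placeholder)


@dataclass
class TransmissionInfo:
    scene: SceneId
    payload: str        # text, quiz HTML, advertisement HTML, etc.


@dataclass
class ManagementServer:
    # send(terminal_address, message) is an assumed network primitive.
    send: Callable[[str, object], None]
    receivers: List[str] = field(default_factory=list)
    scene_id_data: List[SceneId] = field(default_factory=list)               # cf. 605b
    transmission_data: List[TransmissionInfo] = field(default_factory=list)  # cf. 605c

    def on_scene_id(self, scene: SceneId) -> None:
        """S521/S522: record the scene identification info, then distribute it."""
        self.scene_id_data.append(scene)
        for addr in self.receivers:
            self.send(addr, scene)

    def on_transmission_info(self, info: TransmissionInfo) -> None:
        """S523/S524: record the transmission info, then distribute it."""
        self.transmission_data.append(info)
        for addr in self.receivers:
            self.send(addr, info)


if __name__ == "__main__":
    log = []
    server = ManagementServer(send=lambda addr, msg: log.append((addr, msg)),
                              receivers=["2a", "2b", "2c"])
    scene = SceneId("B001", "P101", "18:15:03")
    server.on_scene_id(scene)                                            # from the first transmission terminal
    server.on_transmission_info(TransmissionInfo(scene, "Nice scene!"))  # from the second transmission terminal
    print(len(log))  # 6 messages: each item fanned out to the three reception terminals
```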
[2-5. Others]
In FIG. 19 described in the present embodiment, the configurations of the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are the same as the configurations described with reference to FIG. 1 of the first embodiment, but the present embodiment is also applicable to other configurations. For example, the configuration of the transmission terminal 1 or the reception terminal 2 in the present embodiment may be any of the configurations of the first to tenth modifications of the first embodiment (transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) or reception terminal 2 (the first reception terminal 40 and the second reception terminal 50)).
[3. Third Embodiment]
[3-1. Transmitted and received data]
[3-1-1. Linking by playback time or the like]
FIG. 23A is a diagram schematically showing the relationship between the data transmitted and received between the transmission terminal 1 and the reception terminal 2 in the above embodiments.
As shown in FIG. 23A, the transmission terminal 1 transmits scene identification information including a playback time or the like (hereinafter "playback time etc. 281") to the reception terminal 2. The reception terminal 2 acquires the captured image 283 based on the playback time etc. 281 received from the transmission terminal 1.
The transmission terminal 1 also transmits the playback time etc. 281 and the transmission information 282 to the reception terminal 2. Based on the playback time etc. 281 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by linking the captured image 283 acquired as described above with the transmission information 282.
Thus, in the above embodiments, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 link the captured image 283 and the transmission information 282 using the playback time etc. 281.
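The playback-time linking of FIG. 23A can be illustrated, from the reception terminal's point of view, with the following sketch. It assumes a simple in-memory dictionary keyed by the playback time and a hypothetical grab_frame helper that reads a frame from the terminal's own copy of the content; neither is prescribed by this description.

```python
# Sketch of linking by playback time at the reception terminal 2: captures are
# stored keyed by the received playback time, and transmission information that
# arrives later with the same key is attached to the matching capture.

from typing import Dict, Optional, Tuple

captures: Dict[str, bytes] = {}  # playback time -> captured image 283 (raw bytes)


def on_scene_identification(playback_time: str, grab_frame) -> None:
    # The reception terminal acquires the captured image 283 for the indicated
    # playback time from its own copy of the content (grab_frame is assumed).
    captures[playback_time] = grab_frame(playback_time)


def on_transmission_info(playback_time: str, info: str) -> Optional[Tuple[bytes, str]]:
    # The capture acquired earlier is looked up by the same playback time and
    # paired with the transmission information 282 for superimposition.
    image = captures.get(playback_time)
    if image is None:
        return None  # no matching capture; nothing to superimpose
    return (image, info)  # stand-in for the actual image superimposition


if __name__ == "__main__":
    on_scene_identification("18:15:03", lambda t: b"<frame at %s>" % t.encode())
    print(on_transmission_info("18:15:03", "Look at this hat!"))
```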
However, the data used to link the captured image 283 and the transmission information 282 is not limited to the playback time etc. 281. FIGS. 23B to 23E are diagrams schematically showing the relationship between the data transmitted and received between the transmission terminal 1 and the reception terminal 2 in other embodiments.
[3-1-2. Linking by an ID number issued by the transmission terminal 1]
As shown in FIG. 23B, the transmission terminal 1 issues an ID number and transmits scene identification information including the issued ID number (hereinafter "ID number 284") to the reception terminal 2. At the time it receives the ID number 284 from the transmission terminal 1, the reception terminal 2 acquires the captured image 283 and associates the acquired captured image 283 with the received ID number 284.
The transmission terminal 1 also transmits the ID number 284 issued as described above and the transmission information 282 to the reception terminal 2. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by linking the captured image 283 acquired as described above with the transmission information 282.
In this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can link the captured image 283 and the transmission information 282 using the ID number 284 issued by the transmission terminal 1.
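A rough sketch of the transmission-terminal side of this ID-based linking follows. The counter, the send stub, and the message layout are assumptions chosen for illustration only; any scheme that attaches the same issued ID number to both the scene identification information and the later transmission information would serve.

```python
# Sketch of the transmission-terminal side of FIG. 23B: issue an ID number,
# send it as scene identification information, and later send the same ID
# together with the transmission information.

import itertools

_id_counter = itertools.count(1)


def send(message: dict) -> None:
    # Stand-in for the network transmission to the reception terminal 2.
    print("sent:", message)


def issue_capture_id() -> int:
    # The transmission terminal issues ID number 284 itself.
    return next(_id_counter)


def notify_capture() -> int:
    cid = issue_capture_id()
    send({"type": "scene_id", "id": cid})  # triggers capture at the reception terminal
    return cid


def send_transmission_info(cid: int, info: str) -> None:
    send({"type": "transmission_info", "id": cid, "info": info})  # same ID links the two


if __name__ == "__main__":
    cid = notify_capture()
    send_transmission_info(cid, "This scene!")
```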
[3-1-3. Linking by an ID number issued by the transmission terminal 1 and a playback time]
As shown in FIG. 23C, the transmission terminal 1 issues an ID number associated with a playback time or the like, and transmits scene identification information including the playback time and the ID number corresponding to that playback time (hereinafter "ID number 284" and "playback time etc. 281") to the reception terminal 2. The reception terminal 2 acquires the captured image 283 based on the playback time etc. 281 received from the transmission terminal 1, and associates the acquired captured image 283 with the received ID number 284.
The transmission terminal 1 also transmits the ID number 284 issued as described above and the transmission information 282 to the reception terminal 2. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by linking the captured image 283 acquired as described above with the transmission information 282.
In this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can link the captured image 283 and the transmission information 282 using the ID number 284 issued by the transmission terminal 1.
[3-1-4. Linking by an ID number issued by the reception terminal 2]
As shown in FIG. 23D, the transmission terminal 1 transmits scene identification information including a capture instruction (hereinafter "capture instruction 285") to the reception terminal 2. At the time it receives the capture instruction 285 from the transmission terminal 1, the reception terminal 2 acquires the captured image 283, issues an ID number, associates the issued ID number with the acquired captured image 283, and transmits the issued ID number to the transmission terminal 1. That is, the capture instruction 285 transmitted by the transmission terminal 1 can be data that functions as a trigger for causing the reception terminal 2 to acquire a captured image (for example, a predetermined code such as "TRG").
The transmission terminal 1 also transmits the ID number 284 received from the reception terminal 2 and the transmission information 282 to the reception terminal 2. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1, the reception terminal 2 generates a superimposed image by linking the captured image 283 acquired as described above with the transmission information 282.
In this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can link the captured image 283 and the transmission information 282 using the ID number 284 issued by the reception terminal 2.
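The reception-terminal side of FIG. 23D can be sketched as follows. The uuid-based ID, the callable stubs, and the dictionary store are illustrative assumptions; the essential point is only that the ID is issued locally, reported back to the transmission terminal 1, and then used to pair the stored capture with the transmission information.

```python
# Sketch of the reception-terminal side of FIG. 23D: capture on receiving the
# capture instruction 285, issue an ID number locally, report it back, and use
# the same ID to pair the capture with the transmission information later.

import uuid
from typing import Callable, Dict

captures: Dict[str, bytes] = {}  # issued ID -> captured image 283


def on_capture_instruction(grab_frame: Callable[[], bytes],
                           reply_to_sender: Callable[[str], None]) -> str:
    image = grab_frame()           # acquire the captured image 283 now
    cid = uuid.uuid4().hex[:8]     # the reception terminal issues the ID number
    captures[cid] = image
    reply_to_sender(cid)           # report the issued ID to the transmission terminal 1
    return cid


def on_transmission_info(cid: str, info: str):
    image = captures.get(cid)
    return None if image is None else (image, info)  # pair for superimposition


if __name__ == "__main__":
    cid = on_capture_instruction(lambda: b"<frame>", lambda i: print("ID sent back:", i))
    print(on_transmission_info(cid, "Great goal!"))
```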
[3-1-5. Linking by an ID number issued by the management server 5]
As shown in FIG. 23E, the transmission terminal 1 transmits scene identification information including a capture instruction (hereinafter "capture instruction 285") to the management server 5. At the time it receives the capture instruction 285 from the transmission terminal 1, the management server 5 issues an ID number and transmits the issued ID number to both the transmission terminal 1 and the reception terminal 2. That is, the capture instruction 285 transmitted by the transmission terminal 1 can be data that functions as a trigger for causing the management server 5 to issue an ID number (for example, a predetermined code such as "CAP").
The reception terminal 2 acquires the captured image 283 at the time it receives the ID number 284 from the management server 5, and associates the acquired captured image 283 with the received ID number 284.
The transmission terminal 1 also transmits the ID number 284 received from the management server 5 and the transmission information 282 to the reception terminal 2 via the management server 5. Based on the ID number 284 received together with the transmission information 282 from the transmission terminal 1 via the management server 5, the reception terminal 2 generates a superimposed image by linking the captured image 283 acquired as described above with the transmission information 282.
In this way, the transmission terminal 1 (the first transmission terminal 20 or the second transmission terminal 30) and the reception terminal 2 can link the captured image 283 and the transmission information 282 using the ID number 284 issued by the management server 5.
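A minimal sketch of the management-server side of FIG. 23E is shown below. The counter, the send stub, and the message fields are assumptions made for illustration; the server's only roles here are to issue the ID number on receiving the capture instruction and to forward the later transmission information.

```python
# Sketch of the management-server side of FIG. 23E: issue an ID number when the
# capture instruction 285 arrives, send it to both ends, and later relay the
# transmission information carrying that ID.

import itertools
from typing import Callable, List

_ids = itertools.count(1000)


def handle_capture_instruction(sender: str, receivers: List[str],
                               send: Callable[[str, dict], None]) -> int:
    cid = next(_ids)                           # the server issues ID number 284
    for addr in [sender] + receivers:          # the ID goes to both ends
        send(addr, {"type": "id", "id": cid})
    return cid


def relay_transmission_info(cid: int, info: str, receivers: List[str],
                            send: Callable[[str, dict], None]) -> None:
    for addr in receivers:                     # forwarded via the server unchanged
        send(addr, {"type": "transmission_info", "id": cid, "info": info})


if __name__ == "__main__":
    sent = []
    send = lambda addr, msg: sent.append((addr, msg))
    cid = handle_capture_instruction("tx1", ["rx2a", "rx2b"], send)
    relay_transmission_info(cid, "Check this scene", ["rx2a", "rx2b"], send)
    print(sent)
```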
[4. Fourth Embodiment]
In the above embodiments, an example was described in which the transmission information is displayed superimposed on the captured image, but the transmission information and the captured image may instead be displayed in association with each other. In the present embodiment, an example is described in which the transmission information displayed in association with the captured image is quiz information for presenting a quiz to the receiving-side user. In the present embodiment, functional units and elements common to the first to third embodiments are given the same reference numerals, and redundant description of them is omitted.
[4-1. System configuration]
FIG. 24 is a diagram showing an example of the system configuration of an information transmission system according to the fourth embodiment. In this system configuration, as in the first embodiment, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via the network N so as to be able to communicate with each other.
[4-2. Processing overview]
An overview of the processing of the information transmission system according to the fourth embodiment is described below with reference to a functional block diagram. Inside the first transmission terminal 20, the second transmission terminal 30, and the reception terminal 2 in FIG. 24, functional block diagrams showing the functional units included in each device are drawn.
The first transmission terminal 20 includes a quiz information creation unit 181 and a quiz information transmission unit 161 in place of the transmission information transmission unit 16 and the transmission information acceptance unit 18 shown in FIG. 1 of the first embodiment. The reception terminal 2 includes an image composition unit 241 and a quiz information reception unit 261 in place of the image superimposition unit 24 and the transmission information reception unit 26 shown in FIG. 1 of the first embodiment.
The quiz information creation unit 181 of the second transmission terminal 30 receives a setting input for a quiz relating to the captured image displayed by the captured image display unit 19 of the second transmission terminal 30, and generates quiz information. The quiz information transmission unit 161 of the second transmission terminal 30 transmits the quiz information generated by the quiz information creation unit 181 of the second transmission terminal 30 to the reception terminal 2.
The quiz information reception unit 261 of the reception terminal 2 receives, from the second transmission terminal 30, the quiz information that the quiz information creation unit 181 of the second transmission terminal 30 generated upon receiving the setting input. The image composition unit 241 of the reception terminal 2 composes the quiz information received by the quiz information reception unit 261 of the reception terminal 2 with the captured image acquired by the receiving-side captured image acquisition unit 25 of the reception terminal 2 to generate a composite image. The image display unit 22 of the reception terminal 2 displays the composite image generated by the image composition unit 241 of the reception terminal 2, allowing the user of the reception terminal 2 to view the composite image. In the present invention, composing two or more images is a superordinate concept of superimposing two or more images; that is, image composition is a broad concept that includes image superimposition.
The quiz information transmission unit 161, the quiz information creation unit 181, the image composition unit 241, and the quiz information reception unit 261 shown in FIG. 24 each include CPU functions realized by a program.
[4-3. Processing details]
Details of the processing in the present embodiment are described with reference to FIGS. 25 to 32. FIGS. 25 and 27 each show an example of a flowchart of the processing performed when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits quiz information as the transmission information to the reception terminal 2 in the information transmission system of the present embodiment.
FIG. 26 is a diagram showing an example of a display screen displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 28 is a diagram showing an example of a captured image of real-time content displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 29 is a diagram showing an example of a quiz setting screen displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 30 is a diagram showing an example of a quiz screen displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 31 is a diagram showing an example of the transmission information data 305c held as quiz information in the second transmission terminal 30.
In the flowchart shown in FIG. 25, the processes other than steps S511a, S514a, S516a, S517a, S534a, S535a, and S536a are the same as the processes shown in FIG. 5. The following description focuses on the differences between FIG. 25 and FIG. 5.
[4-3-1. Creation of quiz information at the second transmission terminal]
In the flowchart of FIG. 25, the CPU 302 of the second transmission terminal 30 determines whether the quiz creation button has been pressed by the user (step S511a). For example, when the CPU 302 detects that the quiz creation button 313 (FIG. 26) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that the quiz creation button has been pressed.
When it determines that the quiz creation button 313 has been pressed (step S511a, Yes), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20, as in the first embodiment (step S512).
After transmitting the capture instruction data to the first transmission terminal 20, the CPU 302 of the second transmission terminal 30 executes quiz information creation processing (step S514a). FIG. 27 is a diagram showing an example of a flowchart of the quiz information creation processing. In the flowchart of FIG. 27, the CPU 302 of the second transmission terminal 30 displays an image selection screen on the touch panel display 301 (step S621).
FIG. 28 is a diagram showing an example of the image selection screen displayed on the touch panel display 301 of the second transmission terminal 30. The image selection screen shown in FIG. 28 displays a partial selection button 2801 for selecting part of the image and a whole selection button 2802 for selecting the entire image. When the partial selection button 2801 is pressed, a selection frame 2811 for selecting part of the image is displayed on the image. The user can select a desired image portion by, for example, dragging the frame of the selection frame 2811 to change its shape and size. In the present embodiment, as shown in FIG. 28, it is assumed that the face of a character has been selected with the selection frame 2811.
The CPU 302 of the second transmission terminal 30 determines whether completion of selection has been instructed on the image selection screen (step S622). For example, when the CPU 302 detects that the "Next" button 2813 (FIG. 28) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that completion of selection has been instructed on the image selection screen (step S622, Yes). When the "Back" button 2812 (FIG. 28) is pressed, the CPU 302 returns to step S511 and displays the previous screen, that is, the display screen (FIG. 26).
In step S623 of FIG. 27, the CPU 302 of the second transmission terminal 30 displays a quiz setting screen on the touch panel display 301. FIG. 29 is a diagram showing an example of the quiz setting screen displayed on the touch panel display 301 of the second transmission terminal 30. The quiz setting screen shown in FIG. 29 displays, for the quiz to be created, a setting field 2901 for the "question format", a setting field 2902 for the "answer format", and a setting field 2903 for the "subject".
For example, in the setting field 2901 for the "question format", a "question" format, which uses a complete sentence as the question text, or a "fill-in-the-blank" format, which uses a sentence with a missing part as the question text, can be set as the question format of the quiz. The setting items for the "question format" are not limited to "question" and "fill-in-the-blank". The setting items for the "question format" can be, for example, extracted from prestored data or acquired from another device via the Internet.
For example, in the setting field 2902 for the "answer format", a "four-choice" format in which the answer is selected from four options, a "two-choice" format in which the answer is selected from two options, or a "text input" format in which the answer is entered as text can be set as the answer format of the quiz. The setting items for the "answer format" are not limited to "four-choice", "two-choice", and "text input". The setting items for the "answer format" can be, for example, extracted from prestored data or acquired from another device via the Internet.
For example, in the setting field 2903 for the "subject", "character", "object", "background", or the like can be set as the keyword to be used as the subject of the quiz. The keyword to be used as the subject of the quiz is extracted based on, for example, metadata associated with the position of the captured image within the content. Keywords highly relevant to the captured image can therefore be presented as subject candidates. The keywords for the subject can be extracted from prestored data or acquired from another device via the Internet.
The CPU 302 of the second transmission terminal 30 determines whether completion of the settings has been instructed on the quiz setting screen (step S624). For example, when the CPU 302 detects that the "Create quiz" button 2912 (FIG. 29) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that completion of the settings has been instructed on the quiz setting screen (step S624, Yes). When the "Back" button 2911 is pressed, the CPU 302 returns to step S621 and displays the previous screen, that is, the image selection screen (FIG. 28).
In step S625 of FIG. 27, the CPU 302 of the second transmission terminal 30 generates a quiz screen based on the various setting data (question format, answer format, and subject) set on the quiz setting screen (FIG. 29) described above, and displays the generated quiz screen on the touch panel display 301. The quiz screen can be generated using a quiz generation program (not shown) that automatically generates a quiz screen from the various setting data (question format, answer format, and subject).
FIG. 30 is a diagram showing an example of the quiz screen displayed on the touch panel display 301 of the second transmission terminal 30. The quiz screen shown in FIG. 30 shows, for example, a quiz created when the question format is "question" ("What is the name of this character?" 3002), the answer format is "four-choice" ("1. John", "2. George", "3. Paul", "4. Ringo" 3003), and the subject is "character" ("What is the name of this character?" 3002). The selected image 3001 displayed on the quiz screen is the same as the image selected by the user in step S621.
The quiz screen can be generated based on, for example, metadata associated with the position of the captured image within the content. For example, the CPU 302 identifies, from the metadata, the name of the character appearing at the time of the captured image and sets it as the correct option of the quiz. The CPU 302 can also extract information for the other options of the quiz from a dictionary or acquire it from another device via the Internet.
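As one way of picturing this step, the following sketch assembles a quiz of the kind shown in FIG. 30 from the setting data and per-scene metadata. The metadata dictionary, the distractor pool, and the option-shuffling logic are assumptions for illustration and stand in for the quiz generation program mentioned above.

```python
# Sketch of building a quiz from the setting data (question format, answer
# format, subject) and scene metadata: the metadata supplies the correct
# answer, a distractor pool supplies the other options.

import random


def build_quiz(metadata: dict, question_format: str, answer_format: str,
               subject: str, distractor_pool: list) -> dict:
    correct = metadata[subject]  # e.g. the character name at the capture time
    if answer_format == "four-choice":
        options = random.sample([n for n in distractor_pool if n != correct], 3) + [correct]
        random.shuffle(options)
    else:
        options = []             # e.g. free text input
    question = (f"What is the name of this {subject}?"
                if question_format == "question"
                else f"This {subject}'s name is ____.")
    return {"question": question, "options": options, "answer": correct}


if __name__ == "__main__":
    meta = {"character": "George"}  # assumed per-scene metadata
    quiz = build_quiz(meta, "question", "four-choice", "character",
                      ["John", "George", "Paul", "Ringo"])
    print(quiz["question"], quiz["options"], "->", quiz["answer"])
```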
On the quiz screen, the user can edit the content of the quiz displayed on the quiz screen by pressing the "Edit" button 3012. For example, when the edit button 3012 is pressed, the quiz screen shifts to an edit mode in which the text can be corrected and the position, size, range, and so on of images or figures can be adjusted as appropriate (step S626, Yes). When the "Back" button 3011 is pressed, the CPU 302 returns to step S625 and displays the previous screen, that is, the quiz setting screen (FIG. 29).
In the edit mode, it is desirable that the correct option cannot be edited. For example, in FIG. 30, the correct option "2. George" may be displayed differently from the other options so that it cannot be edited (for example, the correct option may be displayed in a color different from that of the other options).
The user can complete the quiz screen by pressing the "Complete" button 3013 (step S626, No). Here, the data for constructing the quiz screen is held as quiz information in, for example, the flash memory 305. The quiz information does not include the captured image data itself.
When it determines that the quiz screen has been completed upon termination of the quiz information creation processing in step S514a, the CPU 302 associates the scene identification information of the image captured in step S502 with the quiz information data and records them in, for example, the transmission information data 305c of the flash memory 305 (step S516a).
For example, as shown in FIG. 31, the CPU 302 records, in the transmission information data 305c, the broadcast station ID 111 "B001", the program ID 112 "P101", and the playback time 113 "18:15:03" in association with the actual data "(Quiz001.html)" of the quiz information 114a. As described above, the captured image data itself is not recorded in the transmission information data 305c. The data format of the actual data of the quiz information 114a may be a format other than HTML (for example, CSS (Cascading Style Sheets)).
Subsequently, the CPU 302 of the second transmission terminal 30 transmits the quiz information associated with the scene identification information in step S516a to the reception terminal 2 (step S517a). For example, the CPU 302 transmits the data of record 1100 of the transmission information data 305c shown in FIG. 31 to the reception terminal 2. The data of the captured image with which the quiz information is associated is therefore not transmitted to the reception terminal 2.
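The record transmitted in step S517a can be pictured as a small structure carrying only the scene identification fields and the quiz data, as in the following sketch. The dataclass and the JSON serialisation are illustrative assumptions; the field values mirror the example of FIG. 31.

```python
# Sketch of the record sent in step S517a: scene identification information
# plus the quiz data, with no captured image bytes.

import json
from dataclasses import dataclass, asdict


@dataclass
class QuizRecord:
    broadcast_station_id: str  # 111, e.g. "B001"
    program_id: str            # 112, e.g. "P101"
    playback_time: str         # 113, e.g. "18:15:03"
    quiz_data: str             # 114a, e.g. the contents of "Quiz001.html"
    # Deliberately no image field: the capture is re-acquired on the
    # reception terminal from its own copy of the content.


if __name__ == "__main__":
    record = QuizRecord("B001", "P101", "18:15:03", "<html>Quiz001</html>")
    print(json.dumps(asdict(record)))  # what would travel to the reception terminal
```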
[4-3-2. Display of the quiz image on the reception terminal 2]
In step S534a of FIG. 25, the CPU 402 of the reception terminal 2 receives the quiz information from the second transmission terminal 30 and records it. For example, the CPU 402 receives the data of record 1100 of the transmission information data 305c shown in FIG. 31 from the second transmission terminal and records it in the transmission information data 405c of the externally connected hard disk drive 411. The transmission information data 405c of the reception terminal 2 is the same as that shown in FIG. 31 for the second transmission terminal 30.
Subsequently, the CPU 402 of the reception terminal 2 generates a quiz image by composing the captured image recorded in the captured image data 405b in step S533 with the quiz information received in step S534a (step S535a).
The CPU 402 of the reception terminal 2 displays the generated quiz image (step S536a). As shown in FIG. 32, the CPU 402 displays, for example, the quiz image 121 superimposed on the content screen 130 being viewed by the user of the reception terminal 2.
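Steps S534a to S536a on the reception terminal 2 can be pictured with the following sketch, in which the received quiz record is matched against the locally stored capture by its scene identification key. The in-memory dictionaries and the string placeholder for the composed image are assumptions for illustration.

```python
# Sketch of the reception-terminal side: store the received quiz record, look
# up the locally captured image by the same scene identification, and compose
# the two for display.

from typing import Dict, Optional, Tuple

captured_images: Dict[Tuple[str, str, str], bytes] = {}  # cf. 405b, keyed by scene id
received_quizzes: Dict[Tuple[str, str, str], str] = {}   # cf. 405c


def on_quiz_record(station: str, program: str, time: str, quiz_html: str) -> Optional[str]:
    key = (station, program, time)
    received_quizzes[key] = quiz_html      # S534a: record the quiz info
    image = captured_images.get(key)       # capture stored earlier in S533
    if image is None:
        return None
    # S535a: compose the quiz info with the capture (placeholder for real rendering).
    return f"[quiz '{quiz_html}' over capture of {len(image)} bytes]"


if __name__ == "__main__":
    captured_images[("B001", "P101", "18:15:03")] = b"\x00" * 1024
    print(on_quiz_record("B001", "P101", "18:15:03", "Quiz001.html"))  # S536a would display this
```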
[4-4. Modifications]
As a modification of the image selection shown in step S621 (FIG. 27), instead of an image, the audio being played back together with the image, characters recognizable in the image, or the like may be made selectable. In this case, a quiz relating to the audio, the characters, or the like is created.
As a modification of the quiz screen shown in step S625 (FIG. 27), images or icons from the content (for example, thumbnail images) may be used as the options. In this case, the images used as options are specified by a playback time and a coordinate position; information on the playback time and coordinate position of each image is included in the quiz information, and the reception terminal 2 identifies and displays each image accordingly.
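One possible shape for such an option reference is sketched below: each option carries only a playback time and a coordinate region, and the reception terminal crops the corresponding thumbnail from its own copy of the content. The field names and the crop_frame helper are assumptions for illustration.

```python
# Sketch of a quiz option that refers to an in-content image by playback time
# and coordinate position instead of carrying pixel data.

from dataclasses import dataclass
from typing import Callable


@dataclass
class ImageOptionRef:
    playback_time: str  # where in the content the option image appears
    x: int              # top-left corner of the region, in pixels
    y: int
    width: int
    height: int


def resolve_option(ref: ImageOptionRef,
                   crop_frame: Callable[[str, int, int, int, int], bytes]) -> bytes:
    # The reception terminal crops the thumbnail out of its own content copy.
    return crop_frame(ref.playback_time, ref.x, ref.y, ref.width, ref.height)


if __name__ == "__main__":
    ref = ImageOptionRef("18:15:03", 120, 40, 96, 96)
    fake_crop = lambda t, x, y, w, h: b"\x00" * (w * h)
    print(len(resolve_option(ref, fake_crop)), "bytes of thumbnail")
```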
In the quiz information creation processing of step S514a (FIG. 27), an example was described in which the user creates the quiz screen by selecting an image (step S621), entering the quiz settings (step S623), and editing the quiz screen (step S625), but the CPU 302 of the second transmission terminal 30 may be configured to create the quiz screen automatically. In this case, the user can create a quiz simply by pressing the quiz creation button 313 in FIG. 26. The CPU 302 of the second transmission terminal 30 may also be configured to automatically execute at least one of the image selection (step S621), the quiz setting input (step S623), and the quiz screen editing (step S625).
In step S621 described above, the image selected with the selection frame 2811 is used for the quiz as it is, but a processed version of the selected image or of the whole image may be used for the quiz. For example, an image subjected to edge detection, or an image on which another image has been superimposed as shown in the first embodiment, can be used for the quiz. In this case, information on the range and position of the edge detection and information on the image to be superimposed are included in the quiz information and transmitted to the reception terminal 2.
An advertisement corresponding to the subject of the quiz may also be displayed on the quiz screen. The advertisement may be set manually by the user or automatically by the CPU 302. Furthermore, link information for the advertisement to be displayed may be included in the quiz information, and the reception terminal 2 may display the advertisement based on the link information.
[5. Fifth Embodiment]
In the above embodiment, an example in which the transmission information is quiz information was described as an example of displaying the transmission information and the captured image in association with each other, but advertisement information may be created instead of quiz information. In the present embodiment, an example is described in which the transmission information displayed in association with the captured image is advertisement information to be viewed by the receiving-side user. In the present embodiment, functional units and elements common to the first to fourth embodiments are given the same reference numerals, and redundant description of them is omitted.
[5-1. System configuration]
FIG. 33 is a diagram showing an example of the system configuration of an information transmission system according to the fifth embodiment. In this system configuration, as in the first embodiment, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via the network N so as to be able to communicate with each other.
[5-2. Processing overview]
An overview of the processing of the information transmission system according to the fifth embodiment is described below with reference to a functional block diagram. Inside the first transmission terminal 20, the second transmission terminal 30, and the reception terminal 2 in FIG. 33, functional block diagrams showing the functional units included in each device are drawn.
The first transmission terminal 20 includes an advertisement information creation unit 182 and an advertisement information transmission unit 162 in place of the transmission information transmission unit 16 and the transmission information acceptance unit 18 shown in FIG. 1 of the first embodiment. The reception terminal 2 includes an image composition unit 242 and an advertisement information reception unit 262 in place of the image superimposition unit 24 and the transmission information reception unit 26 shown in FIG. 1 of the first embodiment.
The advertisement information creation unit 182 of the second transmission terminal 30 receives a setting input for an advertisement relating to the captured image displayed by the captured image display unit 19 of the second transmission terminal 30, and generates advertisement information. The advertisement information transmission unit 162 of the second transmission terminal 30 transmits the advertisement information generated by the advertisement information creation unit 182 of the second transmission terminal 30 to the reception terminal 2.
The advertisement information reception unit 262 of the reception terminal 2 receives, from the second transmission terminal 30, the advertisement information that the advertisement information creation unit 182 of the second transmission terminal 30 generated upon receiving the setting input. The image composition unit 242 of the reception terminal 2 composes the advertisement information received by the advertisement information reception unit 262 of the reception terminal 2 with the captured image acquired by the receiving-side captured image acquisition unit 25 of the reception terminal 2 to generate a composite image. The image display unit 22 of the reception terminal 2 displays the composite image generated by the image composition unit 242 of the reception terminal 2, allowing the user of the reception terminal 2 to view the composite image.
The advertisement information transmission unit 162, the advertisement information creation unit 182, the image composition unit 242, and the advertisement information reception unit 262 shown in FIG. 33 each include CPU functions realized by a program.
[5-3. Processing details]
Details of the processing in the present embodiment are described with reference to FIGS. 34 to 40. FIGS. 34 and 36 each show an example of a flowchart of the processing performed when the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits advertisement information as the transmission information to the reception terminal 2 in the information transmission system of the present embodiment.
FIG. 35 is a diagram showing an example of a display screen displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 37 is a diagram showing an example of a captured image of real-time content displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 38 is a diagram showing an example of an advertisement setting screen displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 39 is a diagram showing an example of an advertisement screen displayed on the touch panel display 301 of the second transmission terminal 30.
In the flowchart shown in FIG. 34, the processes other than steps S511b, S514b, S516b, S517b, S534b, S535b, and S536b are the same as the processes shown in FIG. 5. The following description focuses on the differences between FIG. 34 and FIG. 5.
[5-3-1. Creation of advertisement information at the second transmission terminal]
In the flowchart of FIG. 34, the CPU 302 of the second transmission terminal 30 determines whether the advertisement creation button has been pressed by the user (step S511b). For example, when the CPU 302 detects that the advertisement creation button 314 (FIG. 35) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that the advertisement creation button has been pressed.
When it determines that the advertisement creation button 314 has been pressed (step S511b, Yes), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20, as in the first embodiment (step S512).
After transmitting the capture instruction data to the first transmission terminal 20, the CPU 302 of the second transmission terminal 30 executes advertisement information creation processing (step S514b). FIG. 36 is a diagram showing an example of a flowchart of the advertisement information creation processing. In the flowchart of FIG. 36, the CPU 302 of the second transmission terminal 30 displays an image selection screen on the touch panel display 301 (step S631).
FIG. 37 is a diagram showing an example of the image selection screen displayed on the touch panel display 301 of the second transmission terminal 30. As in the fourth embodiment, the image selection screen shown in FIG. 37 displays a partial selection button 2801 for selecting part of the image and a whole selection button 2802 for selecting the entire image. When the partial selection button 2801 is pressed, a selection frame 2811 for selecting part of the image is displayed on the image, as in the fourth embodiment.
The CPU 302 of the second transmission terminal 30 determines whether completion of selection has been instructed on the image selection screen (step S632). For example, when the CPU 302 detects that the "Next" button 2813 (FIG. 37) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that completion of selection has been instructed on the image selection screen (step S632, Yes).
In step S633 of FIG. 36, the CPU 302 of the second transmission terminal 30 displays an advertisement setting screen on the touch panel display 301. FIG. 38 is a diagram showing an example of the advertisement setting screen displayed on the touch panel display 301 of the second transmission terminal 30. The advertisement setting screen shown in FIG. 38 displays, for the advertisement to be created, a setting field 3701 for "product selection", a setting field 3702 for "purchase source information", and a setting field 3703 for "word-of-mouth information".
For example, in the setting field 3701 for "product selection", a selectable list of works released by the person shown in the captured image of FIG. 37 is displayed. The person shown in the captured image can be identified, for example, from metadata associated with the position of the captured image within the content. For example, when the person shown in the captured image is a singer, music CDs released by that singer are displayed in the product selection field. Information on the music CDs released by the singer can be, for example, extracted from prestored data or acquired from another device via the Internet. The setting items for "product selection" are not limited to music CDs.
For example, in the setting field 3702 for "purchase source information", selectable information on where the product displayed in the product selection field can be purchased is displayed. The setting items for "purchase source information" are not limited to "URL1", "URL2", and "URL3". The setting items for "purchase source information" can be, for example, extracted from prestored data or acquired from another device via the Internet.
For example, in the setting field 3703 for "word-of-mouth", so-called word-of-mouth information about the person (the singer) shown in the captured image of FIG. 37 can be entered. The word-of-mouth information can be extracted from prestored data or acquired from another device via the Internet.
The CPU 302 of the second transmission terminal 30 determines whether completion of the settings has been instructed on the advertisement setting screen (step S634). For example, when the CPU 302 detects that the "Create advertisement" button 3712 (FIG. 38) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that completion of the settings has been instructed on the advertisement setting screen (step S634, Yes). When the "Back" button 3711 is pressed, the CPU 302 returns to step S631 and displays the previous screen, that is, the image selection screen (FIG. 37).
In step S635 of FIG. 36, the CPU 302 of the second transmission terminal 30 generates an advertisement screen based on the various setting data (product selection, purchase source information, and word-of-mouth information) set on the advertisement setting screen (FIG. 38) described above, and displays the generated advertisement screen on the touch panel display 301. The advertisement screen can be generated using an advertisement generation program (not shown) that automatically generates an advertisement screen from the various setting data (product selection, purchase source information, and word-of-mouth information).
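The assembly of an advertisement screen from the three setting fields can be pictured with the following sketch. The HTML fragment, the helper name, and the example URL are assumptions for illustration and are not the advertisement generation program referred to above.

```python
# Sketch of assembling an advertisement screen of the kind shown in FIG. 39
# from the setting data (product selection, purchase source information,
# word-of-mouth information), producing a small HTML fragment (cf. "Adv001.html")
# without embedding the captured image itself.

from html import escape


def build_ad_html(product: str, purchase_url: str, review: str) -> str:
    return (
        "<div class='ad'>"
        f"<p class='product'>{escape(product)}</p>"
        f"<p class='review'>{escape(review)}</p>"
        f"<a href='{escape(purchase_url)}'>Buy here</a>"
        "</div>"
    )


if __name__ == "__main__":
    # The URL is a stand-in for the "URL1" entry of the purchase source field.
    print(build_ad_html("Best 1 (A-ko)", "https://example.com/url1", "Amazing singing ability!"))
```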
FIG. 39 is a diagram showing an example of the advertisement screen displayed on the touch panel display 301 of the second transmission terminal 30. The advertisement screen shown in FIG. 39 shows, for example, an advertisement created when the product information is "Best 1 (A-ko)" 3801, the purchase source information is "URL1" 3802, and the word-of-mouth information is "Amazing singing ability! ..." 3803.
As in the fourth embodiment, the user can edit the content of the advertisement displayed on the advertisement screen by pressing the "Edit" button 3812 on the advertisement screen. For example, when the edit button 3812 is pressed, the advertisement screen shifts to an edit mode in which the text can be corrected and the position, size, range, and so on of images or figures can be adjusted as appropriate (step S636, Yes). When the "Back" button 3811 is pressed, the CPU 302 returns to step S633 and displays the previous screen, that is, the advertisement setting screen (FIG. 38).
In the edit mode, it is desirable that the product information cannot be edited. For example, in FIG. 38, it is desirable that the product information "Best 1 (A-ko)" cannot be edited.
The user can complete the advertisement screen by pressing the "Complete" button 3813 (step S636, No). Here, the data for constructing the advertisement screen is held as advertisement information in, for example, the flash memory 305.
When it determines that the advertisement screen has been completed upon termination of the advertisement information creation processing in step S514b, the CPU 302 associates the scene identification information of the image captured in step S502 with the advertisement information data and records them in, for example, the transmission information data 305c of the flash memory 305 (step S516b).
An example of the transmission information data 305c is basically the same as FIG. 31. In the present embodiment, however, the actual data "(Adv001.html)" of the advertisement information 114b, for example, is recorded instead of the actual data "(Quiz001.html)" of the quiz information 114a.
Subsequently, the CPU 302 of the second transmission terminal 30 transmits the advertisement information associated with the scene identification information in step S516b to the reception terminal 2 (step S517b). For example, the CPU 302 transmits the data of record 1100 of the transmission information data 305c shown in FIG. 31 to the reception terminal 2. The data of the captured image with which the advertisement information is associated is therefore not transmitted to the reception terminal 2.
 [5-3-2. Display of the advertisement image on the reception terminal 2]
 In step S534b of FIG. 34, the CPU 402 of the reception terminal 2 receives the advertisement information from the second transmission terminal 30 and records it. For example, the CPU 402 receives the data of record 1100 of the transmission information data 305c shown in FIG. 31 from the second transmission terminal and records it in the transmission information data 405c of the externally connected hard disk drive 411.
 Subsequently, the CPU 402 of the reception terminal 2 generates an advertisement image by combining the captured image recorded in the captured image data 405b in step S533 with the advertisement information received in step S534b (step S535b).
 The CPU 402 of the reception terminal 2 displays the generated advertisement image (step S536b). As shown in FIG. 40, the CPU 402 superimposes the advertisement image 122 on, for example, the content screen 130 being viewed by the user of the reception terminal 2.
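 A minimal sketch of steps S535b and S536b, assuming Pillow is available, is shown below. The frame and screen are synthetic placeholders and the text position is arbitrary; a real terminal would use the frame it captured from its own buffer for the received scene identification information.

```python
# Compose the received advertisement text onto the locally captured frame and
# superimpose the result on the content screen (like advertisement image 122 in FIG. 40).
from PIL import Image, ImageDraw

captured_frame = Image.new("RGB", (320, 180), "steelblue")   # stand-in for the captured scene
content_screen = Image.new("RGB", (1280, 720), "black")      # stand-in for the screen being viewed

ad_image = captured_frame.copy()
draw = ImageDraw.Draw(ad_image)
draw.rectangle([0, 140, 320, 180], fill="white")
draw.text((8, 146), "Best 1 (A-ko)  Her singing is amazing!  URL1", fill="black")

content_screen.paste(ad_image, (940, 20))   # overlay in a corner of the viewed content
content_screen.save("overlay_preview.png")
```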
 When the user of the reception terminal 2 actually purchases a product via the link information (for example, "URL1" 3802) displayed in the advertisement image 122, an affiliate scheme in which a reward is paid to the user of the transmission terminal 1 who created the advertisement image 122 may be introduced. This gives the user of the transmission terminal a strong incentive to create advertisements.
 [5-4. Modifications]
 As a modification of the image selection shown in step S631 (FIG. 36), instead of an image, the audio being reproduced together with the image, characters recognizable in the image, or the like may be made selectable. In that case, an advertisement related to the audio, the characters, or the like is created.
 In the advertisement information creation process of step S514b (FIG. 34), an example was described in which the user creates the advertisement screen by selecting an image (step S631), entering advertisement settings (step S633), and modifying the advertisement screen (step S635); however, the CPU 302 of the second transmission terminal 30 may create the advertisement screen automatically. In that case, the user can create an advertisement simply by pressing the "Create Advertisement" button 314 in FIG. 35. Alternatively, the CPU 302 of the second transmission terminal 30 may automatically execute at least one of the image selection (step S631), the advertisement setting input (step S633), and the advertisement screen modification (step S635).
 [6. Sixth Embodiment]
 In the above embodiments, examples in which the transmission information is quiz information or advertisement information were described as examples of displaying transmission information in association with a captured image; however, a game may be created instead of quiz information or advertisement information. In the present embodiment, an example is described in which the transmission information displayed in association with the captured image is game information for causing the receiving-side user to play a game. In the present embodiment, functional units or elements common to the first to fifth embodiments are denoted by the same reference numerals, and duplicate descriptions thereof are omitted.
 [6-1. System configuration]
 FIG. 41 is a diagram illustrating an example of the system configuration of the information transmission system according to the sixth embodiment. In this system configuration, as in the first embodiment, the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) and the reception terminal 2 are connected via a network N so as to be able to communicate with each other.
 [6-2. Processing overview]
 An overview of the processing of the information transmission system according to the sixth embodiment is given below with reference to functional block diagrams. In FIG. 41, functional block diagrams showing the functional units included in each device are depicted inside the first transmission terminal 20, the second transmission terminal 30, and the reception terminal 2.
 The first transmission terminal 20 includes a game information creation unit 183 and a game information transmission unit 163 in place of the transmission information transmission unit 16 and the transmission information acceptance unit 18 shown in FIG. 1 of the first embodiment. The reception terminal 2 includes an image composition unit 243 and a game information reception unit 263 in place of the image superimposing unit 24 and the transmission information reception unit 26 shown in FIG. 1 of the first embodiment.
 The game information creation unit 183 in the second transmission terminal 30 accepts input of game settings related to the captured image displayed by the captured image display unit 19 in the second transmission terminal 30 and generates game information. The game information transmission unit 163 in the second transmission terminal 30 transmits the game information generated by the game information creation unit 183 in the second transmission terminal 30 to the reception terminal 2.
 The game information reception unit 263 in the reception terminal 2 receives, from the second transmission terminal 30, the game information that the game information creation unit 183 in the second transmission terminal 30 generated upon accepting the setting input. The image composition unit 243 in the reception terminal 2 combines the game information received by the game information reception unit 263 in the reception terminal 2 with the captured image acquired by the receiving-side captured image acquisition unit 25 in the reception terminal 2 to generate a composite image. The image display unit 22 in the reception terminal 2 displays the composite image generated by the image composition unit 243 in the reception terminal 2 so that the user of the reception terminal 2 can view it.
 The game information transmission unit 163, the game information creation unit 183, the image composition unit 243, and the game information reception unit 263 shown in FIG. 41 each include a function of the CPU realized by a program.
 [6-3. Processing details]
 Details of the processing in this embodiment are described with reference to FIGS. 42 to 48. FIGS. 42 and 44 each show an example of a flowchart of the processing in which the transmission terminal 1 (the first transmission terminal 20 and the second transmission terminal 30) transmits game information as transmission information to the reception terminal 2 in the information transmission system of this embodiment.
 FIG. 43 is a diagram illustrating an example in which a display screen is displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 45 is a diagram illustrating an example in which a captured image of real-time content is displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 46 is a diagram illustrating an example in which a game selection screen is displayed on the touch panel display 301 of the second transmission terminal 30. FIG. 47 is a diagram illustrating an example in which a game screen is displayed on the touch panel display 301 of the second transmission terminal 30.
 In the flowchart shown in FIG. 42, the processes other than steps S511c, S514c, S516c, S517c, S534c, S535c, and S536c are the same as the processes shown in FIG. 5. The following description focuses on the differences between FIG. 42 and FIG. 5.
 [6-3-1. Creation of game information at the second transmission terminal]
 In the flowchart of FIG. 42, the CPU 302 of the second transmission terminal 30 determines whether the game creation button has been pressed by the user (step S511c). For example, when the CPU 302 detects that the game creation button 315 (FIG. 43) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that the game creation button has been pressed.
 When determining that the game creation button 315 has been pressed (step S511c, Yes), the CPU 302 of the second transmission terminal 30 transmits capture instruction data to the first transmission terminal 20, as in the first embodiment (step S512).
 After transmitting the capture instruction data to the first transmission terminal 20, the CPU 302 of the second transmission terminal 30 executes the game information creation process (step S514c). FIG. 44 is a diagram illustrating an example of a flowchart of the game information creation process. In the flowchart of FIG. 44, the CPU 302 of the second transmission terminal 30 displays an image selection screen on the touch panel display 301 (step S641).
 FIG. 45 is a diagram illustrating an example in which an image selection screen is displayed on the touch panel display 301 of the second transmission terminal 30. As in the fourth embodiment, the image selection screen shown in FIG. 45 displays a partial selection button 2801 for selecting part of the image and a whole selection button 2802 for selecting the entire image. When the partial selection button 2801 is pressed, a selection frame 2811 for selecting part of the image is displayed on the image, as in the fourth embodiment.
 The CPU 302 of the second transmission terminal 30 determines whether completion of the selection has been instructed on the image selection screen (step S642). For example, when the CPU 302 detects that the "Next" button 2813 (FIG. 45) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that completion of the selection has been instructed on the image selection screen (step S642, Yes).
 In step S643 of FIG. 44, the CPU 302 of the second transmission terminal 30 displays a game selection screen on the touch panel display 301. FIG. 46 is a diagram illustrating an example in which the game selection screen is displayed on the touch panel display 301 of the second transmission terminal 30. The game selection screen shown in FIG. 46 is configured so that the type of game to be created can be selected. For example, the game selection screen allows selection of games such as "15 Puzzle" 4601, "Spot the Difference" 4602, and "Concentration" 4603.
 The CPU 302 of the second transmission terminal 30 determines whether completion of the selection has been instructed on the game selection screen (step S644). For example, when the CPU 302 detects that the "Create Game" button 4612 (FIG. 46) displayed on the touch panel display 301 of the second transmission terminal 30 has been pressed with the user's fingertip, a stylus, or the like, it determines that completion of the selection has been instructed on the game selection screen (step S644, Yes). When the "Back" button 4611 is pressed, the CPU 302 returns to step S641 and displays the image selection screen (FIG. 45), which is the previous screen.
 In step S645 of FIG. 44, the CPU 302 of the second transmission terminal 30 generates a game screen based on the game type ("15 Puzzle", "Spot the Difference", or "Concentration") selected on the game selection screen (FIG. 46) described above, and displays the generated game screen on the touch panel display 301. The game screen can be generated using a game generation program (not shown) that automatically generates a game screen when a game type ("15 Puzzle", "Spot the Difference", or "Concentration") is input.
 FIG. 47 is a diagram illustrating an example in which a game screen is displayed on the touch panel display 301 of the second transmission terminal 30. The game screen shown in FIG. 47 shows, for example, a 15-puzzle game that displays an image 4701 in which the captured image is divided equally into 16 blocks and the lower-right block is removed, and in which the player slides the blocks to restore the original image.
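 A minimal sketch of this tile generation, assuming Pillow, is shown below. The image is a synthetic placeholder standing in for the captured frame, and the shuffling shown here is only illustrative.

```python
# Split a captured image into 16 equal blocks, drop the lower-right block,
# and scramble the rest for a 15-puzzle board as described for FIG. 47.
import random
from PIL import Image

captured = Image.new("RGB", (400, 400), "darkorange")   # placeholder for the captured image
cols = rows = 4
tile_w, tile_h = captured.width // cols, captured.height // rows

tiles = []
for r in range(rows):
    for c in range(cols):
        box = (c * tile_w, r * tile_h, (c + 1) * tile_w, (r + 1) * tile_h)
        tiles.append(captured.crop(box))

tiles.pop()             # remove the lower-right block (the empty slot)
random.shuffle(tiles)   # note: a production version would scramble via legal
                        # slide moves so the puzzle always stays solvable
print(f"{len(tiles)} tiles of {tile_w}x{tile_h} px ready for the puzzle board")
```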
 As in the fourth embodiment, the user can modify the content of the game displayed on the game screen by pressing the "Modify" button 4712 on the game screen. For example, when the modify button 4712 is pressed, the game screen shifts to a correction mode, in which text corrections and adjustments to the position, size, or range of images and figures can be made as appropriate (step S646, Yes). When the "Back" button 4711 is pressed, the CPU 302 returns to step S643 and displays the game selection screen (FIG. 46), which is the previous screen.
 The user can complete the game screen by pressing the "Done" button 4713 (step S646, No). The data for constructing the game screen is held as game information, for example in the flash memory 305.
 When determining that the game screen has been completed upon the end of the game information creation process of step S514c, the CPU 302 associates the scene identification information of the image captured in step S502 with the game information data and records them, for example, in the transmission information data 305c of the flash memory 305 (step S516c).
 An example of the transmission information data 305c is basically the same as FIG. 31. In this embodiment, however, the actual data of the game information 114c, for example "(Game001.html)", is recorded instead of the actual data "(Quiz001.html)" of the quiz information 114a. The data format of the actual data of the game information 114c may be a format other than HTML (for example, CSS (Cascading Style Sheets) or Flash).
 Subsequently, the CPU 302 of the second transmission terminal 30 transmits the game information associated with the scene identification information in step S516c to the reception terminal 2 (step S517c). For example, the CPU 302 transmits the data of record 1100 of the transmission information data 305c shown in FIG. 31 to the reception terminal 2. The data of the captured image with which the game information is associated is therefore not itself transmitted to the reception terminal 2. Specifically, the transmitted game information includes information such as how the target captured image is to be used to generate the game image.
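 The following is a minimal sketch of what such a game information payload might carry so that the reception terminal can rebuild the game from its own locally captured image. All field names are hypothetical; the patent only requires that the game information describe how the target captured image is used.

```python
# Hypothetical game information payload sent in place of the captured image.
import json

game_info = {
    "scene_id": "CH081-20121104T2015-0001",   # tells the receiver which frame to capture locally
    "game_type": "15-puzzle",                 # or "spot-the-difference", "concentration"
    "grid": {"rows": 4, "cols": 4},
    "empty_cell": [3, 3],                     # the lower-right block is removed
    "actual_data": "Game001.html",            # reference to the playable markup
}

wire = json.dumps(game_info)
print(wire)   # only this description is transmitted; the captured image itself is not
```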
 [6-3-2. Display of the game on the reception terminal 2]
 In step S534c of FIG. 42, the CPU 402 of the reception terminal 2 receives the game information from the second transmission terminal 30 and records it. For example, the CPU 402 receives the data of record 1100 of the transmission information data 305c shown in FIG. 31 from the second transmission terminal and records it in the transmission information data 405c of the externally connected hard disk drive 411.
 Subsequently, the CPU 402 of the reception terminal 2 generates a game image by combining the captured image recorded in the captured image data 405b in step S533 with the game information received in step S534c (step S535c).
 The CPU 402 of the reception terminal 2 displays the generated game image (step S536c). As shown in FIG. 48, the CPU 402 superimposes the game image 123 on, for example, the content screen 130 being viewed by the user of the reception terminal 2.
 [6-4. Modifications]
 The example of the "15 Puzzle" game has been described above, but the present embodiment is also applicable to other games that use a captured image. For example, in the case of the "Spot the Difference" game, another image in which part of the original image has been altered can be generated and the user asked to point out the differing portion. In the case of the "Concentration" game, a character in the captured image can be adopted as a card design, and the game can be one in which cards with the same design are turned over and matched.
 In the game information creation process of step S514c (FIG. 44), an example was described in which the user creates the game screen by selecting an image (step S641), selecting a game (step S643), and modifying the game screen (step S645); however, the CPU 302 of the second transmission terminal 30 may create the game screen automatically. In that case, the user can create a game simply by pressing the "Create Game" button 315 in FIG. 43. Alternatively, the CPU 302 of the second transmission terminal 30 may automatically execute at least one of the image selection (step S641), the game selection (step S643), and the game screen modification (step S645).
 [7. Other Embodiments]
 In each of the above embodiments, when the reception terminal 2 is a device having a plurality of tuners (a multi-tuner device), each process can be executed in the background regardless of the program the user is watching. When scene identification information is received from the transmission terminal 1, only the receiving-side captured image acquisition unit 25 may be run, and the image superimposing unit 24 or the image composition unit 241 may be run once the user views the target content. Furthermore, the transmission information reception unit 26, the quiz information reception unit 261, the advertisement information reception unit 262, and the game information reception unit 263 may acquire the transmission information, quiz information, advertisement information, or game information from the transmission terminal only when scene identification information has been received from the transmission terminal 1. This enables efficient processing.
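 A minimal sketch of this efficiency idea is shown below. The helper functions are placeholders introduced only for illustration, not units or APIs defined by the patent.

```python
# Activate capture and transmission-information retrieval only when a scene ID arrives,
# and compose/display only while the user is watching the target content.
def capture_frame_from_buffer(scene_id):      # placeholder: would read the receiving-side buffer
    return f"<frame for {scene_id}>"

def fetch_transmission_info(scene_id):        # placeholder: would pull the matching record
    return {"scene_id": scene_id, "payload_type": "quiz"}

def on_scene_id_received(scene_id, viewing_content_id, target_content_id):
    captured = capture_frame_from_buffer(scene_id)     # run in the background on another tuner
    info = fetch_transmission_info(scene_id)           # fetched only once a scene ID exists
    if viewing_content_id == target_content_id:
        print("compose and display now:", captured, info)
    else:
        print("store until the user tunes to the target content")

on_scene_id_received("CH081-0001", viewing_content_id="CH081", target_content_id="CH081")
```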
 In each of the above embodiments, an information transmission system using television receivers has been described; however, at least one of the transmission terminal 1, the first transmission terminal 20, the second transmission terminal 30, and the reception terminal 2 may be a recorder device, a tablet terminal, or a smartphone.
 The second embodiment and the fourth embodiment may be combined so that quiz information is posted to an SNS server. This allows quiz information to be shared among a plurality of users, and each user can display the quiz using his or her own copyrighted content. Instead of an SNS server, the transmission information (for example, quiz information) may be recorded on a home server. In that case, each user can display the transmission information (for example, a quiz) using content shared within the family.
 In the above embodiments, examples were described in which various kinds of transmission information are created at the transmission terminal 1; however, transmission information prepared in advance according to the content may also be used. For example, when the transmission terminal 1 is a broadcaster's server, transmission information corresponding to the content can be transmitted as long as the content being distributed to the reception terminal 2 can be obtained. In this case, the scene identification information recorded in the transmission information may be configured to be transmitted automatically to the reception terminal 2. This allows the broadcaster to deliver information such as quizzes, advertisements, and games to the user of the reception terminal in step with the reproduction of the content, enhancing the entertainment value of content distribution.
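 A minimal sketch of this broadcaster-side variant is shown below. The schedule format and the send callback are assumptions for illustration only: transmission information prepared in advance per content is pushed automatically as playback reaches each scene.

```python
# Push pre-created transmission information (with its scene IDs) keyed to playback position.
prepared_schedule = [
    {"at_seconds": 620,  "scene_id": "SCENE-0001", "payload_ref": "Quiz001.html"},
    {"at_seconds": 1310, "scene_id": "SCENE-0002", "payload_ref": "Adv001.html"},
]

def on_playback_tick(position_seconds, send):
    """Call periodically with the current playback position of the delivered content."""
    for entry in prepared_schedule:
        if entry["at_seconds"] == position_seconds:
            send(entry)   # pushes the scene ID and transmission info to the reception terminal

on_playback_tick(620, send=print)
```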
 In the above embodiments, the functional blocks shown in FIG. 1, FIGS. 14 to 17, FIGS. 18A to 18E, FIG. 19, and FIG. 24 are realized by processing of a CPU executing software. However, some or all of them may be realized by hardware such as logic circuits. Part of the processing of the program may also be executed by the OS.
 Although embodiments of the present invention have been described above, the present invention is not limited to the embodiments described above. The present invention can be modified in various ways within the disclosed scope, and embodiments obtained by appropriately combining the technical means or embodiments disclosed herein are also included in the technical scope of the present invention. The effects described in the embodiments of the present invention merely list the most preferable effects resulting from the present invention, and the effects of the present invention are not limited to those described in the embodiments.
 The present application is useful for an information transmission system that conveys information input at a transmission terminal to a reception terminal.
1 transmission terminal
2 reception terminal
3 broadcast station
11 real-time content reception unit
12 image display unit
13 transmission-side buffer unit
14 scene identification information transmission unit
15 transmission-side captured image acquisition unit
16 transmission information transmission unit
17 capture instruction acceptance unit
18 transmission information acceptance unit
19 captured image display unit
20 first transmission terminal
21 real-time content reception unit
22 image display unit
23 reception-side buffer unit
24 image superimposing unit
25 reception-side captured image acquisition unit
26 transmission information reception unit
27 scene identification information reception unit
30 second transmission terminal
N network

Claims (22)

  1. A transmission terminal comprising:
     a reproduction unit that reproduces moving image content;
     a display unit that displays the moving image content reproduced by the reproduction unit;
     a memory that captures an image showing one of the scenes of the displayed moving image content;
     an acceptance unit that accepts input of transmission information related to the captured image;
     a processor that generates or acquires scene specifying information, which is information specifying the image captured in the memory; and
     a transmission unit that transmits the transmission information and the scene specifying information specifying the image related to the transmission information to a reception terminal,
     wherein the reception terminal obtains the moving image content reproduced by the reproduction unit from a source different from the transmission terminal, captures an image showing the scene specified by the scene specifying information received from the transmission unit out of the obtained moving image content, and displays the transmission information received from the transmission unit together with the captured image.
  2. A reception terminal comprising:
     a reproduction unit that reproduces moving image content;
     a display unit that displays the moving image content reproduced by the reproduction unit;
     a reception unit that receives, from a transmission terminal, transmission information and scene specifying information specifying an image related to the transmission information; and
     a memory that captures, out of the displayed moving image content, an image showing the scene specified by the scene specifying information received from the transmission terminal,
     wherein the display unit displays the transmission information received from the transmission terminal together with the captured image, and
     the transmission terminal obtains the moving image content reproduced by the reproduction unit from a source different from the reception terminal, captures an image showing one of the scenes of the obtained moving image content, accepts input of transmission information related to the captured image, generates or acquires scene specifying information, which is information specifying the captured image, and transmits the transmission information and the scene specifying information specifying the image related to the transmission information to the reception terminal.
  3. The transmission terminal according to claim 1, wherein the transmission unit of the transmission terminal transmits the transmission information and the scene specifying information to the reception terminal via a management server, and the reception terminal receives the transmission information and the scene specifying information from the transmission unit via the management server.
  4. The reception terminal according to claim 2, wherein the transmission terminal transmits the transmission information and the scene specifying information to the reception terminal via a management server, and the reception unit of the reception terminal receives the transmission information and the scene specifying information from the transmission terminal via the management server.
  5. The transmission terminal according to claim 1 or 3, further comprising a buffer unit that temporarily stores part or all of the moving image content, wherein the memory captures a scene in the moving image content stored in the buffer unit.
  6. The reception terminal according to claim 2 or 4, further comprising a buffer unit that temporarily stores part or all of the moving image content, wherein the memory captures a scene in the moving image content stored in the buffer unit.
  7. The transmission terminal according to any one of claims 1, 3, and 5, wherein the transmission terminal or the reception terminal further comprises a content reception unit that receives the moving image content.
  8. The reception terminal according to any one of claims 2, 4, and 6, wherein the transmission terminal or the reception terminal further comprises a content reception unit that receives the moving image content.
  9. The transmission terminal according to any one of claims 1, 3, 5, and 7, wherein the transmission terminal includes a first transmission terminal and a second transmission terminal, the first transmission terminal comprises the reproduction unit and the display unit, and the second transmission terminal comprises the memory and the acceptance unit.
  10. The reception terminal according to any one of claims 2, 4, 6, and 8, wherein the reception terminal includes a first reception terminal and a second reception terminal, the first reception terminal comprises the reproduction unit and the reception unit, and the second reception terminal comprises the display unit and the memory.
  11. The transmission terminal according to any one of claims 1, 3, 5, 7, and 9, wherein the scene identification information includes an identification number issued by the transmission terminal.
  12. The reception terminal according to any one of claims 2, 4, 6, 8, and 10, wherein the scene identification information includes an identification number issued by the transmission terminal.
  13. The transmission terminal according to any one of claims 1, 3, 5, 7, 9, and 11, wherein the transmission information is quiz information for presenting a quiz at the reception terminal.
  14. The reception terminal according to any one of claims 2, 4, 6, 8, 10, and 12, wherein the transmission information is quiz information for presenting a quiz at the reception terminal.
  15. The transmission terminal according to any one of claims 1, 3, 5, 7, 9, and 11, wherein the transmission information is advertisement information for having an advertisement viewed at the reception terminal.
  16. The reception terminal according to any one of claims 2, 4, 6, 8, 10, and 12, wherein the transmission information is advertisement information for having an advertisement viewed at the reception terminal.
  17. The transmission terminal according to any one of claims 1, 3, 5, 7, 9, and 11, wherein the transmission information is game information for executing a game at the reception terminal.
  18. The reception terminal according to any one of claims 2, 4, 6, 8, 10, and 12, wherein the transmission information is game information for executing a game at the reception terminal.
  19. A program for realizing a transmission terminal by using a computer, the program causing the computer to execute:
     a reproduction process of reproducing moving image content;
     a display process of displaying the moving image content reproduced in the reproduction process;
     a process of capturing, in a memory, an image showing one of the scenes of the displayed moving image content;
     an acceptance process of accepting input of transmission information related to the captured image;
     a process of generating or acquiring scene specifying information, which is information specifying the image captured in the memory; and
     a transmission process of transmitting the transmission information and the scene specifying information specifying the image related to the transmission information to a reception terminal,
     wherein the reception terminal obtains the moving image content reproduced at the transmission terminal from a source different from the transmission terminal, captures an image showing the scene specified by the scene specifying information received from the transmission terminal out of the obtained moving image content, and displays the transmission information received from the transmission terminal together with the captured image.
  20. A program for realizing a reception terminal by using a computer, the program causing the computer to execute:
     a reproduction process of reproducing moving image content;
     a display process of displaying the moving image content reproduced in the reproduction process;
     a reception process of receiving, from a transmission terminal, transmission information and scene specifying information specifying an image related to the transmission information; and
     a process of capturing, in a memory, an image showing the scene specified by the scene specifying information received from the transmission terminal out of the displayed moving image content,
     wherein the display process displays the transmission information received from the transmission terminal together with the captured image, and
     the transmission terminal obtains the moving image content reproduced at the reception terminal from a source different from the reception terminal, captures an image showing one of the scenes of the obtained moving image content, accepts input of transmission information related to the captured image, generates or acquires scene specifying information, which is information specifying the captured image, and transmits the transmission information and the scene specifying information specifying the image related to the transmission information to the reception terminal.
  21. An information transmission method using a transmission terminal and a reception terminal, wherein
     the transmission terminal executes:
     a reproduction step of reproducing moving image content;
     a display step of displaying the moving image content reproduced in the reproduction step;
     a step of capturing, in a memory, an image showing one of the scenes of the displayed moving image content;
     an acceptance step of accepting input of transmission information related to the captured image;
     a step of generating or acquiring scene specifying information, which is information specifying the image captured in the memory; and
     a transmission step of transmitting the transmission information and the scene specifying information specifying the image related to the transmission information to the reception terminal, and
     the reception terminal obtains the moving image content reproduced at the transmission terminal from a source different from the transmission terminal, captures an image showing the scene specified by the scene specifying information received from the transmission terminal out of the obtained moving image content, and displays the transmission information received from the transmission terminal together with the captured image.
  22. An information transmission method using a transmission terminal and a reception terminal, wherein
     the reception terminal executes:
     a reproduction step of reproducing moving image content;
     a display step of displaying the moving image content reproduced in the reproduction step;
     a reception step of receiving, from the transmission terminal, transmission information and scene specifying information specifying an image related to the transmission information; and
     a step of capturing, in a memory, an image showing the scene specified by the scene specifying information received from the transmission terminal out of the displayed moving image content,
     the display step displays the transmission information received from the transmission terminal together with the captured image, and
     the transmission terminal obtains the moving image content reproduced at the reception terminal from a source different from the reception terminal, captures an image showing one of the scenes of the obtained moving image content, accepts input of transmission information related to the captured image, generates or acquires scene specifying information, which is information specifying the captured image, and transmits the transmission information and the scene specifying information specifying the image related to the transmission information to the reception terminal.
PCT/JP2012/005492 2011-11-04 2012-08-30 Transmission terminal, reception terminal, and method for sending information WO2013065221A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-242484 2011-11-04
JP2011242484 2011-11-04

Publications (1)

Publication Number Publication Date
WO2013065221A1 true WO2013065221A1 (en) 2013-05-10

Family

ID=48191606

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/005492 WO2013065221A1 (en) 2011-11-04 2012-08-30 Transmission terminal, reception terminal, and method for sending information

Country Status (2)

Country Link
JP (1) JPWO2013065221A1 (en)
WO (1) WO2013065221A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015025891A1 (en) * 2013-08-21 2015-02-26 シャープ株式会社 Moving image display control device, display control method, television receiver, program, and recording medium
JP2015050749A (en) * 2013-09-04 2015-03-16 日本放送協会 Receiver, cooperative terminal device and program
JP2015080081A (en) * 2013-10-16 2015-04-23 シャープ株式会社 Image display device, method for controlling image display device, television receiver, program, and recording medium
JP6058845B1 (en) * 2015-03-31 2017-01-11 株式会社トライグループ Portable terminal device and program
JP2017182646A (en) * 2016-03-31 2017-10-05 大日本印刷株式会社 Information processing device, program and information processing method
JP2020137845A (en) * 2019-02-28 2020-09-03 株式会社ドワンゴ Terminal, server, and program
JP2020177689A (en) * 2016-03-31 2020-10-29 大日本印刷株式会社 Information processing apparatus, program, and information processing method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011132472A1 (en) * 2010-04-22 2011-10-27 シャープ株式会社 Electronic apparatus, display method, and computer readable storage medium storing display program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4577085B2 (en) * 2005-05-17 2010-11-10 ソニー株式会社 Video processing apparatus and video processing method
JP2011210051A (en) * 2010-03-30 2011-10-20 Sharp Corp Network system, communication method, and communication terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011132472A1 (en) * 2010-04-22 2011-10-27 シャープ株式会社 Electronic apparatus, display method, and computer readable storage medium storing display program

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015025891A1 (en) * 2013-08-21 2015-02-26 シャープ株式会社 Moving image display control device, display control method, television receiver, program, and recording medium
JP2015062279A (en) * 2013-08-21 2015-04-02 シャープ株式会社 Moving image display control device, display control device, television receiver, program, and recording medium
JP2015050749A (en) * 2013-09-04 2015-03-16 日本放送協会 Receiver, cooperative terminal device and program
JP2015080081A (en) * 2013-10-16 2015-04-23 シャープ株式会社 Image display device, method for controlling image display device, television receiver, program, and recording medium
WO2015056610A1 (en) * 2013-10-16 2015-04-23 シャープ株式会社 Image display device, method for controlling image display device, television receiver, program, and recording medium
JP6058845B1 (en) * 2015-03-31 2017-01-11 株式会社トライグループ Portable terminal device and program
JP2017182646A (en) * 2016-03-31 2017-10-05 大日本印刷株式会社 Information processing device, program and information processing method
JP2020177689A (en) * 2016-03-31 2020-10-29 大日本印刷株式会社 Information processing apparatus, program, and information processing method
JP2020137845A (en) * 2019-02-28 2020-09-03 株式会社ドワンゴ Terminal, server, and program

Also Published As

Publication number Publication date
JPWO2013065221A1 (en) 2015-04-02

Similar Documents

Publication Publication Date Title
WO2013065221A1 (en) Transmission terminal, reception terminal, and method for sending information
JP4346688B2 (en) Audio visual system, headend and receiver unit
CN102859486B (en) Zoom display navigates
US20110310100A1 (en) Three-dimensional shape user interface for media content delivery systems and methods
JP4548297B2 (en) Information processing apparatus and information processing method
TWI610180B (en) Cooperative provision of personalized user functions using shared and personal devices
US20140141877A1 (en) Methods and systems for visually distinguishing objects appearing in a media asset
ES2434259T3 (en) Social television service
JP5871564B2 (en) Bookmark management device, bookmark management system, information processing terminal, and program
JP5073535B2 (en) CONTENT REPRODUCTION DEVICE, CONTENT REPRODUCTION METHOD, CONTENT REPRODUCTION SYSTEM, CONTENT REPRODUCTION PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM
CN104270681B (en) video information playing method and device
JP6760667B2 (en) Information processing equipment, information processing methods and information processing programs
JP6031685B2 (en) Video receiving apparatus and video receiving method
JP2015037290A (en) Video control device, video display device, video control system, and method
JP2006515090A (en) A device that positions the cursor by shooting on the monitor screen
JP6006962B2 (en) Content display control apparatus and program
JP6765557B1 (en) Content distribution system, display terminal and content distribution method
CN112565892B (en) Method for identifying roles of video programs and related equipment
CN111726692B (en) Interactive playing method of audio-video data
JP6461290B1 (en) Content providing server, content providing program, content providing system, and user program
JP2007096866A (en) Display device, tabulation system, and information providing system or the like
TW201721585A (en) Image superimposition system and method thereof
JP2016096512A (en) Receiving device, information processing method, and program
JP2014064145A (en) Comment providing device, comment display device, comment providing system, comment providing method, program, and recording medium
WO2013179364A1 (en) Program-information display device, program-information output device, and program-information display method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2013512676

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12845034

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12845034

Country of ref document: EP

Kind code of ref document: A1