WO2013093968A1 - Content processing device and status read-out method - Google Patents

Content processing device and status read-out method

Info

Publication number
WO2013093968A1
Authority
WO
WIPO (PCT)
Prior art keywords
audio
video
output
content
unit
Prior art date
Application number
PCT/JP2011/007147
Other languages
French (fr)
Japanese (ja)
Inventor
Takuya Morita
Hidekazu Suzuki
Original Assignee
Panasonic Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation
Priority to PCT/JP2011/007147 priority Critical patent/WO2013093968A1/en
Publication of WO2013093968A1 publication Critical patent/WO2013093968A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43632Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wired protocol, e.g. IEEE 1394
    • H04N21/43635HDMI
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44231Monitoring of peripheral device or external card, e.g. to detect processing problems in a handheld device or the failure of an external recording device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4882Data services, e.g. news ticker for displaying messages, e.g. warnings, reminders
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • G09G2370/045Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller using multiple communication channels, e.g. parallel and serial
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/12Use of DVI or HDMI protocol in interfaces along the display data pipeline

Definitions

  • The present invention relates to a method of reading out, by voice, a state notified from an external device in a system in which a content processing apparatus on which content can be viewed and an external device are connected via a communication interface, and to a content processing apparatus having this state read-out function.
  • An object of the present invention is to provide a method of notifying a viewer of the state of an external device, by reading the state out loud without interrupting the display of the content video, when a state change occurs in the external device operating in the background while content is being viewed, and to provide a content processing apparatus that performs the method.
  • The present invention is directed to a content processing apparatus that outputs content video and audio.
  • To solve the above problem, the content processing apparatus of the present invention comprises: a communication unit that communicates with an external device via a predetermined interface; a control unit that determines whether the communication unit has received control information notifying a state output by the external device, and controls the audio output of the content processing apparatus based on that control information; and a video/audio output processing unit that, under the control of the control unit, continues to output the video of the content being viewed while outputting audio that reads out the state of the external device notified by the control information.
  • As the predetermined interface, a communication interface conforming to the HDMI standard is applicable. In that case, it is suitable to use an HDMI-CEC command as the control information; the following kinds of control can then be realized easily.
  • For example, the control unit controls the video/audio output processing unit to output audio that reads out the state of the external device in response to an instruction, given by an HDMI-CEC command, to start outputting audio data, and the video/audio output processing unit, under the control of the control unit, switches the audio output from the audio of the content being viewed to the audio that reads out the state of the external device.
  • In this control, the control unit controls the video/audio output processing unit to output the audio of the content being viewed in response to an instruction, given by an HDMI-CEC command, to end the audio data output, and the video/audio output processing unit switches the audio output from the read-out audio back to the audio of the content being viewed.
  • Alternatively, the control unit controls the video/audio output processing unit to output audio that reads out the state of the external device in response to an instruction, given by an HDMI-CEC command, to start outputting audio data, and the video/audio output processing unit, under the control of the control unit, mixes the read-out audio into the audio of the content being viewed.
  • In this control, the control unit controls the video/audio output processing unit to output the audio of the content being viewed in response to an instruction, given by an HDMI-CEC command, to end the audio data output, and the video/audio output processing unit stops mixing the read-out audio into the audio of the content being viewed.
  • Alternatively, if the content processing apparatus includes a storage unit that stores one or more audio files in which indexes are uniquely associated with audio representing states of the external device, the control unit acquires from the storage unit the audio specified by an index indicated by an HDMI-CEC command and controls the video/audio output processing unit to output the acquired audio, and the video/audio output processing unit, under the control of the control unit, mixes the acquired audio into the audio of the content being viewed.
  • Furthermore, if the content processing apparatus includes a conversion unit that converts a character string into speech, the control unit extracts the character string representing the state of the external device stored in the HDMI-CEC command, converts it into speech using the conversion unit, and controls the video/audio output processing unit to output the converted speech, and the video/audio output processing unit, under the control of the control unit, mixes the converted speech into the audio of the content being viewed.
  • The processing realized by the above content processing apparatus can also be regarded as a state read-out method that includes a step of receiving control information from an external device via a predetermined interface, a step of determining whether the control information is information notifying a state of the external device, and, when it is, a step of reading out by voice the state of the external device notified by the control information while continuing to output the video of the content being viewed.
  • According to the present invention, when a state change occurs in an external device operating in the background while content is being viewed, the state of the external device can be read out by voice and notified to the viewer without interrupting the display of the content video.
  • FIG. 1 is a diagram showing a configuration of a content processing apparatus 100 according to the first embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of control information used in the content processing apparatus 100.
  • FIG. 3 is a flowchart for explaining the processing procedure of the state reading method by voice switching performed in the content processing apparatus 100.
  • FIG. 4 is a flowchart for explaining the processing procedure of the state reading method by speech synthesis performed in the content processing apparatus 100.
  • FIG. 5 is a diagram showing a configuration of a content processing apparatus 200 according to the second embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of control information used in the content processing apparatus 200.
  • FIG. 7 is a diagram illustrating an example of a table stored in the audio file storage unit 211.
  • FIG. 8 is a flowchart for explaining the processing procedure of the state reading method performed in the content processing apparatus 200.
  • FIG. 9 is a diagram showing a configuration of a content processing apparatus 300 according to the third embodiment of the present invention.
  • FIG. 10 is a diagram illustrating an example of control information used in the content processing apparatus 300.
  • FIG. 11 is a flowchart for explaining the processing procedure of the state reading method performed in the content processing apparatus 300.
  • FIG. 12 is a diagram showing a configuration of a content processing apparatus 400 according to another embodiment of the present invention.
  • The present invention is applied to a system composed of a content processing apparatus that can receive a content signal and perform video display and audio output, and an external device communicably connected to the content processing apparatus. When a state change occurs in the external device, which operates independently of content viewing, while content is being viewed on the content processing apparatus, the invention notifies the content viewer of the state of the external device by a voice read-out process, without interrupting the display of the content video. In this sense, the external device can be regarded as a "status notification device" and the content processing apparatus as a "status read-out device".
  • Examples of the content processing apparatus include devices having a screen, such as a television set or a personal computer.
  • Examples of the content signal include signals containing video and audio, such as a terrestrial digital signal, a BS digital signal, and an MPEG (Moving Picture Experts Group) signal.
  • Examples of the external device include players, recorders, theater amplifiers, digital cameras, camcorders, door intercoms, sensor cameras, personal computers, tablet terminals, smartphones, facsimile machines, network servers, refrigerators, washing machines, air conditioners, cleaning robots, rice cookers, dishwasher-dryers, bathroom water heaters, and other devices in which some state change may occur.
  • Examples of the state change include completion of, or errors in (insufficient capacity, exhausted battery, connection failure, and the like), recording, copying (dubbing), downloading, or uploading of content, video, still images, or data; incoming telephone or facsimile calls; mail arrival; information updates; automatic setting changes; sensor detection; button-press detection; filter clogging; the start or end of ice making, washing, drying, cleaning, rice cooking, temperature control, or filling the bath with hot water; and device failures.
  • In the following embodiments, HDMI communication conforming to the HDMI (High-Definition Multimedia Interface) standard is applied to the communication interface connecting the content processing apparatus and the external apparatus.
  • FIG. 1 is a diagram showing a configuration of a content processing apparatus 100 according to the first embodiment of the present invention.
  • The content processing apparatus 100 according to the first embodiment illustrated in FIG. 1 includes a tuner unit 101, a video input unit 102, an audio input unit 103, an HDMI communication unit 104, a video input unit 105, an audio input unit 106, an audio output control unit 107, a video/audio output processing unit 108, a video output unit 109, and an audio output unit 110.
  • the content processing apparatus 100 is connected to the external apparatus 500 via the HDMI cable 501.
  • The tuner unit 101 receives a content signal (television signal) broadcast terrestrially or the like.
  • the video input unit 102 extracts and inputs video data from the content signal received by the tuner unit 101.
  • the audio input unit 103 extracts and inputs audio data from the content signal received by the tuner unit 101.
  • the HDMI communication unit 104 receives the HDMI signal output from the external device 500 via the HDMI cable 501. Further, the HDMI communication unit 104 outputs predetermined control information (described later) accompanying the received HDMI signal to the audio output control unit 107.
  • the video input unit 105 extracts video data from the HDMI signal received by the HDMI communication unit 104 and inputs it.
  • the audio input unit 106 extracts and inputs audio data from the HDMI signal received by the HDMI communication unit 104.
  • the audio output control unit 107 receives control information from the HDMI communication unit 104 and controls switching of the video / audio output processing unit 108 based on the control information.
  • The video/audio output processing unit 108 receives the video data of the content signal output from the video input unit 102, the audio data of the content signal output from the audio input unit 103, the video data of the HDMI signal output from the video input unit 105, and the audio data of the HDMI signal output from the audio input unit 106. The video/audio output processing unit 108 then outputs video data and audio data to the video output unit 109 and the audio output unit 110 in accordance with the control instructions of the audio output control unit 107.
  • the video output unit 109 displays a video on a screen (not shown) or the like according to the video data output from the video / audio output processing unit 108.
  • the audio output unit 110 outputs audio to a speaker (not shown) or the like according to the audio data output from the video / audio output processing unit 108.
  • FIG. 2 is a diagram illustrating an example of control information used in the content processing apparatus 100 according to the first embodiment.
  • FIGS. 3 and 4 are flowcharts explaining the processing procedure of the state read-out method performed by the HDMI communication unit 104, the audio output control unit 107, and the video/audio output processing unit 108 of the content processing apparatus 100 according to the first embodiment.
  • Control information: As a premise, the first embodiment uses, as the control information, commands of HDMI-CEC (Consumer Electronics Control), which is adopted in the HDMI standard used as the communication interface connecting the content processing apparatus 100 and the external apparatus 500.
  • HDMI-CEC is a device control protocol that allows vendors using the HDMI standard (for example, the manufacturers of the content processing apparatus 100 and the external apparatus 500) to define their own commands.
  • The command frame used in HDMI-CEC is composed of a 1-byte header area (header), a 1-byte operation code area (opcode), and a 14-byte vendor-defined area (vendor unique).
  • The header area stores the address of the destination device of the HDMI-CEC command, that is, the address of the content processing apparatus 100.
  • The operation code area stores information (for example, "0x89") identifying the HDMI-CEC command as a vendor command.
  • The vendor-defined area stores 1-byte information (for example, "0x0a") indicating that the HDMI-CEC command is a "notice state" command notifying the state of the external device, together with either 1-byte information (for example, "0x01"; FIG. 2(a)) indicating "start of output of audio data indicating the device status" or 1-byte information (for example, "0x02"; FIG. 2(b)) indicating "end of output of audio data indicating the device status".
  • The state read-out processing that uses the above HDMI-CEC commands can be realized either by audio switching (FIG. 3) or by audio synthesis (FIG. 4).
  • the state reading process by voice switching shown in FIG. 3 is started when the HDMI communication unit 104 receives control information, that is, an HDMI-CEC command.
  • the HDMI communication unit 104 confirms the first byte in the vendor definition area of the received HDMI-CEC command, and determines whether or not the command notifies the device status (step S301). If it is determined that the command is for notifying the apparatus state (“Yes” in step S301), the HDMI communication unit 104 outputs an HDMI-CEC command to the audio output control unit 107.
  • The audio output control unit 107 checks the second byte in the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and determines whether the external device 500 has started outputting audio data related to the status notification (step S302).
  • When it determines that the external device 500 has started outputting audio data ("Yes" in step S302), the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the HDMI signal is output to the audio output unit 110. In accordance with this instruction, the video/audio output processing unit 108 switches the connections between the input units and the output units so that the video data of the content signal is output to the video output unit 109 and the audio data of the HDMI signal is output to the audio output unit 110 (step S303).
  • Next, the audio output control unit 107 checks the second byte in the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and determines whether the external device 500 has finished outputting the audio data (step S304).
  • When it determines that the output has finished, the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the content signal is output to the audio output unit 110.
  • In accordance with this instruction, the video/audio output processing unit 108 switches the connections between the input units and the output units so that the video data of the content signal is output to the video output unit 109 and the audio data of the content signal is output to the audio output unit 110 (step S305).
  • If the HDMI-CEC command is determined in step S301 not to be a status notification ("No" in step S301), the content processing apparatus 100 executes the predetermined processing instructed by that HDMI-CEC command (step S310).
  • the state reading process by speech synthesis shown in FIG. 4 differs from the process of FIG. 3 in steps S403 and S405.
  • In step S403, the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the HDMI signal is output to the audio output unit 110 together with the audio data of the content signal. In accordance with this instruction, the video/audio output processing unit 108 outputs the video data of the content signal to the video output unit 109 and performs audio data synthesis processing so that the audio data of the HDMI signal is superimposed on the audio data of the content signal and output to the audio output unit 110.
  • In step S405, the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the content signal is output to the audio output unit 110. In accordance with this instruction, the video/audio output processing unit 108 outputs the video data of the content signal to the video output unit 109 and the audio data of the content signal to the audio output unit 110, cancelling the audio data synthesis processing.
  • As described above, in the first embodiment, the audio data received together with the state change is either switched in place of the audio data of the content signal and output, or synthesized with the audio data of the content signal and output.
  • FIG. 5 is a diagram showing a configuration of a content processing apparatus 200 according to the second embodiment of the present invention.
  • The content processing apparatus 200 according to the second embodiment illustrated in FIG. 5 includes a tuner unit 101, a video input unit 102, an audio input unit 103, an HDMI communication unit 104, a video input unit 105, an audio input unit 106, an audio output control unit 207, an audio file storage unit 211, a video/audio output processing unit 208, a video output unit 109, and an audio output unit 110.
  • the content processing apparatus 200 is connected to the external apparatus 500 via the HDMI cable 501.
  • The content processing apparatus 200 differs from the content processing apparatus 100 (FIG. 1) according to the first embodiment in the configurations of the audio output control unit 207, the audio file storage unit 211, and the video/audio output processing unit 208. The other components are the same as in the content processing apparatus 100, so they are given the same reference numerals and their description is omitted. The content processing apparatus 200 according to the second embodiment is described below with a focus on the audio output control unit 207, the audio file storage unit 211, and the video/audio output processing unit 208.
  • Control information: The HDMI-CEC command is also used as the control information in the second embodiment, but the information stored in the vendor-defined area differs from that in the first embodiment.
  • As shown in FIG. 6, in the second embodiment the vendor-defined area stores 1-byte information (for example, "0x0a") indicating that the HDMI-CEC command is a "notice state" command notifying the state of the external device, and 1-byte information (for example, "1") indicating an index corresponding to the device state.
  • The index corresponding to the device state is information obtained by representing the device state as a bit value; the device state corresponding to each index is stored as audio data in the audio file storage unit 211.
  • FIG. 7 is a diagram illustrating an example of a table stored in the audio file storage unit 211.
  • The audio file storage unit 211 stores a plurality of audio files, each pairing an index with the corresponding audio data (represented in the figure by its equivalent text).
  • The audio files may be preset in the content processing apparatus 200, may be uploaded from the external apparatus 500 when it is first connected, or may be downloaded from a server or the like via a network to which the content processing apparatus 200 is connected.
  • FIG. 8 is a flowchart explaining the processing procedure of the state read-out method performed by the HDMI communication unit 104, the audio output control unit 207, the audio file storage unit 211, and the video/audio output processing unit 208 of the content processing apparatus 200 according to the second embodiment.
  • the state reading process shown in FIG. 8 is started when the HDMI communication unit 104 receives control information, that is, an HDMI-CEC command.
  • the HDMI communication unit 104 checks the first byte in the vendor definition area of the received HDMI-CEC command, and determines whether or not the command notifies the device status (step S301). If it is determined that the command is a command for notifying the apparatus status (“Yes” in step S301), the HDMI communication unit 104 outputs an HDMI-CEC command to the audio output control unit 207.
  • the audio output control unit 207 confirms the second byte in the vendor definition area of the HDMI-CEC command acquired from the HDMI communication unit 104, and determines the stored index (step S801).
  • Next, the audio output control unit 207 searches the audio file storage unit 211 to identify the audio file whose index matches, acquires the audio data stored in the identified audio file, and outputs it to the video/audio output processing unit 208 (step S802). The audio output control unit 207 then controls the video/audio output processing unit 208 so that the output audio data is output to the audio output unit 110 together with the audio data of the content signal.
  • In accordance with this instruction, the video/audio output processing unit 208 outputs the video data of the content signal to the video output unit 109 and performs synthesis so that the audio data provided from the audio output control unit 207 is superimposed on the audio data of the content signal and output (step S803).
  • As described above, in the second embodiment, audio data corresponding to state changes that may occur in the external device 500 is stored in advance, and the audio data corresponding to a notified state change is synthesized with the audio data of the content signal and output.
  • Since the content processing apparatus 200 stores the audio data corresponding to the state changes in advance, there is no need to switch from the audio data of the content signal to audio data received from the external apparatus 500.
  • FIG. 9 is a diagram showing a configuration of a content processing apparatus 300 according to the third embodiment of the present invention.
  • The content processing apparatus 300 according to the third embodiment illustrated in FIG. 9 includes a tuner unit 101, a video input unit 102, an audio input unit 103, an HDMI communication unit 104, a video input unit 105, an audio input unit 106, an audio output control unit 307, a character string conversion unit 311, a video/audio output processing unit 308, a video output unit 109, and an audio output unit 110.
  • the content processing device 300 is connected to the external device 500 via the HDMI cable 501.
  • The content processing apparatus 300 differs from the content processing apparatus 100 (FIG. 1) according to the first embodiment in the configurations of the audio output control unit 307, the character string conversion unit 311, and the video/audio output processing unit 308. The other components are the same as in the content processing apparatus 100, so they are given the same reference numerals and their description is omitted.
  • The content processing apparatus 300 according to the third embodiment is described below with a focus on the audio output control unit 307, the character string conversion unit 311, and the video/audio output processing unit 308.
  • Control information: In the third embodiment the HDMI-CEC command is again used as the control information, but the information stored in the vendor-defined area differs from that in the first embodiment. As shown in FIG. 10, in the third embodiment the vendor-defined area stores 1-byte information indicating that the HDMI-CEC command is a "notice state" command notifying the state of the external device, 1-byte information (for example, "0x03") indicating the number of commands required to build one character string (total), 1-byte information (for example, "0x01", "0x02", "0x03") indicating the transmission order of the command (count), and 11 bytes of characters indicating the device status in a predetermined character code (for example, EUC, JIS, or ASCII), such as "Dubbing was" (FIG. 10(a)), "completed" (FIG. 10(b)), and "to DVD" (FIG. 10(c)).
  • The audio output control unit 307 extracts the characters stored in the vendor-defined areas of the HDMI-CEC commands, and reconstructs the characters, which are transmitted divided across a plurality of HDMI-CEC commands, into a single character string.
  • the character string conversion unit 311 has a function of converting one character string represented by a predetermined character code into voice data.
  • FIG. 11 is a flowchart explaining the processing procedure of the state read-out method performed by the HDMI communication unit 104, the audio output control unit 307, the character string conversion unit 311, and the video/audio output processing unit 308 of the content processing apparatus 300 according to the third embodiment.
  • the state reading process shown in FIG. 11 is started when the HDMI communication unit 104 receives control information, that is, an HDMI-CEC command.
  • the HDMI communication unit 104 checks the first byte in the vendor definition area of the received HDMI-CEC command, and determines whether or not the command notifies the device status (step S301). If it is determined that the command is a command for notifying the apparatus status (“Yes” in step S301), the HDMI communication unit 104 outputs an HDMI-CEC command to the audio output control unit 307.
  • The audio output control unit 307 checks the 4th to 14th bytes in the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and extracts the characters represented there in the predetermined character code (step S1101).
  • The audio output control unit 307 repeats this until all the characters transmitted by the HDMI-CEC commands have been extracted. It determines that all the characters have been extracted when the command number (count) stored in the third byte of the vendor-defined area matches the number of required commands (total) stored in the second byte of the vendor-defined area (step S1102). The audio output control unit 307 then reconstructs (concatenates) the extracted characters into a single character string and outputs it to the character string conversion unit 311 (step S1103).
  • the character string conversion unit 311 receives one reconstructed character string from the voice output control unit 307, converts this one character string into voice data, and returns it to the voice output control unit 307 (step S1104).
  • the audio output control unit 307 outputs the converted audio data to the video / audio output processing unit 308. Then, the audio output control unit 307 controls the video / audio output processing unit 308 so that the output audio data is output to the audio output unit 110 together with the audio data of the content signal.
  • In accordance with this instruction, the video/audio output processing unit 308 outputs the video data of the content signal to the video output unit 109 and performs synthesis so that the audio data supplied from the audio output control unit 307 is superimposed on the audio data of the content signal and output (step S1105).
  • As described above, in the third embodiment, the character string received together with the state change is converted into audio data, which is then synthesized with the audio data of the content signal and output.
  • Since the content processing apparatus 300 generates the audio data corresponding to the state change from the character strings attached to the commands, there is no need to switch from the audio data of the content signal to audio data received from the external device 500.
  • In the embodiments described above, HDMI communication is applied to the interface between the content processing apparatus and the external apparatus, and the control information is transmitted using HDMI-CEC commands.
  • However, the interface between the content processing apparatus and the external apparatus is not limited to these embodiments; the invention can be implemented with other communication interfaces as long as signals including video data, audio data, and control information can be transmitted and received.
  • For example, Ethernet (registered trademark) communication adopted in the HDMI 1.4 standard can be used to transmit the control information.
  • The content processing apparatus 400 shown in FIG. 12 includes an Ethernet (registered trademark) communication unit 412, so that Ethernet communication via a general-purpose LAN cable 502 (used, for example, for IPTV (Internet Protocol TV)) can also be used to transmit video/audio data and control information. If Ethernet communication is used, a large volume of audio data compressed by the AAC (Advanced Audio Coding) method, the MP3 (MPEG-1 Audio Layer 3) method, or the like can be transmitted at high speed.
  • Each functional block constituting the content processing apparatus in each embodiment of the present invention is realized using hardware resources such as a central processing unit (CPU), storage devices (memory (ROM, RAM, and the like), a hard disk, and the like), and input/output devices, and is typically embodied as an integrated circuit (IC; also referred to as an LSI, system LSI, super LSI, or ultra LSI). The functional blocks may each be made into an individual chip, or a single chip may include some or all of them. The method of circuit integration is not limited to ICs; implementation using dedicated circuitry or general-purpose processors is also possible.
  • An FPGA (Field Programmable Gate Array) that can be programmed after manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the IC can be reconfigured, may also be used.
  • Furthermore, if circuit integration technology that replaces the above emerges through progress in semiconductor technology or another derived technology, such as the application of biotechnology, the functional blocks may naturally be integrated using that technology.
  • The state read-out method executed by the content processing apparatus in each of the embodiments described above may be realized by a CPU interpreting and executing predetermined program data, stored in a storage device, that is capable of executing the procedure of the state read-out method.
  • The program data may be introduced into the storage device via a recording medium such as a CD-ROM or a flexible disk, or may be executed directly from the recording medium.
  • The recording medium refers to semiconductor memory such as ROM, RAM, or flash memory, magnetic disk memory such as a flexible disk or a hard disk, optical disc memory such as a CD-ROM, DVD, or BD, a memory card, and the like. The recording medium is a concept that also includes communication media such as telephone lines and transmission paths.
  • The present invention can be used in a system in which a content processing apparatus that outputs content video and audio is connected to an external apparatus via a communication interface, and is suitable for notifying a viewer of the state of the external apparatus without interrupting content viewing.
  • Reference signs: 102, 105 video input unit; 103, 106 audio input unit; 104 HDMI communication unit; 107, 207, 307 audio output control unit; 108, 208, 308 video/audio output processing unit; 109 video output unit; 110 audio output unit; 211 audio file storage unit; 311 character string conversion unit; 412 Ethernet (registered trademark) communication unit; 500 external device; 501 HDMI cable; 502 LAN cable

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

A sound output control unit (107) determines whether or not an HDMI communications unit (104) has received, from an external device (500), an HDMI-CEC command indicating the commencement of output of sound data notifying the device status. If the HDMI-CEC command has been received, the sound output control unit (107) controls the switching of a video sound output processing unit (108), such that video data for the content signal being viewed is output to a video output unit (109) and sound data for the HDMI signal from the external device (500) is output to a sound output unit (110).

Description

Content processing device and status read-out method
The present invention relates to a method of reading out, by voice, a state notified from an external device in a system in which a content processing apparatus on which content can be viewed and an external device are connected via a communication interface, and to a content processing apparatus having this state read-out function.
While viewing content broadcast on a television set, it is common to record other content on an external device such as a Blu-ray Disc (BD) recorder, or to dub content from a hard disk to a BD disc. Such operations on the external device normally run independently of the content viewing on the television set. As a result, there has been a problem that, while viewing content on the television set, the viewer cannot learn of a change in state that has occurred in the external device, such as "recording finished".
To solve this problem, various techniques have been proposed in which, when a state change such as "recording finished" occurs in the external device, a character string such as "recording finished" is displayed on-screen (OSD) on the television screen, or the words "recording finished" are read out through a speaker or the like (see, for example, Patent Document 1).
JP 2007-248862 A
However, with the conventional techniques described above, the screen of the television set must be temporarily switched to the external device side, both when the state of the external device is displayed on the television screen and when it is read out by voice. Therefore, in order to learn the state of the external device, the video display of the content being viewed on the television set has to be interrupted.
It is therefore an object of the present invention to provide a method of notifying a viewer of the state of an external device, by reading the state out loud without interrupting the display of the content video, when a state change occurs in the external device operating in the background while content is being viewed, and to provide a content processing apparatus that performs the method.
The present invention is directed to a content processing apparatus that outputs content video and audio. To solve the above problem, the content processing apparatus of the present invention comprises: a communication unit that communicates with an external device via a predetermined interface; a control unit that determines whether the communication unit has received control information notifying a state output by the external device, and controls the audio output of the content processing apparatus based on that control information; and a video/audio output processing unit that, under the control of the control unit, continues to output the video of the content being viewed while outputting audio that reads out the state of the external device notified by the control information.
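Purely as an illustration of the three roles just described (communication unit, control unit, and video/audio output processing unit), a minimal C++ sketch might look as follows. The type and method names are assumptions made for this sketch and are not taken from the patent.

```cpp
#include <cstdint>
#include <vector>

// Illustrative interfaces for the roles described above; all names and
// signatures are assumptions made for this sketch only.
using ControlInfo = std::vector<uint8_t>;    // e.g. one HDMI-CEC frame

struct IAvOutputProcessor {
    virtual void ContinueContentVideo() = 0;                     // keep the watched programme on screen
    virtual void SpeakDeviceStatus(const ControlInfo& info) = 0; // voice the notified status
    virtual ~IAvOutputProcessor() = default;
};

// The control unit: decides, from control information delivered by the
// communication unit, whether the external device is notifying a status,
// and if so has the status spoken while the content video keeps playing.
struct Controller {
    IAvOutputProcessor& av;
    bool IsStatusNotification(const ControlInfo& info) const {
        return !info.empty();    // placeholder test; a real check inspects the CEC payload
    }
    void OnControlInfo(const ControlInfo& info) {
        if (!IsStatusNotification(info)) return;
        av.ContinueContentVideo();
        av.SpeakDeviceStatus(info);
    }
};
```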
As the predetermined interface, a communication interface conforming to the HDMI standard is applicable. In that case, it is suitable to use an HDMI-CEC command as the control information. Using HDMI-CEC commands, the following kinds of control can be realized easily.
For example, the control unit controls the video/audio output processing unit to output audio that reads out the state of the external device in response to an instruction, given by an HDMI-CEC command, to start outputting audio data, and the video/audio output processing unit, under the control of the control unit, switches the audio output from the audio of the content being viewed to the audio that reads out the state of the external device. In this control, the control unit controls the video/audio output processing unit to output the audio of the content being viewed in response to an instruction, given by an HDMI-CEC command, to end the audio data output, and the video/audio output processing unit switches the audio output from the read-out audio back to the audio of the content being viewed.
Alternatively, the control unit controls the video/audio output processing unit to output audio that reads out the state of the external device in response to an instruction, given by an HDMI-CEC command, to start outputting audio data, and the video/audio output processing unit, under the control of the control unit, mixes the read-out audio into the audio of the content being viewed. In this control, the control unit controls the video/audio output processing unit to output the audio of the content being viewed in response to an instruction, given by an HDMI-CEC command, to end the audio data output, and the video/audio output processing unit stops mixing the read-out audio into the audio of the content being viewed.
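As a rough illustration of what mixing the read-out audio into the content audio could mean at the sample level, the following sketch sums two 16-bit PCM streams with clipping; the sample format and the plain summation are assumptions of this sketch rather than anything specified here.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Superimposes the status read-out audio onto the content audio, sample by
// sample, as one plausible realization of the "synthesis" mode described above.
std::vector<int16_t> MixStatusAudio(const std::vector<int16_t>& content,
                                    const std::vector<int16_t>& status) {
    std::vector<int16_t> mixed(content.size());
    for (size_t i = 0; i < content.size(); ++i) {
        int32_t sum = content[i];
        if (i < status.size()) sum += status[i];             // add the read-out voice
        sum = std::clamp<int32_t>(sum, INT16_MIN, INT16_MAX); // avoid wrap-around
        mixed[i] = static_cast<int16_t>(sum);
    }
    return mixed;
}
```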
Alternatively, if the content processing apparatus includes a storage unit that stores one or more audio files in which indexes are uniquely associated with audio representing states of the external device, the control unit can acquire from the storage unit the audio specified by an index indicated by an HDMI-CEC command and control the video/audio output processing unit to output the acquired audio, while the video/audio output processing unit, under the control of the control unit, mixes the acquired audio into the audio of the content being viewed.
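The association between an index and the stored status audio (illustrated as a table in FIG. 7) could be held in something as simple as the following sketch; the concrete index values and file names are invented for illustration only.

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <unordered_map>

// Sketch of the index-to-audio-file table kept by the storage unit (cf. FIG. 7).
// The entries below are hypothetical examples, not values from the patent.
class AudioFileStore {
public:
    AudioFileStore() {
        files_ = {
            {1, "dubbing_finished.wav"},
            {2, "recording_finished.wav"},
            {3, "disc_full.wav"},
        };
    }
    std::optional<std::string> Lookup(uint8_t index) const {
        auto it = files_.find(index);
        if (it == files_.end()) return std::nullopt;
        return it->second;    // the caller decodes this file and mixes it into the content audio
    }
private:
    std::unordered_map<uint8_t, std::string> files_;
};
```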
Furthermore, if the content processing apparatus includes a conversion unit that converts a character string into speech, the control unit can extract the character string representing the state of the external device stored in the HDMI-CEC command, convert it into speech using the conversion unit, and control the video/audio output processing unit to output the converted speech, while the video/audio output processing unit, under the control of the control unit, mixes the converted speech into the audio of the content being viewed.
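When the status text is split across several vendor commands with a total count and a sequence number, as in the FIG. 10 example, it has to be reassembled before being handed to the conversion unit. The sketch below shows one plausible way to do this; the fragment structure and class names are assumptions, not the patent's implementation.

```cpp
#include <cstdint>
#include <map>
#include <optional>
#include <string>

// One fragment of the status text as carried by a single vendor command in the
// FIG. 10 example: total number of commands, this command's sequence number,
// and up to 11 bytes of characters in an agreed character code.
struct StatusTextFragment {
    uint8_t total;        // "total": commands needed for the whole string
    uint8_t count;        // "count": 1-based position of this fragment
    std::string text;     // character payload of this fragment
};

// Collects fragments and returns the full string once all of them have arrived.
class StatusTextAssembler {
public:
    std::optional<std::string> Add(const StatusTextFragment& f) {
        fragments_[f.count] = f.text;
        if (fragments_.size() < static_cast<size_t>(f.total)) return std::nullopt; // still waiting
        std::string full;
        for (const auto& [count, text] : fragments_) full += text;  // concatenate in sequence order
        fragments_.clear();
        return full;    // this string is then passed to the text-to-speech conversion unit
    }
private:
    std::map<uint8_t, std::string> fragments_;
};
```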
The processing realized by the above content processing apparatus can also be regarded as a state read-out method that includes a step of receiving control information from an external device via a predetermined interface, a step of determining whether the control information is information notifying a state of the external device, and, when it is, a step of reading out by voice the state of the external device notified by the control information while continuing to output the video of the content being viewed.
According to the present invention, when a state change occurs in an external device operating in the background while content is being viewed, the state of the external device can be read out by voice and notified to the viewer without interrupting the display of the content video.
FIG. 1 is a diagram showing the configuration of a content processing apparatus 100 according to the first embodiment of the present invention.
FIG. 2 is a diagram showing an example of control information used in the content processing apparatus 100.
FIG. 3 is a flowchart explaining the processing procedure of the state read-out method by audio switching performed in the content processing apparatus 100.
FIG. 4 is a flowchart explaining the processing procedure of the state read-out method by audio synthesis performed in the content processing apparatus 100.
FIG. 5 is a diagram showing the configuration of a content processing apparatus 200 according to the second embodiment of the present invention.
FIG. 6 is a diagram showing an example of control information used in the content processing apparatus 200.
FIG. 7 is a diagram showing an example of a table stored in the audio file storage unit 211.
FIG. 8 is a flowchart explaining the processing procedure of the state read-out method performed in the content processing apparatus 200.
FIG. 9 is a diagram showing the configuration of a content processing apparatus 300 according to the third embodiment of the present invention.
FIG. 10 is a diagram showing an example of control information used in the content processing apparatus 300.
FIG. 11 is a flowchart explaining the processing procedure of the state read-out method performed in the content processing apparatus 300.
FIG. 12 is a diagram showing the configuration of a content processing apparatus 400 according to another embodiment of the present invention.
<Concept of the present invention>
The present invention is applied to a system composed of a content processing apparatus that can receive a content signal and perform video display and audio output, and an external device communicably connected to the content processing apparatus. When a state change occurs in the external device, which operates independently of content viewing, while content is being viewed on the content processing apparatus, the invention notifies the content viewer of the state of the external device by a voice read-out process, without interrupting the display of the content video. In this sense, the external device can be regarded as a "status notification device" and the content processing apparatus as a "status read-out device".
Examples of the content processing apparatus include devices having a screen, such as a television set or a personal computer.
Examples of the content signal include signals containing video and audio, such as a terrestrial digital signal, a BS digital signal, and an MPEG (Moving Picture Experts Group) signal.
Examples of the external device include players, recorders, theater amplifiers, digital cameras, camcorders, door intercoms, sensor cameras, personal computers, tablet terminals, smartphones, facsimile machines, network servers, refrigerators, washing machines, air conditioners, cleaning robots, rice cookers, dishwasher-dryers, bathroom water heaters, and other devices in which some state change may occur.
Examples of the state change include completion of, or errors in (insufficient capacity, exhausted battery, connection failure, and the like), recording, copying (dubbing), downloading, or uploading of content, video, still images, or data; incoming telephone or facsimile calls; mail arrival; information updates; automatic setting changes; sensor detection; button-press detection; filter clogging; the start or end of ice making, washing, drying, cleaning, rice cooking, temperature control, or filling the bath with hot water; and device failures.
In the following embodiments, a typical system configuration is assumed that consists of a content processing apparatus corresponding to a television set and an external device corresponding to a hard disk (HDD) recorder. The case where HDMI communication conforming to the HDMI (High-Definition Multimedia Interface) standard is applied to the communication interface connecting the content processing apparatus and the external device is described as an example.
<First Embodiment>
FIG. 1 is a diagram showing the configuration of a content processing apparatus 100 according to the first embodiment of the present invention. The content processing apparatus 100 according to the first embodiment shown in FIG. 1 includes a tuner unit 101, a video input unit 102, an audio input unit 103, an HDMI communication unit 104, a video input unit 105, an audio input unit 106, an audio output control unit 107, a video/audio output processing unit 108, a video output unit 109, and an audio output unit 110. The content processing apparatus 100 is connected to an external apparatus 500 via an HDMI cable 501.
The tuner unit 101 receives a content signal (television signal) broadcast terrestrially or the like. The video input unit 102 extracts video data from the content signal received by the tuner unit 101 and inputs it. The audio input unit 103 extracts audio data from the content signal received by the tuner unit 101 and inputs it. The HDMI communication unit 104 receives the HDMI signal output by the external apparatus 500 via the HDMI cable 501, and outputs predetermined control information (described later) accompanying the received HDMI signal to the audio output control unit 107. The video input unit 105 extracts video data from the HDMI signal received by the HDMI communication unit 104 and inputs it. The audio input unit 106 extracts audio data from the HDMI signal received by the HDMI communication unit 104 and inputs it.
The audio output control unit 107 receives the control information from the HDMI communication unit 104 and, based on this control information, controls the switching of the video/audio output processing unit 108. The video/audio output processing unit 108 receives the video data of the content signal output from the video input unit 102, the audio data of the content signal output from the audio input unit 103, the video data of the HDMI signal output from the video input unit 105, and the audio data of the HDMI signal output from the audio input unit 106. The video/audio output processing unit 108 then outputs video data and audio data to the video output unit 109 and the audio output unit 110 in accordance with the control instructions of the audio output control unit 107. The video output unit 109 displays video on a screen (not shown) or the like in accordance with the video data output from the video/audio output processing unit 108. The audio output unit 110 outputs audio to a speaker (not shown) or the like in accordance with the audio data output from the video/audio output processing unit 108.
The state read-out method executed by the content processing apparatus 100 according to the first embodiment of the present invention with the above configuration is described below with further reference to FIGS. 2 to 4.
FIG. 2 is a diagram showing an example of the control information used in the content processing apparatus 100 according to the first embodiment. FIGS. 3 and 4 are flowcharts explaining the processing procedure of the state read-out method performed by the HDMI communication unit 104, the audio output control unit 107, and the video/audio output processing unit 108 of the content processing apparatus 100 according to the first embodiment.
- Control information
As a premise, the first embodiment uses, as the control information, commands of "HDMI-CEC (consumer electronics control)" adopted in the HDMI standard, which serves as the communication interface connecting the content processing apparatus 100 and the external apparatus 500.
HDMI-CEC is a device control protocol that allows vendors using the HDMI standard (for example, the manufacturers of the content processing apparatus 100 and the external apparatus 500) to define their own commands. As shown in FIG. 2, a command frame used in HDMI-CEC consists of a 1-byte header area (header), a 1-byte operation code area (opcode), and a 14-byte vendor-defined area (vendor unique).
The header area stores the address of the destination device of the HDMI-CEC command, that is, the address of the content processing apparatus 100. The operation code area stores information indicating that this HDMI-CEC command is a vendor command (for example, "0x89"). The vendor-defined area stores 1 byte of information indicating that the HDMI-CEC command is one that notifies the state of the external device (notice state) (for example, "0x0a"), together with either 1 byte of information indicating that output of the audio data representing the device state starts (start) (for example, "0x01"; FIG. 2(a)) or 1 byte of information indicating that output of the audio data representing the device state ends (end) (for example, "0x02"; FIG. 2(b)).
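As a rough illustration of the frame layout just described, the following C sketch shows how such a vendor command might be represented and classified. The constants 0x89, 0x0a, 0x01, and 0x02 are the example values given above; the struct and function names are purely illustrative and are not part of any real CEC library.

```c
/* Minimal sketch of the vendor command frame described in the text.
 * Field sizes and constant values follow the example in FIG. 2. */
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

#define OPCODE_VENDOR_COMMAND 0x89u /* "this is a vendor command"              */
#define VENDOR_NOTICE_STATE   0x0au /* "notifies the state of the external device" */
#define NOTICE_AUDIO_START    0x01u /* "output of the state audio starts"      */
#define NOTICE_AUDIO_END      0x02u /* "output of the state audio ends"        */

typedef struct {
    uint8_t header;     /* destination (content processing apparatus) address */
    uint8_t opcode;     /* 0x89 for a vendor command                          */
    uint8_t vendor[14]; /* vendor-defined payload                             */
    size_t  vendor_len; /* number of valid vendor bytes                       */
} cec_frame_t;

/* Returns true if the frame is a "notice state" vendor command. */
bool is_notice_state(const cec_frame_t *f)
{
    return f->opcode == OPCODE_VENDOR_COMMAND &&
           f->vendor_len >= 2 &&
           f->vendor[0] == VENDOR_NOTICE_STATE;
}

/* Returns true if the second vendor byte carries the start (or end) marker
 * for the status read-out audio, depending on `start`. */
bool is_audio_marker(const cec_frame_t *f, bool start)
{
    return is_notice_state(f) &&
           f->vendor[1] == (start ? NOTICE_AUDIO_START : NOTICE_AUDIO_END);
}
```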
- Status read-out processing
Status read-out processing using the HDMI-CEC commands described above can be realized in two ways: by audio switching (FIG. 3) or by audio synthesis (FIG. 4).
The status read-out processing by audio switching shown in FIG. 3 starts when the HDMI communication unit 104 receives control information, that is, an HDMI-CEC command.
First, the HDMI communication unit 104 checks the first byte of the vendor-defined area of the received HDMI-CEC command and determines whether the command is one that notifies the device state (step S301). If it determines that the command notifies the device state ("Yes" in step S301), the HDMI communication unit 104 outputs the HDMI-CEC command to the audio output control unit 107. The audio output control unit 107 checks the second byte of the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and determines whether the external apparatus 500 has started outputting the audio data for the status notification (step S302). When it determines that the external apparatus 500 has started outputting the audio data ("Yes" in step S302), the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the HDMI signal is output to the audio output unit 110. In accordance with the instruction from the audio output control unit 107, the video/audio output processing unit 108 switches the connections between the respective input units and output units so that the video data of the content signal is output to the video output unit 109 and the audio data of the HDMI signal is output to the audio output unit 110 (step S303).
Thereafter, the audio output control unit 107 checks the second byte of the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and determines whether the external apparatus 500 has finished outputting the audio data (step S304). When it determines that the external apparatus 500 has finished outputting the audio data ("Yes" in step S304), the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the content signal is output to the audio output unit 110. In accordance with the instruction from the audio output control unit 107, the video/audio output processing unit 108 switches the connections between the respective input units and output units so that the video data of the content signal is output to the video output unit 109 and the audio data of the content signal is output to the audio output unit 110 (step S305).
If, in step S301, the HDMI-CEC command is not one that notifies the device state ("No" in step S301), the content processing apparatus 100 executes the predetermined processing indicated by that HDMI-CEC command (step S310).
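The switching flow of steps S301 to S305 can be pictured as a small selector that moves the speaker output between the two audio inputs while leaving the video path untouched. The C sketch below is a minimal model of that behaviour under the assumption that routing is a single enum-valued selector; the names are illustrative and not an actual device API.

```c
/* Minimal sketch of the audio-switching flow (steps S301-S305). */
#include <stdbool.h>
#include <stdio.h>

typedef enum { AUDIO_FROM_TUNER, AUDIO_FROM_HDMI } audio_source_t;

typedef struct {
    audio_source_t audio_sel; /* which audio input feeds the speaker;
                                 video always stays on the tuner content */
} av_output_state_t;

/* Called by the audio output control unit when a "notice state" command
 * arrives; `start` mirrors the 0x01/0x02 marker of the previous example. */
void on_notice_state(av_output_state_t *st, bool start)
{
    if (start) {
        st->audio_sel = AUDIO_FROM_HDMI;  /* S303: speak the device status */
        printf("audio: switched to HDMI (status read-out)\n");
    } else {
        st->audio_sel = AUDIO_FROM_TUNER; /* S305: back to the programme   */
        printf("audio: switched back to tuner content\n");
    }
    /* The video path is left untouched, so the content video continues.   */
}

int main(void)
{
    av_output_state_t st = { AUDIO_FROM_TUNER };
    on_notice_state(&st, true);  /* start of the read-out */
    on_notice_state(&st, false); /* end of the read-out   */
    return 0;
}
```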
The status read-out processing by audio synthesis shown in FIG. 4, on the other hand, differs from the processing of FIG. 3 in steps S403 and S405.
That is, when it determines that the external apparatus 500 has started outputting the audio data for the status notification ("Yes" in step S302), the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the HDMI signal is output to the audio output unit 110 together with the audio data of the content signal. In accordance with the instruction from the audio output control unit 107, the video/audio output processing unit 108 performs audio synthesis so that the video data of the content signal is output to the video output unit 109 and the audio data of the HDMI signal is superimposed on the audio data of the content signal before being output to the audio output unit 110 (step S403). When it determines that the external apparatus 500 has finished outputting the audio data ("Yes" in step S304), the audio output control unit 107 controls the video/audio output processing unit 108 so that the audio data of the content signal is output to the audio output unit 110. In accordance with the instruction from the audio output control unit 107, the video/audio output processing unit 108 cancels the audio synthesis so that the video data of the content signal is output to the video output unit 109 and the audio data of the content signal alone is output to the audio output unit 110 (step S405).
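For the synthesis variant, the superimposing step can be imagined as mixing the read-out samples into the content audio. The sketch below assumes both streams are already decoded to 16-bit PCM at the same sample rate; the attenuation factor is an arbitrary choice for the example and is not specified in the description.

```c
/* Minimal sketch of the "superimpose" step (S403): mix the status read-out
 * samples into the content audio with saturation. */
#include <stdint.h>
#include <stddef.h>

static int16_t clamp16(int32_t v)
{
    if (v >  32767) return  32767;
    if (v < -32768) return -32768;
    return (int16_t)v;
}

/* Adds the read-out samples onto the content audio, attenuating the content
 * slightly so the read-out stays intelligible (the 3/4 factor is only an
 * assumption for this sketch). */
void mix_status_audio(int16_t *content, const int16_t *readout, size_t n)
{
    for (size_t i = 0; i < n; ++i) {
        int32_t mixed = (content[i] * 3) / 4 + readout[i];
        content[i] = clamp16(mixed);
    }
}
```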
As described above, according to the content processing apparatus 100 and the status read-out method of the first embodiment of the present invention, when a state change is notified from the external apparatus 500, the audio data received along with the state change is either output in place of the audio data of the content signal or synthesized with the audio data of the content signal and output.
This makes it possible to read out, by voice, a state change that occurred in an external device operating in the background while content is being viewed, and to notify the viewer without interrupting the display of the content video.
  <Second Embodiment>
FIG. 5 is a diagram showing the configuration of a content processing apparatus 200 according to the second embodiment of the present invention. The content processing apparatus 200 shown in FIG. 5 includes a tuner unit 101, a video input unit 102, an audio input unit 103, an HDMI communication unit 104, a video input unit 105, an audio input unit 106, an audio output control unit 207, an audio file storage unit 211, a video/audio output processing unit 208, a video output unit 109, and an audio output unit 110. The content processing apparatus 200 is connected to the external apparatus 500 via the HDMI cable 501.
As shown in FIG. 5, the content processing apparatus 200 according to the second embodiment differs from the content processing apparatus 100 according to the first embodiment (FIG. 1) in the configuration of the audio output control unit 207, the audio file storage unit 211, and the video/audio output processing unit 208. Apart from these differences, the apparatus is the same as the content processing apparatus 100, so the same reference numerals are assigned and the description is omitted.
Hereinafter, the content processing apparatus 200 according to the second embodiment will be described with a focus on the audio output control unit 207, the audio file storage unit 211, and the video/audio output processing unit 208.
- Control information
The second embodiment also uses HDMI-CEC commands as the control information, but the information stored in the vendor-defined area differs from that of the first embodiment. As shown in FIG. 6, in the second embodiment the vendor-defined area stores 1 byte of information indicating that the HDMI-CEC command is one that notifies the state of the external device (notice state) (for example, "0x0a") and 1 byte of information representing an index corresponding to the device state (for example, "1").
The index corresponding to the device state is information in which the device state is expressed as a bit value. The device state corresponding to each index is stored as audio data in the audio file storage unit 211.
FIG. 7 is a diagram showing an example of the table stored in the audio file storage unit 211. As shown in FIG. 7, the audio file storage unit 211 stores a plurality of audio files, each pairing an index with audio data (represented in the figure by the equivalent wording). These audio files may, for example, be preset in the content processing apparatus 200, uploaded from the external apparatus 500 when it is first connected, or downloaded from a server or the like via a network to which the content processing apparatus 200 is connected.
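One possible shape for such an index-to-audio pairing is a simple lookup table, as in the C sketch below. The index values and file paths are hypothetical placeholders, since the description only states that each index is paired with audio data, without fixing concrete entries.

```c
/* Minimal sketch of the index-to-audio-file table held by the audio file
 * storage unit. All entries below are illustrative only. */
#include <stddef.h>
#include <stdint.h>

typedef struct {
    uint8_t     index;      /* value carried in the 2nd vendor-defined byte */
    const char *audio_path; /* pre-recorded read-out clip for that state    */
} status_audio_entry_t;

static const status_audio_entry_t k_status_audio[] = {
    { 1, "/audio/dubbing_completed.wav" }, /* hypothetical entries */
    { 2, "/audio/recording_started.wav" },
    { 3, "/audio/disc_full.wav" },
};

/* Returns the clip registered for `index`, or NULL if none matches. */
const char *lookup_status_audio(uint8_t index)
{
    for (size_t i = 0; i < sizeof(k_status_audio) / sizeof(k_status_audio[0]); ++i) {
        if (k_status_audio[i].index == index)
            return k_status_audio[i].audio_path;
    }
    return NULL;
}
```

In this arrangement the clip found by the lookup would then be handed to the mixing step described earlier, rather than switching away from the content audio.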
- Status read-out processing
FIG. 8 is a flowchart explaining the processing procedure of the status read-out method performed by the HDMI communication unit 104, the audio output control unit 207, the audio file storage unit 211, and the video/audio output processing unit 208 of the content processing apparatus 200 according to the second embodiment.
The status read-out processing shown in FIG. 8 starts when the HDMI communication unit 104 receives control information, that is, an HDMI-CEC command.
The HDMI communication unit 104 checks the first byte of the vendor-defined area of the received HDMI-CEC command and determines whether the command is one that notifies the device state (step S301). If it determines that the command notifies the device state ("Yes" in step S301), the HDMI communication unit 104 outputs the HDMI-CEC command to the audio output control unit 207. The audio output control unit 207 checks the second byte of the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and determines the index stored there (step S801). Having determined the index, the audio output control unit 207 searches the audio file storage unit 211 to identify the audio file whose index matches, acquires the audio data stored in that audio file, and outputs it to the video/audio output processing unit 208 (step S802). The audio output control unit 207 then controls the video/audio output processing unit 208 so that the output audio data is output to the audio output unit 110 together with the audio data of the content signal. In accordance with the instruction from the audio output control unit 207, the video/audio output processing unit 208 performs audio synthesis so that the video data of the content signal is output to the video output unit 109 and the audio data supplied from the audio output control unit 207 is superimposed on the audio data of the content signal before being output (step S803).
As described above, according to the content processing apparatus 200 and the status read-out method of the second embodiment of the present invention, audio data corresponding to state changes that may occur in the external apparatus 500 is stored in advance, and when a state change is notified from the external apparatus 500, the audio data corresponding to that state change is synthesized with the audio data of the content signal and output.
This makes it possible to read out, by voice, a state change that occurred in an external device operating in the background while content is being viewed, and to notify the viewer without interrupting the display of the content video.
In addition, since the content processing apparatus 200 stores the audio data corresponding to the state changes in advance, there is no need to switch from the audio data of the content signal to audio data received from the external apparatus 500.
  <Third Embodiment>
FIG. 9 is a diagram showing the configuration of a content processing apparatus 300 according to the third embodiment of the present invention. The content processing apparatus 300 shown in FIG. 9 includes a tuner unit 101, a video input unit 102, an audio input unit 103, an HDMI communication unit 104, a video input unit 105, an audio input unit 106, an audio output control unit 307, a character string conversion unit 311, a video/audio output processing unit 308, a video output unit 109, and an audio output unit 110. The content processing apparatus 300 is connected to the external apparatus 500 via the HDMI cable 501.
As shown in FIG. 9, the content processing apparatus 300 according to the third embodiment differs from the content processing apparatus 100 according to the first embodiment (FIG. 1) in the configuration of the audio output control unit 307, the character string conversion unit 311, and the video/audio output processing unit 308. Apart from these differences, the apparatus is the same as the content processing apparatus 100, so the same reference numerals are assigned and the description is omitted.
Hereinafter, the content processing apparatus 300 according to the third embodiment will be described with a focus on the audio output control unit 307, the character string conversion unit 311, and the video/audio output processing unit 308.
- Control information
The third embodiment also uses HDMI-CEC commands as the control information, but the information stored in the vendor-defined area differs from that of the first embodiment. As shown in FIG. 10, in the third embodiment the vendor-defined area stores 1 byte of information indicating that the HDMI-CEC command is one that notifies the state of the external device (notice state) (for example, "0x0a"), 1 byte of information representing the number of commands required to form one character string (total) (for example, "0x03"), 1 byte of information representing the command number indicating the transmission order (count) (for example, "0x01", "0x02", "0x03"), and 11 bytes of information representing the characters describing the device state in a predetermined character code (for example, the EUC code, the JIS code, or the ASCII code) (for example, "Dubbing was" (FIG. 10(a)), "completed" (FIG. 10(b)), and "to DVD" (FIG. 10(c))).
The audio output control unit 307 extracts the characters stored in the vendor-defined area of each HDMI-CEC command and reconstructs the characters, which arrive divided across a plurality of HDMI-CEC commands, into a single character string.
The character string conversion unit 311 has a function of converting a character string represented in the predetermined character code into audio data.
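A rough sketch of that reassembly, using the total/count bytes and the 11-byte text field described above, might look like the following. The buffer sizes and function names are illustrative, and the example fragments reproduce the "Dubbing was" / "completed" / "to DVD" split from FIG. 10 (with leading spaces added so plain concatenation reads naturally); a real implementation would also need to handle lost or out-of-order commands.

```c
/* Minimal sketch of rebuilding a status message split across vendor commands. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define FRAGMENT_TEXT_BYTES 11
#define MAX_FRAGMENTS       255

typedef struct {
    char    text[MAX_FRAGMENTS * FRAGMENT_TEXT_BYTES + 1];
    uint8_t received; /* fragments seen so far          */
    uint8_t total;    /* expected number of fragments   */
} status_text_t;

/* `payload` points at the vendor-defined bytes: 0x0a, total, count, text...
 * Returns true once the last fragment (count == total) has been appended,
 * i.e. the string is ready to be handed to the text-to-speech converter. */
bool append_fragment(status_text_t *st, const uint8_t *payload, size_t len)
{
    if (len < 4 || payload[0] != 0x0a)
        return false;
    uint8_t total = payload[1];
    uint8_t count = payload[2];
    if (count == 1) {                 /* first fragment resets the buffer */
        memset(st, 0, sizeof(*st));
        st->total = total;
    }
    size_t text_len = len - 3;
    if (text_len > FRAGMENT_TEXT_BYTES)
        text_len = FRAGMENT_TEXT_BYTES;
    strncat(st->text, (const char *)payload + 3, text_len);
    st->received = count;
    return count == st->total;        /* S1102-style check: all fragments in? */
}

int main(void)
{
    status_text_t st = { 0 };
    const uint8_t f1[] = { 0x0a, 3, 1, 'D','u','b','b','i','n','g',' ','w','a','s' };
    const uint8_t f2[] = { 0x0a, 3, 2, ' ','c','o','m','p','l','e','t','e','d' };
    const uint8_t f3[] = { 0x0a, 3, 3, ' ','t','o',' ','D','V','D' };
    append_fragment(&st, f1, sizeof f1);
    append_fragment(&st, f2, sizeof f2);
    if (append_fragment(&st, f3, sizeof f3))
        printf("read out: \"%s\"\n", st.text); /* "Dubbing was completed to DVD" */
    return 0;
}
```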
- Status read-out processing
FIG. 11 is a flowchart explaining the processing procedure of the status read-out method performed by the HDMI communication unit 104, the audio output control unit 307, the character string conversion unit 311, and the video/audio output processing unit 308 of the content processing apparatus 300 according to the third embodiment.
The status read-out processing shown in FIG. 11 starts when the HDMI communication unit 104 receives control information, that is, an HDMI-CEC command.
The HDMI communication unit 104 checks the first byte of the vendor-defined area of the received HDMI-CEC command and determines whether the command is one that notifies the device state (step S301). If it determines that the command notifies the device state ("Yes" in step S301), the HDMI communication unit 104 outputs the HDMI-CEC command to the audio output control unit 307. The audio output control unit 307 checks the 4th to 14th bytes of the vendor-defined area of the HDMI-CEC command acquired from the HDMI communication unit 104 and extracts the characters, represented in the predetermined character code, stored there (step S1101). The audio output control unit 307 extracts all of the characters transmitted by the HDMI-CEC commands; it determines that all of the characters have been extracted when the command number (count) stored in the third byte of the vendor-defined area matches the number of required commands (total) stored in the second byte of the vendor-defined area (step S1102). The audio output control unit 307 then reconstructs (concatenates) the extracted characters into a single character string and outputs it to the character string conversion unit 311 (step S1103).
The character string conversion unit 311 receives the reconstructed character string from the audio output control unit 307, converts it into audio data, and returns the audio data to the audio output control unit 307 (step S1104). When the converted audio data is returned from the character string conversion unit 311, the audio output control unit 307 outputs it to the video/audio output processing unit 308 and controls the video/audio output processing unit 308 so that this audio data is output to the audio output unit 110 together with the audio data of the content signal. In accordance with the instruction from the audio output control unit 307, the video/audio output processing unit 308 performs audio synthesis so that the video data of the content signal is output to the video output unit 109 and the audio data supplied from the audio output control unit 307 is superimposed on the audio data of the content signal before being output (step S1105).
As described above, according to the content processing apparatus 300 of the third embodiment of the present invention, when a state change is notified from the external apparatus 500, the character string received along with the state change is converted into audio data, synthesized with the audio data of the content signal, and output.
This makes it possible to read out, by voice, a state change that occurred in an external device operating in the background while content is being viewed, and to notify the viewer without interrupting the display of the content video.
In addition, since the content processing apparatus 300 generates the audio data corresponding to the state change from the character string attached to the commands, there is no need to switch from the audio data of the content signal to audio data received from the external apparatus 500.
  <Connection form with the external device>
In the first to third embodiments described above, HDMI communication is used as the interface between the content processing apparatus and the external apparatus, and the control information is transmitted using HDMI-CEC commands. However, the interface between the content processing apparatus and the external apparatus is not limited to these embodiments; the present invention can also be realized with other communication interfaces, as long as signals including video data, audio data, and control information can be transmitted and received.
For example, even within the HDMI standard, the Ethernet (registered trademark) communication adopted in HDMI version 1.4 can be used to transmit the control information. Alternatively, by providing an Ethernet (registered trademark) communication unit 412 as in the content processing apparatus 400 shown in FIG. 12, Ethernet (registered trademark) communication over a general LAN cable 502 (for example, Ethernet (registered trademark) communication for Internet protocol TV (IPTV)) can also be used to transmit the video/audio data and the control information. With Ethernet (registered trademark) communication, large volumes of audio data compressed by the AAC (Advanced Audio Coding) method, the MP3 (MPEG-1 Audio Layer 3) method, or the like can be transmitted at high speed. Furthermore, not only wired LAN communication but also wireless LAN communication is applicable.
Each functional block constituting the content processing apparatus in each embodiment of the present invention is realized using hardware resources such as a central processing unit (CPU), storage devices (memory (ROM, RAM, etc.), a hard disk, etc.), and input/output devices, and is typically embodied as an integrated circuit, i.e., an IC (also referred to as an LSI, system LSI, super LSI, ultra LSI, and so on). These functional blocks may each be formed on a separate chip, or some or all of them may be integrated on a single chip.
The method of circuit integration is not limited to ICs; the functional blocks may be realized with dedicated circuitry or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after IC manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the IC can be reconfigured, may also be used.
Furthermore, if integrated circuit technology that replaces ICs emerges from advances in semiconductor technology or from another derived technology (biotechnology, for example), the functional blocks may naturally be integrated using that technology.
The status read-out method executed by the content processing apparatus in each of the embodiments of the present invention described above may also be realized by having the CPU interpret and execute predetermined program data, stored in the storage device, that is capable of executing the procedure of the status read-out method. In this case, the program data may be introduced into the storage device via a recording medium such as a CD-ROM or a flexible disk, or may be executed directly from the recording medium. Here, the recording medium refers to semiconductor memory such as ROM, RAM, or flash memory, magnetic disk memory such as a flexible disk or a hard disk, optical disc memory such as a CD-ROM, DVD, or BD, a memory card, and the like. The recording medium is a concept that also includes communication media such as telephone lines and transmission paths.
INDUSTRIAL APPLICABILITY: The present invention is applicable to a system in which a content processing apparatus that outputs content video/audio is connected to an external device via a communication interface, and is particularly suitable when a state change occurring in the external device during content viewing is to be notified to the viewer by voice read-out.
100, 200, 300, 400 Content processing apparatus
101 Tuner unit
102, 105 Video input unit
103, 106 Audio input unit
104 HDMI communication unit
107, 207, 307 Audio output control unit
108, 208, 308 Video/audio output processing unit
109 Video output unit
110 Audio output unit
211 Audio file storage unit
311 Character string conversion unit
412 Ethernet (registered trademark) communication unit
500 External apparatus
501 HDMI cable
502 LAN cable

Claims (10)

  1.  A content processing apparatus that outputs video and audio of content, comprising:
      a communication unit that communicates with an external device via a predetermined interface;
      a control unit that determines whether the communication unit has received control information notifying a state output by the external device, and controls audio output of the content processing apparatus based on the control information; and
      a video/audio output processing unit that, under the control of the control unit, continues to output the video of the content being viewed and outputs, as audio, a voice reading out the state of the external device notified by the control information.
  2.  The content processing apparatus according to claim 1, wherein the predetermined interface is a communication interface conforming to the HDMI standard.
  3.  The content processing apparatus according to claim 2, wherein the control unit uses an HDMI-CEC command as the control information.
  4.  The content processing apparatus according to claim 3, wherein the control unit controls the video/audio output processing unit to output, as audio, a voice reading out the state of the external device in response to an instruction to start audio data output given by the HDMI-CEC command, and
      the video/audio output processing unit, under the control of the control unit, switches the audio output from the audio of the content being viewed to the voice reading out the state of the external device.
  5.  The content processing apparatus according to claim 4, wherein the control unit controls the video/audio output processing unit to output the audio of the content being viewed in response to an instruction to end audio data output given by the HDMI-CEC command, and
      the video/audio output processing unit, under the control of the control unit, switches the audio output from the voice reading out the state of the external device to the audio of the content being viewed.
  6.  The content processing apparatus according to claim 3, wherein the control unit controls the video/audio output processing unit to output, as audio, a voice reading out the state of the external device in response to an instruction to start audio data output given by the HDMI-CEC command, and
      the video/audio output processing unit, under the control of the control unit, synthesizes the voice reading out the state of the external device with the audio of the content being viewed that is being output.
  7.  The content processing apparatus according to claim 6, wherein the control unit controls the video/audio output processing unit to output the audio of the content being viewed in response to an instruction to end audio data output given by the HDMI-CEC command, and
      the video/audio output processing unit, under the control of the control unit, cancels the synthesis of the voice reading out the state of the external device with the audio of the content being viewed that is being output.
  8.  The content processing apparatus according to claim 3, further comprising a storage unit that stores one or more audio files in which indices and voices representing states of the external device are uniquely associated with each other, wherein
      the control unit acquires, from the storage unit, the voice specified by an index in response to an indication of the index given by the HDMI-CEC command, and controls the video/audio output processing unit to output the acquired voice as audio, and
      the video/audio output processing unit, under the control of the control unit, synthesizes the acquired voice with the audio of the content being viewed that is being output.
  9.  The content processing apparatus according to claim 3, further comprising a conversion unit that converts a character string into voice, wherein
      the control unit extracts a character string representing the state of the external device stored in the HDMI-CEC command, converts the character string into voice using the conversion unit, and controls the video/audio output processing unit to output the converted voice as audio, and
      the video/audio output processing unit, under the control of the control unit, synthesizes the converted voice with the audio of the content being viewed that is being output.
  10.  A status read-out method applied to a content processing apparatus that outputs video and audio of content, the method comprising the steps of:
      receiving control information from an external device via a predetermined interface;
      determining whether the control information is information notifying a state of the external device; and
      when the control information is information notifying the state of the external device, reading out by voice the state of the external device notified by the control information while continuing the video output of the content being viewed.
PCT/JP2011/007147 2011-12-21 2011-12-21 Content processing device and status read-out method WO2013093968A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/007147 WO2013093968A1 (en) 2011-12-21 2011-12-21 Content processing device and status read-out method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2011/007147 WO2013093968A1 (en) 2011-12-21 2011-12-21 Content processing device and status read-out method

Publications (1)

Publication Number Publication Date
WO2013093968A1 true WO2013093968A1 (en) 2013-06-27

Family

ID=48667900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/007147 WO2013093968A1 (en) 2011-12-21 2011-12-21 Content processing device and status read-out method

Country Status (1)

Country Link
WO (1) WO2013093968A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007243673A (en) * 2006-03-09 2007-09-20 Nec Corp System for providing information in the rail car and program providing method
JP2009194534A (en) * 2008-02-13 2009-08-27 Sharp Corp Television apparatus
JP2010171864A (en) * 2009-01-26 2010-08-05 Casio Hitachi Mobile Communications Co Ltd Terminal device and program
JP2011017802A (en) * 2009-07-08 2011-01-27 J&K Car Electronics Corp Display device, program, and display method
WO2011145701A1 (en) * 2010-05-19 2011-11-24 シャープ株式会社 Source device, sink device, system, programme and recording medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109413482A (en) * 2018-10-19 2019-03-01 北京奇艺世纪科技有限公司 A kind of control method of video playing and a kind of terminal device

Similar Documents

Publication Publication Date Title
JP5444476B2 (en) CONTENT DATA GENERATION DEVICE, CONTENT DATA GENERATION METHOD, COMPUTER PROGRAM, AND RECORDING MEDIUM
RU2480818C1 (en) Software updating system, display unit and software updating method
CN101146199B (en) Video-information processing apparatus, video-information processing method
CN101150690A (en) Power control apparatus, system and method
JP2009194753A (en) Electronic device, display system, transmission method and display method
CN106331846B (en) The method and device of audio transparent transmission
US20100141845A1 (en) Audio output device connectable with plurality of devices and method of controlling the same
JP2009272791A (en) Transmitter, information transmission method, receiver, and information processing method
JP2008293414A (en) Electronic equipment and method for searching connection device
TWI376949B (en) Enhanced display system with dvc connectivity
CN101523900A (en) Method for providing menu screen suitable for menus provided by external device and imaging device using the same
JP2015191515A (en) Electronic equipment
JP2011176526A (en) Communication device, communication control method, and program
JP2011045021A (en) Transmission system, reproduction device, transmission method, and program
CN101521781A (en) Method for setting HDMI audio format
US9224428B2 (en) Recording apparatus and control method thereof
JP2013135454A (en) Electronic apparatus, and mutual video and voice output method of electronic apparatus
WO2013093968A1 (en) Content processing device and status read-out method
JP5936685B2 (en) Relay device
JP2018182390A (en) Control method, transmission device, and reception device
US20060117120A1 (en) Controller to be connected to sender of stream data via IEEE 1394 serial bus
JP2007189346A (en) Content reproduction system, content output apparatus, content reproduction apparatus, and content reproduction method
JP2010004289A (en) Display device
WO2017043378A1 (en) Transmission device, transmission method, reception device, and reception method
WO2011115130A1 (en) Display device, television receiver, audio signal supply method, audio signal supply system, program, and recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11877952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11877952

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP