US20080027713A1 - Information display apparatus, information display method and program therefor - Google Patents

Information display apparatus, information display method and program therefor

Info

Publication number
US20080027713A1
US20080027713A1 (application Ser. No. US11/861,980)
Authority
US
United States
Prior art keywords
speech
playback
closed caption
display
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/861,980
Inventor
Kohei Momosaki
Kazuhiko Abe
Yasuyuki Masai
Makoto Yajima
Koichi Yamamoto
Munehiko Sasajima
Original Assignee
Kohei Momosaki
Kazuhiko Abe
Yasuyuki Masai
Makoto Yajima
Koichi Yamamoto
Munehiko Sasajima
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2003-095402, granted as JP4170808B2
Priority to US10/810,648, granted as US7443449B2
Application filed by Kohei Momosaki, Kazuhiko Abe, Yasuyuki Masai, Makoto Yajima, Koichi Yamamoto, Munehiko Sasajima
Priority to US11/861,980
Publication of US20080027713A1
Application status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry
    • H04N5/445Receiver circuitry for displaying additional information
    • H04N5/44513Receiver circuitry for displaying additional information for displaying or controlling a single function of one single apparatus, e.g. TV receiver or VCR
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/432Content retrieval operation from a local storage medium, e.g. hard-disk
    • H04N21/4325Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • H04N21/4394Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/488Data services, e.g. news ticker
    • H04N21/4884Data services, e.g. news ticker for displaying subtitles
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry
    • H04N5/445Receiver circuitry for displaying additional information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry
    • H04N5/60Receiver circuitry for the sound signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/426Characteristics of or Internal components of the client
    • H04N21/42646Characteristics of or Internal components of the client for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/21Circuitry for suppressing or minimising disturbance, e.g. moiré or halo

Abstract

An information display apparatus includes a display device configured to display a video, a speech detection unit configured to detect a playback state of a playback speech, a closed caption display unit configured to generate character information associated with the playback speech and display it on the display device together with the video, and a closed caption display control unit configured to carry out a changing control for changing, according to the detected playback state, a display state of the character information that is displayed on the display device by the closed caption display unit.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of and claims the benefit of priority under 35 U.S.C. §120 from U.S. application Ser. No. 10/810,648, filed Mar. 29, 2004, the entire contents of which are incorporated herein by reference. This application also claims the benefit of priority under 35 U.S.C. §119 from the prior Japanese Patent Application No. 2003-095402, filed Mar. 31, 2003, the entire contents of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an information display apparatus which is incorporated in or connected to a video playback apparatus or video recorder/player apparatus represented by a television receiver, DVD apparatus, or hard disk video recorder and provides a closed caption display function, an information display method, and a program.
  • 2. Description of the Related Art
  • Television broadcasts and DVD video contents provide closed caption information that supplements speech information. In a television broadcast, teletext signals are superposed on video signals. More specifically, character information is transmitted using an identification signal (VBI signal) inserted during a vertical blanking period (see, e.g., Japanese Patent Laid-Open No. 9-65295). DVDs and the like are designed to record multilingual closed caption information together with images and speech. Digital high-definition television broadcasts can also transmit closed captions as sub-video information.
  • The use of closed caption character information is effective not only for hearing impaired and deaf persons but also for a normal listener when it is hard to hear the speech corresponding to an image (for example, when the playback sound volume is low or ambient noise is loud). It is also effective when it is difficult to play back the speech corresponding to an image so that it can be heard (for example, in high speed playback such as fast-forward playback while displaying an image, slow-motion replay, pause, backward playback, or multi-screen display).
  • However, no conventional video playback apparatus or information display apparatus provides a function of controlling display of closed caption character information in accordance with the viewing situation. A technique of displaying a closed caption in a mute state has been known (see, e.g., Japanese Patent Laid-Open No. 7-46500), but it does not adapt the closed caption display to the viewing situation.
  • It is an object of the present invention to provide an information display apparatus capable of controlling display of closed caption character information in accordance with a viewing situation, and an information display method therefor. In playing back an image by a television receiver, video playback apparatus, video recorder/player apparatus, or the like, or playing back speech by a speech playback apparatus, speech recorder/player apparatus, or the like, closed caption character information is displayed or the display form is changed in accordance with the speech playback status to appropriately supplement speech information in, e.g., a situation in which it is probably impossible or hard to hear or understand speech.
  • BRIEF SUMMARY OF THE INVENTION
  • According to an aspect of the present invention, there is provided an information display apparatus comprising: a display device configured to display a video; a speech detection unit configured to detect a playback state of a playback speech; a closed caption display unit configured to generate character information associated with the playback speech and display it on the display device together with the video; and a closed caption display control unit configured to carry out a changing control for changing, according to the detected playback state, a display state of the character information that is displayed on the display device by the closed caption display unit.
  • According to another aspect of the present invention, there is provided an information display method comprising: generating a playback speech; detecting a playback state of the playback speech; and changing a display state of character information associated with the playback speech according to the detected playback state.
  • According to another aspect of the present invention, there is provided a program stored in a computer readable medium for displaying character information on a display device, comprising: means for instructing a computer to play back a video and a speech from a recording medium; means for instructing the computer to change a playback state of the video and the speech; means for instructing the computer to display character information associated with the video and the speech; and means for instructing the computer to change a display state of the character information according to the playback state of the video and the speech.
  • According to the present invention, when an image is played back by a television receiver, video playback apparatus, video recorder/player apparatus, or the like, or speech is played back by a speech playback apparatus, speech recorder/player apparatus, or the like, closed caption character information is displayed or the display form is changed in accordance with the speech playback status to appropriately supplement speech information in, e.g., a situation in which it is probably impossible or difficult to hear or understand speech. In this manner, display of closed caption character information can be controlled in accordance with the viewing situation. For example, when the speech does not satisfy conditions which are set in advance as a state in which it can be properly heard, closed caption information which supplements the speech information can be properly displayed, or the display form can be appropriately changed.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
  • FIG. 1 is a block diagram of an information display apparatus according to the first embodiment of the present invention;
  • FIG. 2 is a diagram showing an example of a process rule in a closed caption display controller of the embodiment;
  • FIGS. 3A to 3D are diagrams showing an example of control of a closed caption display in the embodiment;
  • FIG. 4 is a diagram showing another example of control of a closed caption display in the embodiment;
  • FIG. 5 is a block diagram of an information display apparatus according to the second embodiment of the present invention;
  • FIG. 6 is a diagram showing an example of a process rule in a closed caption display control unit of the embodiment;
  • FIG. 7 is a diagram showing an example of display on the screen in the embodiment;
  • FIG. 8 is a flow chart showing an example of a flow of a process in a closed caption display control unit of the embodiment;
  • FIG. 9 is a flow chart showing an example of a flow of a detailed process in step S1 of FIG. 8;
  • FIG. 10 is a diagram showing an example of a closed caption display method at the time of a reverse playback in the embodiment;
  • FIG. 11 is a block diagram of an information display apparatus according to the third embodiment of the present invention;
  • FIG. 12 is a block diagram of an information display apparatus according to the fourth embodiment of the present invention; and
  • FIG. 13 is a diagram showing an example of division display of a screen in the embodiment.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Preferred embodiments of the present invention will be described in detail below with reference to the several views of the accompanying drawing.
  • First Embodiment
  • An information display apparatus according to the first embodiment of the present invention is used while being connected to an apparatus that receives information such as a broadcast containing video data, speech data, and closed caption character information data. Closed caption character information may be given as data separated from an image and speech, superposed in a video signal and transmitted, or received as a sub-image.
  • FIG. 1 shows an example of the arrangement of the information display apparatus according to the first embodiment.
  • As shown in FIG. 1, the information display apparatus according to the first embodiment comprises a speech input unit 11 which inputs a speech signal, a speech playback state detector 12 which processes the input speech signal and detects a speech output state, a speech output unit 13 which outputs the input speech signal, an ambient noise detector 14 which detects an ambient noise level, a video input unit 21 which inputs a video signal, a character information input unit 27 which inputs closed caption character information, a closed caption display controller 23 which controls display of the input closed caption character information, a closed caption video generator 24 which generates a video signal displaying a closed caption on the basis of the input video signal, the input closed caption character information, and character font data, and a video output unit 25 which outputs the video signal generated by the closed caption video generator 24. The character information input unit 27 may receive closed caption character information corresponding to the speech, entered by key-in or generated by conventional speech recognition. When no closed caption is displayed, the closed caption video generator 24 outputs the video signal without displaying any closed caption. Character font data may also be installed in the closed caption video generator 24.
  • The speech playback state detector 12 monitors the sound volume of the speech signal input to the speech input unit 11. The sound volume can be monitored based on, e.g., the maximum power over a given time period. The speech playback state detector 12 can also be configured to detect whether the sound volume of the speech signal is 0.
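  • The volume monitoring described above can be sketched as follows; this is an illustrative sketch in which the window length and silence threshold are assumptions, not values from this disclosure:

```python
def max_power_per_window(samples, window=1024):
    """Maximum mean-square power over fixed-size windows of a PCM signal.

    `samples` is a sequence of amplitude values; `window` is an
    illustrative frame length, not a value taken from this disclosure.
    """
    best = 0.0
    for start in range(0, len(samples), window):
        frame = samples[start:start + window]
        if frame:
            best = max(best, sum(s * s for s in frame) / len(frame))
    return best

def is_effectively_silent(samples, threshold=1e-4):
    # The detector's "sound volume is 0" check, with a small tolerance.
    return max_power_per_window(samples) < threshold
```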
  • The ambient noise detector 14 receives ambient noise via a microphone and detects the loudness of the noise. The microphone is arranged so as not to be influenced by the speech output from the speech output unit 13, or any ambient sound other than the playback speech is estimated by referring to the speech signal input to the speech input unit 11. A directional microphone may also be used to detect noise at the viewer position.
  • In accordance with the sound volume of a speech signal detected by the speech playback state detector 12 and the loudness of ambient noise detected by the ambient noise detector 14, the closed caption display controller 23 controls the closed caption video generator 24 by selecting the show or hide state of a closed caption in accordance with at least one of three rules shown in FIG. 2:
  • “Rule 1: display a closed caption when the speech signal is silent or below a given level”
  • “Rule 2: display a closed caption when the speech signal is below a reference level determined in accordance with the ambient noise level”
  • “Rule 3: display a closed caption when the ambient noise level is not less than a given level”
  • If two or more of these rules are used, the hide state is selected only when the situation corresponds to none of the rules. When even one rule corresponds to the situation, the display state is selected.
  • As a sequence for determining whether each rule corresponds to the situation, the rules are, for example, checked one by one. As soon as a rule corresponding to the situation is found, the display state is selected and the sequence ends. When no rule corresponding to the situation is found after checking all of the rules, the hide state is selected and the sequence ends.
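  • The evaluation sequence above can be sketched as follows; the thresholds and the linear form of the noise-dependent reference level are illustrative assumptions, not values from this disclosure:

```python
def reference_level(noise_level):
    # Rule 2's reference level rises with ambient noise; a linear form is
    # assumed here for illustration.
    return 0.1 + 0.5 * noise_level

def should_display_caption(speech_level, noise_level,
                           silence_level=0.05, noise_limit=0.6):
    """Check rules 1-3 one by one; show the caption on the first match."""
    rules = [
        lambda: speech_level < silence_level,                 # Rule 1
        lambda: speech_level < reference_level(noise_level),  # Rule 2
        lambda: noise_level >= noise_limit,                   # Rule 3
    ]
    for rule in rules:
        if rule():       # first matching rule selects the display state
            return True
    return False         # no rule matched: hide state
```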
  • In the above example, according to the rules 1 and 2, a closed caption is properly displayed in correspondence with the level of difficulty in hearing due to a weak speech signal. When these processes are not executed, the speech playback state detector 12 can be omitted. In the absence of the speech playback state detector 12, the speech input unit 11 is directly connected to the speech output unit 13.
  • According to the rules 2 and 3, a closed caption is appropriately displayed in correspondence with the difficulty in hearing due to an ambient noise. When these processes are not executed, the ambient noise detector 14 can be omitted.
  • For example, when the sound volume is turned down on the television tuner side, it is hard to hear and understand the contents of a talk though a laughing voice or the like can be heard. Also, when ambient noise becomes loud, e.g., the telephone starts ringing, it becomes difficult to hear the television speech. However, the television speech can be heard even in a noisy environment by turning up the sound volume of the television.
  • The first embodiment automatically displays a closed caption in such a situation where it is hard to hear speech, allowing the viewer to understand the contents. FIG. 3A illustrates the transition from the no-closed-caption display state to the closed caption display state.
  • The reference level and the parameters of the function are preferably set for each viewer who watches a program. For example, an elderly viewer may set the reference level of the speech signal slightly higher than that for a young viewer. For a viewer who watches a program using a headphone or earphone, the reference level of ambient noise may be set high or the influence of ambient noise on each reference level may be reduced.
  • In general, information supplementation by a closed caption is not so necessary for a normal listener, so even in the closed caption display state the character size may be set smaller than normal. The character size may be changed to the normal size in accordance with the closed caption display control condition, as shown in FIG. 3B, or set larger than normal, as shown in FIG. 3C. The number of lines may also be increased, as shown in FIG. 3D.
  • Closed caption display control depending on the speech condition is not effective for a hearing impaired person. Thus, a mode in which a closed caption is always displayed is made selectable in accordance with the viewer.
  • The difficulty in hearing may be evaluated in several stages by setting a plurality of reference levels for each condition. The maximum of the per-condition difficulty values may then be made to correspond to a respective operation, as shown in FIG. 4.
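  • The staged evaluation can be sketched as follows; the stage boundaries and the operation table are illustrative assumptions, not values from this disclosure:

```python
def speech_difficulty(speech_level, thresholds=(0.4, 0.2, 0.05)):
    # Lower speech levels cross more thresholds and score higher.
    return sum(1 for t in thresholds if speech_level < t)

def noise_difficulty(noise_level, thresholds=(0.3, 0.6, 0.8)):
    # Higher ambient noise levels score higher.
    return sum(1 for t in thresholds if noise_level >= t)

def caption_operation(speech_level, noise_level):
    # The maximum of the per-condition difficulty values selects the
    # display operation, as in FIG. 4.
    difficulty = max(speech_difficulty(speech_level),
                     noise_difficulty(noise_level))
    operations = ("hide", "small characters", "normal characters",
                  "large characters")
    return operations[difficulty]
```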
  • The information display apparatus according to the first embodiment can be so connected as to process an input from a television tuner on the input stage of an amplifier or the like which controls the sound volume. Also, the information display apparatus can be so connected as to process an output signal having undergone sound volume control by the amplifier or the like, or a signal for driving the loudspeaker. In this case, the sound volume control state is reflected, and actual difficulty in hearing a speech can be further reflected.
  • The rules 1 to 3 can be given by other, equivalent expressions. For example, rule 3 may be expressed as “hide the closed caption when the ambient noise level is less than the given level”. This also applies to the remaining rules.
  • The rules 1 to 3 define conditions to display a closed caption. A closed caption is displayed when even one of the conditions is satisfied, and is not displayed when no condition is satisfied. To the contrary, the rules may define conditions not to display a closed caption: a closed caption is not displayed when all the conditions are satisfied, and is displayed when even one condition is not satisfied. These rules and functions are merely an example and can be variously changed. All (or some) of the above-mentioned functions may be adopted to allow the viewer to arbitrarily set which of the functions is to be used.
  • Second Embodiment
  • An information display apparatus according to the second embodiment of the present invention is realized by incorporating it in an apparatus which reads out and plays back data from a recording medium, such as a DVD, that records video data, speech data, and closed caption character information data.
  • FIG. 5 shows an example of the arrangement of the information display apparatus according to the second embodiment.
  • As shown in FIG. 5, the information display apparatus according to the second embodiment comprises a recording medium 31 which records video data, speech data, and closed caption character information data, a playback controller 32 which controls read and playback of the recording medium, a speech playback unit 33 which reads out speech data from the recording medium 31, a speech playback state detector 12 which processes speech data and detects a speech output state, a speech output unit 13 which outputs speech data as a speech signal, an ambient noise detector 14 which detects an ambient noise level, a video playback unit 34 which reads out video data from the recording medium 31, a closed caption playback unit 35 which reads out closed caption character information data from the recording medium 31, a closed caption display controller 23 which controls display of read closed caption character information data, a closed caption video generator 24 which generates a video signal displaying a closed caption on the basis of read video data, read closed caption character information data, and character font data, and a video output unit 25 which outputs a generated video signal with superimposed closed caption.
  • The speech playback state detector 12, speech output unit 13, ambient noise detector 14, closed caption video generator 24, and video output unit 25 are basically the same as those in the first embodiment.
  • The playback controller 32 controls the playback state and playback sound volume in accordance with viewer operation via an operation panel, remote controller, or the like. The playback controller 32 reads out, plays back, and outputs video information, speech information, and closed caption information.
  • In accordance with the sound volume of a speech signal detected by the speech playback state detector 12, the loudness of ambient noise detected by the ambient noise detector 14, and the playback state and playback sound volume of the playback controller 32, the closed caption display controller 23 controls the closed caption video generator 24 by selecting the show or hide state of a closed caption in accordance with at least one of nine rules shown in FIG. 6:
  • “Rule 1: display a closed caption when the speech signal is silent or below a given level”
  • “Rule 2: display a closed caption when the speech signal is below a reference level determined in accordance with the ambient noise level”
  • “Rule 3: display a closed caption when the ambient noise level is not less than a given level”
  • “Rule 4: display a closed caption when the playback state is forward playback other than real-time playback”
  • “Rule 5: display a closed caption when the playback state is backward playback”
  • “Rule 6: display a closed caption when the playback state is pause”
  • “Rule 7: display a closed caption when the playback sound volume is muted (sound deadening) or below a given level”
  • “Rule 8: display a closed caption when the playback sound volume is below a reference level determined in accordance with the ambient noise level”
  • “Rule 9: display a closed caption when the playback state is a state in which a speech channel other than the main speech is output”
  • If a plurality of rules are utilized, the hide state is selected when all the rules result in the hide state.
  • When the rules 2, 3, and 8 are not adopted, the ambient noise detector 14 can be omitted. When the rules 1 and 2 are not applied, the speech playback state detector 12 can be omitted.
  • Instead of applying the rules 4 to 6 to selection of the closed caption display state in the closed caption display controller 23, the playback speech may be muted (sound deadening) in the closed caption display controller 23 when any of these three rules is satisfied. In this case, the speech playback state detector 12 detects that the playback speech does not generate any sound, and the rule 1 is applied, obtaining the same effects.
  • A rule “display a closed caption in a playback state other than forward real-time playback” may be employed in place of the rules 4 to 6. For example, the rule 4 can display a closed caption as shown in FIG. 7 during fast-forward playback at a five-fold speed.
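  • The nine rules and their combination can be sketched as follows; the PlaybackState fields, thresholds, and reference-level form are illustrative assumptions, and only the combination logic (showing the caption when any rule matches and hiding it only when every rule results in the hide state) follows the text above:

```python
from dataclasses import dataclass

@dataclass
class PlaybackState:
    speech_level: float   # detected sound volume (rules 1, 2)
    noise_level: float    # ambient noise level (rules 2, 3, 8)
    speed: float          # playback speed, 1.0 = real time (rule 4)
    backward: bool        # rule 5
    paused: bool          # rule 6
    volume: float         # playback sound volume setting (rules 7, 8)
    channel: str          # selected speech channel (rule 9)

def show_caption(s: PlaybackState) -> bool:
    ref = 0.1 + 0.5 * s.noise_level  # assumed noise-dependent reference level
    rules = [
        s.speech_level < 0.05,         # 1: silent or below a given level
        s.speech_level < ref,          # 2: below the reference level
        s.noise_level >= 0.6,          # 3: noise not less than a given level
        not (0.8 < s.speed < 1.2),     # 4: forward playback, not real time
        s.backward,                    # 5: backward playback
        s.paused,                      # 6: pause
        s.volume < 0.05,               # 7: muted or below a given level
        s.volume < ref,                # 8: volume below the reference level
        s.channel != "main",           # 9: other than the main speech
    ]
    return any(rules)  # hide only when every rule selects the hide state
```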
  • For example, when the sound volume of DVD video is turned down, it is difficult to hear and understand the contents of a talk though a laughing voice or the like can be heard. Also, when ambient noise becomes loud, e.g., the telephone starts ringing, it becomes difficult to hear the speech of DVD video. However, the speech of the DVD video can be heard even in a noisy environment by turning up the playback sound volume. The sound volume may be determined from a speech signal, or control information of the DVD video apparatus.
  • In special playback such as two-fold-speed playback, it is difficult to play back speech so that it can be easily heard. If the time axis is simply shortened, the speech pitch changes. Even with a speech speed conversion technique that prolongs/shortens the time without changing the pitch, prolongation/shortening by about 50% or more makes hearing difficult. Prolongation/shortening by about 20% hardly makes hearing difficult, and such special playback can be regarded as real-time playback. From this, a condition to determine “not real-time playback” in rule 4 may be defined as, for example, “playback with speech at a speed of 0.8 fold or less or 1.2 fold or more”.
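  • The speed bounds given above translate directly into a check for the “not real-time playback” condition of rule 4:

```python
def is_real_time_playback(speed: float) -> bool:
    # Per the example condition above, playback at 0.8 fold or less, or at
    # 1.2 fold or more, is treated as not real time.
    return 0.8 < speed < 1.2
```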
  • Generally in backward (rewind) playback, speech cannot be discerned even at real-time speed. In general playback control, the speech synchronized with an image cannot be output during a pause.
  • When a plurality of speech channels such as a bilingual multiplex broadcast are provided, one speech channel such as main speech or sub-speech is selected and output. When the sub-speech is selected, no main speech is output, and thus a closed caption, which often corresponds to the main speech, is preferably displayed. Also, when the main speech and sub-speech are assigned to, e.g., left and right channels and simultaneously output, the main speech is hard to hear, and its closed caption is preferably displayed.
  • When the language is designated to select a speech channel, the viewer can easily understand his/her native language, but may find it difficult to understand another language. In this case, a closed caption corresponding to the language of the speech is desirably displayed. Alternatively, the viewer's native language is selected and set, and when a speech channel other than the native language is selected, the closed caption of the viewer's native language is preferably displayed.
  • According to the second embodiment, a closed caption can be automatically displayed in a situation in which it is hard to hear speech, allowing the viewer to understand the contents.
  • The processing flow in the closed caption display controller 23 will be explained with reference to the flow chart of FIG. 8.
  • The sound volume of a speech signal detected in the speech playback state detector 12 is evaluated (rules 1 and 2), and a closed caption display operation mode is set (step S1). The loudness of ambient noise detected in the ambient noise detector 14 is evaluated (rule 3), and a closed caption display operation mode is set (step S2). Of playback states held in the playback controller 32, the playback speed is evaluated (rules 4 to 6), and a closed caption display operation mode is set (step S3). The playback sound volume held in the playback controller 32 is evaluated (rules 7 and 8), and a closed caption display operation mode is set (step S4).
  • A speech channel to be selected out of multiple speech channels held in the playback controller 32 is evaluated (rule 9), and a closed caption display operation mode is set (step S5). The closed caption display state is determined in accordance with a mode in which it is hardest to hear speech, out of the closed caption display operation modes in the respective steps, and closed caption display is controlled (step S6).
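Steps S1 to S6 above can be summarized as follows. This is a hedged sketch (the integer mode values are illustrative, with larger modes standing for situations in which speech is harder to hear, and 0 meaning no caption):

```python
def select_caption_mode(volume_mode: int, noise_mode: int,
                        speed_mode: int, playback_volume_mode: int,
                        channel_mode: int) -> int:
    # Steps S1-S5 each evaluate one rule group and propose a closed
    # caption display operation mode.  Step S6 adopts the mode in which
    # it is hardest to hear speech, i.e. the largest proposed mode.
    return max(volume_mode, noise_mode, speed_mode,
               playback_volume_mode, channel_mode)
```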
  • A detailed processing flow in step S1 will be described with reference to the flow chart of FIG. 9.
  • A speech signal level A1 is compared with a first reference level A10 set in advance (step S11). If A1 is lower, A1 is compared with a second reference level A11 set in advance (step S12). If A1 is lower than A11, closed caption display operation mode 2 is set according to rule 1 (step S17). If A1 is higher than or equal to A11 in step S12, closed caption display operation mode 1 is set according to rule 1 (step S16).
  • If A1 is not lower than A10 in step S11, a reference level F1n(An) determined in accordance with the loudness An of ambient noise is calculated (step S13) and compared with the speech signal level A1 (step S14). If A1 is lower, closed caption display operation mode 1 is set according to rule 2 (step S16). If no condition is satisfied, closed caption display operation mode 0 is set (step S15).
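The flow of FIG. 9 can be sketched as follows. This is an illustrative Python rendering; the callable `f1n`, standing in for the noise-dependent reference level F1n(An), is an assumption for the sketch:

```python
def step_s1_mode(a1, a10, a11, an, f1n):
    """Closed caption display operation mode from step S1 (FIG. 9).

    a1  : speech signal level
    a10 : first reference level (assumed a10 > a11)
    a11 : second reference level
    an  : loudness of ambient noise
    f1n : function giving the noise-dependent reference level F1n(An)
    """
    if a1 < a10:          # step S11
        if a1 < a11:      # step S12
            return 2      # rule 1 -> mode 2 (step S17)
        return 1          # rule 1 -> mode 1 (step S16)
    if a1 < f1n(an):      # steps S13-S14
        return 1          # rule 2 -> mode 1 (step S16)
    return 0              # no condition satisfied (step S15)
```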
  • A closed caption display method in backward playback will be explained with reference to an example in FIG. 10.
  • In this example, pieces of video information of frames F1 to F11, and pieces of speech information synchronized with them in time, are provided. Closed caption information "in sync with image" is added to or superposed in the frame F2. Closed caption information "closed caption is displayed" is added to or superposed in the frame F6. Closed caption deletion information "delete closed caption" is added to or superposed in the frame F10.
  • In forward playback, frames are sequentially played back from the frames F1 to F11. The closed caption “in sync with image” is displayed from F2 to F5, and the closed caption “closed caption is displayed” is displayed from the frames F6 to F9. In backward playback, the frames are played back in the reverse order from the frames F11 to F1. The frames are searched in the reverse order from the frame F11 for a frame containing closed caption information. If closed caption deletion information is found in the frame F10, the frames F11 and F10 are displayed without any closed caption. The frames are searched in the reverse order from the frame F9 for a frame containing closed caption information. If closed caption information “closed caption is displayed” is found in the frame F6, the frames F9, F8, F7, and F6 are displayed with the closed caption “closed caption is displayed”.
  • The frames are searched in the reverse order from the frame F5 for a frame containing closed caption information. If closed caption information “in sync with image” is found in the frame F2, the frames F5, F4, F3, and F2 are displayed with the closed caption “in sync with image”. No preceding frame containing closed caption information exists before the frame F1, thus the closed caption is deleted, and the frame F1 is displayed.
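The backward search described above can be sketched as follows (an illustrative model, not the patent's implementation: frames are a list in which each entry holds the caption text, an empty string for a deletion mark, or None when the frame carries no caption information):

```python
def caption_for_frame(frames, index):
    # Search backwards from the current frame for the nearest frame
    # carrying closed caption information.
    for i in range(index, -1, -1):
        info = frames[i]
        if info is not None:
            # A deletion mark ('') means no caption is displayed.
            return info or None
    return None  # no earlier caption information: delete the caption

# The FIG. 10 example, frames F1..F11 modeled as indices 0..10:
frames = [None, "in sync with image", None, None, None,
          "closed caption is displayed", None, None, None, "", None]
```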
  • The rules 1 to 9 can be represented by other equivalent expressions. For example, the rule 5 may be expressed by “display a closed caption when the playback state is not forward playback, pause, or stop”. This also applies to the remaining rules.
  • The rules 1 to 9 define conditions to display a closed caption. A closed caption is displayed when even one of the conditions is satisfied, and is not displayed when no condition is satisfied. To the contrary, the rules may define conditions not to display a closed caption. A closed caption is not displayed when all the conditions are satisfied, and is displayed when even one condition is not satisfied.
  • These rules and functions are merely examples and can be variously changed. All (or some) of the above-mentioned functions may be employed, allowing the viewer to arbitrarily set which of the functions is to be used.
  • Third Embodiment
  • An information display apparatus according to the third embodiment of the present invention is used while connected to an apparatus which reads out and plays back data from a recording medium, such as a DVD, that records video data, speech data, and closed caption character information data. Subtitle character information may be given as data different from an image and speech, or superposed in a video signal and recorded.
  • FIG. 11 shows an example of the arrangement of the information display apparatus according to the third embodiment. As shown in FIG. 11, the information display apparatus according to the third embodiment comprises a speech input unit 11 which inputs a speech signal, a speech playback state detector 12 which processes an input speech signal and detects a speech output state, a speech output unit 13 which outputs an input speech signal, an ambient noise detector 14 which detects an ambient noise level, a control information input unit 26 which inputs playback control information of an image or the like, a video input unit 21 which inputs a video signal, a character information input unit 27 which inputs closed caption character information, a closed caption display controller 23 which controls display of input closed caption character information, a closed caption video generator 24 which generates a video signal displaying a closed caption on the basis of an input video signal, input closed caption character information, and character font data, and a video output unit 25 which outputs a generated video signal of a closed caption.
  • The speech input unit 11, speech playback state detector 12, speech output unit 13, ambient noise detector 14, video input unit 21, character information input unit 27, closed caption video generator 24, and video output unit 25 are basically the same as those in the first and second embodiments.
  • The control information input unit 26 inputs playback state information and playback sound volume information which are controlled in accordance with viewer operation via an operation panel, a remote controller, or the like in an apparatus which plays back an image or the like from a recording medium.
  • In accordance with the sound volume of a speech signal detected by the speech playback state detector 12, the loudness of ambient noise detected by the ambient noise detector 14, and playback state information and playback sound volume information which are input to the control information input unit 26, the closed caption display controller 23 controls the closed caption video generator 24 by selecting the show or hide state of a closed caption in accordance with, e.g., at least one of the nine rules described in the second embodiment. If a plurality of rules are utilized, the hide state is selected when all the rules exhibit the hide state.
  • The information display apparatus according to the third embodiment can be so configured as to receive information on the operation state of a DVD recorder/player apparatus or the like and input the information to the control information input unit 26. The information display apparatus may receive a remote control signal (infrared signal, radio signal, or the like) for operating the DVD recorder/player apparatus or the like, and then operate.
  • The information display apparatus according to the third embodiment can be so connected as to process an input from the DVD recorder/player apparatus or the like on the input stage of an amplifier or the like which controls the sound volume. Also, the information display apparatus can be so connected as to process an output signal having undergone sound volume control by the amplifier or the like, or a signal for driving the loudspeaker. In this case, the sound volume control state is reflected, and actual difficulty in hearing speech can be further reflected.
  • The above-described rules and functions are merely examples and can be variously changed. All (or some) of the functions may be adopted, allowing the viewer to arbitrarily set which of the functions is to be used.
  • Fourth Embodiment
  • An information display apparatus according to the fourth embodiment of the present invention is realized by assembling it in an apparatus which receives information of a broadcast or the like containing video data, speech data, and closed caption character information data. Subtitle character information may be given as data different from an image and speech, superposed in a video signal and transmitted, or received as a sub-image.
  • FIG. 12 shows an example of the arrangement of the information display apparatus according to the fourth embodiment. As shown in FIG. 12, the information display apparatus according to the fourth embodiment comprises a contents receiver 41 which receives contents of a broadcast or the like transmitted via radio waves, a cable, or the like, a speech playback unit 43 which extracts speech data from received information, a speech playback state detector 12 which processes an extracted speech signal and detects a speech output state, a speech output unit 13 which outputs an extracted speech signal, an ambient noise detector 14 which detects an ambient noise level, a video playback unit 44 which extracts video data from received information, a closed caption playback unit 45 which extracts closed caption character information data from received information, a closed caption display controller 23 which controls display of extracted closed caption character information, a closed caption video generator 24 which generates a video signal displaying a closed caption on the basis of an extracted video signal, input closed caption character information, and character font data, and a video output unit 25 which outputs a generated video signal of a closed caption.
  • The speech playback state detector 12, speech output unit 13, ambient noise detector 14, closed caption video generator 24, and video output unit 25 are basically the same as those in the first, second, and third embodiments. The playback controller 42 controls the playback state and playback sound volume in accordance with viewer operation via an operation panel, remote controller, or the like. The playback controller 42 reads out, plays back, and outputs video information, speech information, and closed caption information.
  • When the information display apparatus is equipped with a function of receiving contents transmitted via a plurality of channels in parallel with each other, splitting the screen into, for example, four screen sections as shown in FIG. 13, and displaying the contents, the playback controller 42 also controls the screen display state in accordance with viewer operation. The playback controller 42 further controls to select one of contents on the display windows and play back a speech.
  • In accordance with the sound volume of a speech signal detected by the speech playback state detector 12, the loudness of ambient noise detected by the ambient noise detector 14, and the playback state, playback sound volume, and screen split state of the playback controller 42, the closed caption display controller 23 controls the closed caption video generator 24 by selecting the show or hide state of a closed caption in accordance with at least one of four rules:
  • “Rule 1: display a closed caption when the speech signal is silent or is less than a given level”
  • “Rule 2: display a closed caption when the speech signal is less than a reference level determined in accordance with the ambient noise level”
  • “Rule 3: display a closed caption when the ambient noise level is not less than a given level”
  • “Rule 10: display a closed caption in a window when the screen display is multi-screen display of a plurality of windows and no speech output is selected for that window”
  • If a plurality of rules are utilized, the non-display state is selected when all the rules represent the non-display state.
  • When the rules 2 and 3 are not applied, the ambient noise detector 14 can be omitted. When the rules 1 and 2 are not applied, the speech playback state detector 12 can be omitted.
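The combination of rules 1 to 3 and 10 can be sketched as follows (illustrative only: each enabled rule is a display condition, and the caption is hidden only when every enabled rule indicates the non-display state):

```python
def show_caption(conditions: dict, enabled_rules: set) -> bool:
    # conditions maps a rule number to whether its display condition
    # currently holds; the caption is shown when any enabled rule's
    # condition is satisfied, and hidden only when none is.
    return any(conditions[r] for r in enabled_rules)

# Example state: speech is audible and noise is low, but one window of
# a split screen has no speech output selected (rule 10 holds).
state = {1: False, 2: False, 3: False, 10: True}
```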
  • For example, when the playback sound volume of the television is turned down, it is difficult to hear and understand the contents of a talk, though a laughing voice or the like can still be heard. Also, when ambient noise becomes loud, e.g., when the telephone starts ringing, it becomes difficult to hear the speech of the television. However, the speech of the television can be heard even in a noisy environment by turning up the sound volume of the television. The sound volume may be determined from the speech signal, or from control information of the television or the like.
  • When contents received from a plurality of broadcasting stations are displayed on two or more windows, it is difficult to simultaneously output the speech sounds corresponding to these windows and hear them all. In this case, subtitles corresponding to the respective windows are preferably displayed, as shown in FIG. 13A. In general, one screen section is selected to output its speech, and the sounds corresponding to the remaining screen sections are not output. The screen sections other than the one selected for speech output may display corresponding subtitles, as shown in FIG. 13B.
  • According to the fourth embodiment, a closed caption can be automatically displayed in a situation in which it is hard to hear speech, allowing the viewer to understand the contents.
  • The rules 1 to 3 and 10 define conditions to display a closed caption. A closed caption is displayed when even one of the conditions is satisfied, and is not displayed when no condition is satisfied. To the contrary, the rules may define conditions not to display a closed caption. A closed caption is not displayed when all the conditions are satisfied, and is displayed when even one condition is not satisfied.
  • These rules and functions are merely an example, and can be variously changed. All (or some) of the above-mentioned functions may be employed to allow the viewer to arbitrarily set which of the functions is to be used. The above functions can also be implemented by writing them as software and processing them by a computer having a proper mechanism.
  • The above embodiments can also be executed as a program for causing a computer to execute a predetermined means, function as a predetermined means, or implement a predetermined function. In addition, the embodiments can be practiced as a computer-readable recording medium that records the program.
  • The present invention can control display of closed caption character information in accordance with the viewing situation.
  • Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

Claims (5)

1. An information display apparatus comprising:
a display device configured to display a video;
a speech detection unit configured to detect a playback state of a playback speech, the playback state including at least one of a sound volume of the playback speech and a playback speed;
a closed caption generating unit configured to generate a closed caption video comprising character information associated with the playback speech; and
a closed caption display control unit configured to display the closed caption video on the display device and to change a display state of the closed caption video depending upon the playback state detected by the speech detection unit.
2. The apparatus according to claim 1, wherein the closed caption display control unit changes the display state to a display size larger than a normal size.
3. An information display method comprising:
generating a playback speech;
detecting a playback state of the playback speech, the playback state including at least one of a sound volume of the playback speech and a playback speed; and
changing a display state of character information associated with the playback speech according to the detected playback state.
4. An information display method comprising:
generating a playback speech;
determining a noise level of an ambient noise other than the playback speech referring to a playback speech signal via a microphone; and
changing a display state of the character information associated with the playback speech according to the noise level.
5. An information display method comprising:
playing back a video and a speech from a storage medium;
controlling a playback state of the video and the speech;
detecting the playback state of the video and the speech that are played back; and
changing a display state of character information associated with the video and the speech according to the playback state.
Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5481296A (en) * 1993-08-06 1996-01-02 International Business Machines Corporation Apparatus and method for selectively viewing video information
US6243645B1 (en) * 1997-11-04 2001-06-05 Seiko Epson Corporation Audio-video output device and car navigation system
US20020101537A1 (en) * 2001-01-31 2002-08-01 International Business Machines Corporation Universal closed caption portable receiver
US6490553B2 (en) * 2000-05-22 2002-12-03 Compaq Information Technologies Group, L.P. Apparatus and method for controlling rate of playback of audio data
US6972802B2 (en) * 1997-10-21 2005-12-06 Bray J Richard Language filter for home TV
US7013273B2 (en) * 2001-03-29 2006-03-14 Matsushita Electric Industrial Co., Ltd. Speech recognition based captioning system
US7129990B2 (en) * 2001-01-11 2006-10-31 Jaldi Semiconductor Corp. System and method for detecting a non-video source in video signals
US7443449B2 (en) * 2003-03-31 2008-10-28 Kabushiki Kaisha Toshiba Information display apparatus, information display method and program therefor
US7446817B2 (en) * 2004-02-18 2008-11-04 Samsung Electronics Co., Ltd. Method and apparatus for detecting text associated with video

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0746500A (en) 1993-07-02 1995-02-14 Sony Corp Television receiver
JPH07123376A (en) 1993-10-20 1995-05-12 Hitachi Ltd Teletext broadcasting receiver
JPH07226907A (en) 1994-02-15 1995-08-22 Sony Corp Video signal reproducing device and video recording medium
US5703655A (en) * 1995-03-24 1997-12-30 U S West Technologies, Inc. Video programming retrieval using extracted closed caption data which has been partitioned and stored to facilitate a search and retrieval process
US5995155A (en) * 1995-07-17 1999-11-30 Gateway 2000, Inc. Database navigation system for a home entertainment system
JPH0965295A (en) 1995-08-21 1997-03-07 Toshiba Corp Vbi information receiver
US5657088A (en) * 1995-12-22 1997-08-12 Cirrus Logic, Inc. System and method for extracting caption teletext information from a video signal
JP4106698B2 (en) 1996-12-24 2008-06-25 ソニー株式会社 Television receiver having closed caption function and video signal processing method in television receiver having closed caption function
US7139031B1 (en) * 1997-10-21 2006-11-21 Principle Solutions, Inc. Automated language filter for TV receiver
US6542200B1 (en) * 2001-08-14 2003-04-01 Cheldan Technologies, Inc. Television/radio speech-to-text translating processor

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5481296A (en) * 1993-08-06 1996-01-02 International Business Machines Corporation Apparatus and method for selectively viewing video information
US6972802B2 (en) * 1997-10-21 2005-12-06 Bray J Richard Language filter for home TV
US6243645B1 (en) * 1997-11-04 2001-06-05 Seiko Epson Corporation Audio-video output device and car navigation system
US6490553B2 (en) * 2000-05-22 2002-12-03 Compaq Information Technologies Group, L.P. Apparatus and method for controlling rate of playback of audio data
US6505153B1 (en) * 2000-05-22 2003-01-07 Compaq Information Technologies Group, L.P. Efficient method for producing off-line closed captions
US7129990B2 (en) * 2001-01-11 2006-10-31 Jaldi Semiconductor Corp. System and method for detecting a non-video source in video signals
US20020101537A1 (en) * 2001-01-31 2002-08-01 International Business Machines Corporation Universal closed caption portable receiver
US7013273B2 (en) * 2001-03-29 2006-03-14 Matsushita Electric Industrial Co., Ltd. Speech recognition based captioning system
US7443449B2 (en) * 2003-03-31 2008-10-28 Kabushiki Kaisha Toshiba Information display apparatus, information display method and program therefor
US7446817B2 (en) * 2004-02-18 2008-11-04 Samsung Electronics Co., Ltd. Method and apparatus for detecting text associated with video

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9781409B2 (en) 2012-08-17 2017-10-03 Nec Corporation Portable terminal device and program

Also Published As

Publication number Publication date
US7443449B2 (en) 2008-10-28
JP4170808B2 (en) 2008-10-22
JP2004304531A (en) 2004-10-28
US20090040378A1 (en) 2009-02-12
US20080025698A1 (en) 2008-01-31
US20040252979A1 (en) 2004-12-16
US8212922B2 (en) 2012-07-03

Similar Documents

Publication Publication Date Title
US6546092B2 (en) Video caller identification systems and methods
US6243448B1 (en) Video caller identification systems and methods
US5602598A (en) Television receiver with caption display
US5671019A (en) Character information display apparatus for a partial and a full-screen display
KR100912212B1 (en) Audiovisual (AV) device and control method thereof
JP2859476B2 (en) Caption signal display device and caption signal display method
JP4492462B2 (en) Electronic device, video processing apparatus, and video processing method
AU2009226267B2 (en) Display device with object-oriented stereo sound coordinate display
CN1051895C (en) 'Channel guide' TV system automatically activated by absence of program information
US8195029B2 (en) Content viewing support apparatus and content viewing support method, and computer program
US5917781A (en) Apparatus and method for simultaneously reproducing audio signals for multiple channels
US7467088B2 (en) Closed caption control apparatus and method therefor
US5900908A (en) System and method for providing described television services
KR950011655B1 (en) Channel & broadcasting station marked apparatus
US20030190147A1 (en) Method for reproducing sub-picture data in optical disc device, and method for displaying multi-text in optical disc device
US5696868A (en) Apparatus and method for recording/playing back broadcasting signal
JP2005522111A (en) Method, apparatus, and program for providing slow motion advertisement to video information
JP4550044B2 (en) Audio visual playback system and audio visual playback method
US8161504B2 (en) Systems and methods for memorializing a viewer's viewing experience with captured viewer images
JP3615195B2 (en) Content recording / playback apparatus and content editing method
JP4172379B2 (en) Recording / playback device
EP2066113A2 (en) Audio processing apparatus, video processing apparatus, and audiovisual system
EP1631080A2 (en) Video apparatus and method for controlling the same
KR101360316B1 (en) System and method for closed captioning
EP0988752B1 (en) A tv receiver with an electronic program guide (epg)

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION