CN104954831A - Method for look data definition and transmission on high definition multimedia interface

Method for look data definition and transmission on high definition multimedia interface

Info

Publication number
CN104954831A
CN104954831A (application CN201510266221.8A)
Authority
CN
China
Prior art keywords
video content
metadata
described video
hdmi
control
Prior art date
Legal status
Pending
Application number
CN201510266221.8A
Other languages
Chinese (zh)
Inventor
伊苟·托拜厄斯·道瑟尔
赖纳·茨威
沃尔夫冈·安德芮斯
Current Assignee
InterDigital CE Patent Holdings SAS
Original Assignee
Thomson Licensing SAS
Priority date
Filing date
Publication date
Application filed by Thomson Licensing SAS filed Critical Thomson Licensing SAS
Priority claimed from CN200880126062.XA external-priority patent/CN101933094A/en
Publication of CN104954831A publication Critical patent/CN104954831A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • G11B27/3063Subcodes
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements 
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84Generation or processing of descriptive data, e.g. content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/641Multi-purpose receivers, e.g. for auxiliary information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/67Circuits for processing colour signals for matrixing

Abstract

The invention provides a method for look data definition and transmission on a high definition multimedia interface. The method includes the steps of generating metadata for video content, preparing the video content and the metadata, and transmitting the video content and the metadata over the high definition multimedia interface. The metadata is used by the content creator to modify the video content before it is displayed, taking into account variations between different display devices and between different creative intents.

Description

Method for look data definition and transmission on a High Definition Multimedia Interface (HDMI)
This application is a divisional of the patent application for invention with the filing date of January 31, 2008, application number 200880126062.X, and the title "Method and system for look data definition and transmission on a High Definition Multimedia Interface (HDMI)".
Cross-Reference to Related Applications
This application is related to the non-provisional application with attorney docket number PU070307, entitled "Method and system for look data definition and transmission", filed concurrently herewith, the entire contents of which are incorporated herein by reference.
Technical Field
The present principles relate generally to multimedia interfaces and, more particularly, to a method and system for look data definition and transmission on a High Definition Multimedia Interface (HDMI).
Background
Currently, when a video content product is delivered for home or professional use, a single color decision is made for that delivered video product, generally representing the intent of the video content creator. Different usage practices for the content have emerged, however, which may make it necessary to modify the color decision for the content. For example, these different usage practices include different display types, such as front-projection displays, direct-view displays or portable displays, each of which requires certain changes to the color decision in order to provide the best presentation of the video content.
Summary of the Invention
Methods and systems according to various embodiments of the present principles address the deficiencies of the prior art by providing look data definition and transmission on a High Definition Multimedia Interface (HDMI).
According to one aspect of the present principles, a method is provided. The method includes generating metadata for video content. The metadata is used by the content creator to modify the video content before it is displayed, taking into account variations between different display devices and between different creative intents. The method also includes preparing the video content and the metadata for transmission over a High Definition Multimedia Interface.
According to another aspect of the present principles, a system is provided. The system includes a metadata generator and a metadata transmission preparer. The metadata generator generates metadata for the video content. The metadata is used by the content creator to modify the video content before it is displayed, taking into account variations between different display devices and between different creative intents. The metadata transmission preparer prepares the video content and the metadata for transmission over a High Definition Multimedia Interface.
Brief Description of the Drawings
The teachings of the present principles can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
Fig. 1 shows a high-level block diagram of a system 100 for sending look data over a High Definition Multimedia Interface connection according to an embodiment of the present invention;
Fig. 2 shows a more detailed high-level block diagram of the system 100 of Fig. 1 implementing sequential look data transmission according to an embodiment of the present invention;
Fig. 3 shows a more detailed high-level block diagram of the system 100 of Fig. 1 implementing parallel look data transmission according to an alternative embodiment of the present invention;
Fig. 4 shows a flow chart of a method for look data definition and transmission according to an embodiment of the present invention;
Fig. 5 shows an exemplary representation of look data 500 according to an embodiment of the present invention;
Fig. 6 shows an exemplary KLV notation for metadata 600 used in a Look Data Elementary Message according to an embodiment of the present invention;
Fig. 7 shows the KLV notation of the metadata 600 of Fig. 6 in greater detail according to an embodiment of the present invention;
Fig. 8 shows an exemplary Look Data Elementary Message 800 implemented as a 3D-LUT with a bit depth of 8 bits according to an embodiment of the present invention;
Fig. 9 shows an exemplary Look Data Elementary Message 900 implemented as a 3D-LUT with a bit depth of 10 bits according to an embodiment of the present invention;
Fig. 10 shows an exemplary Look Data Elementary Message 1000 implemented as a 1D-LUT with a bit depth of 8 bits according to an embodiment of the present invention;
Fig. 11 shows an exemplary Look Data Elementary Message 1100 implemented as a 1D-LUT with a bit depth of 10 bits according to an embodiment of the present invention;
Fig. 12 shows an exemplary Look Data Elementary Message 1200 implemented as a 3 x 3 matrix with a bit depth of 8 bits according to an embodiment of the present invention;
Fig. 13 shows an exemplary Look Data Elementary Message 1300 implemented as a 3 x 3 matrix with a bit depth of 10 bits according to an embodiment of the present invention;
Fig. 14 shows an exemplary Look Data Elementary Message 1400 implemented as a 3 x 3 matrix with a bit depth of 16 bits according to an embodiment of the present invention;
Fig. 15 shows an exemplary filter bank 1500 for frequency response modification according to an embodiment of the present invention;
Fig. 16 shows exemplary discrete frequencies 1600 for frequency equalization according to an embodiment of the present invention;
Fig. 17 shows an exemplary Look Data Elementary Message 1700 for 8-bit frequency equalization according to an embodiment of the present invention;
Fig. 18 shows an exemplary Look Data Elementary Message 1800 for motion behavior according to an embodiment of the present invention;
Fig. 19 shows an exemplary Look Data Elementary Message 1900 for film grain according to an embodiment of the present invention;
Fig. 20 shows an exemplary Look Data Elementary Message 2000 for noise according to an embodiment of the present invention;
Fig. 21 shows an exemplary Look Data Elementary Message 2100 for time editing, which can be used for editing control, according to an embodiment of the present invention;
Fig. 22 shows an exemplary Look Data Elementary Message 2200 for tone mapping according to an embodiment of the present invention;
Fig. 23 shows an exemplary HDMI gamut metadata packet 2300 to which an embodiment of the present invention can be applied;
Fig. 24 shows an exemplary HDMI gamut metadata header 2400 to which an embodiment of the present invention can be applied;
Fig. 25 shows an exemplary gamut metadata packet 2500 of HDMI version 1.3 to which an embodiment of the present invention can be applied;
Fig. 26 shows an exemplary look data packet header 2600 for a vendor-specific info frame according to an embodiment of the present invention;
Fig. 27 shows an exemplary vendor-specific info frame 2700 of CEA-861-D to which the present principles can be applied according to an embodiment of the present invention;
Fig. 28 shows exemplary vendor-specific CEC commands 2800 for sending look data according to an embodiment of the present invention;
Fig. 29 shows an exemplary network layer model 2900 for the Consumer Electronics Control bus according to an embodiment of the present invention;
Fig. 30 shows a high-level diagram of an exemplary CEC process 3000 for providing application-to-application communication using CEC vendor-specific commands according to an embodiment of the present invention;
Fig. 31 shows a high-level block diagram illustrating an exemplary conversion 3100 of a look data packet into frames for transmission over HDMI according to an embodiment of the present invention;
Fig. 32 shows an exemplary Look Data Elementary Message 3200 for look data validation according to an embodiment of the present invention;
Fig. 33 shows an exemplary device 3300 for generating a CEC validate signal according to an embodiment of the present invention;
Fig. 34 shows a flow chart of a method for validate signal transmission over CEC according to an embodiment of the present invention.
It should be appreciated that the drawings are for the purpose of illustrating the concepts of the invention and are not necessarily the only possible configurations for illustrating the invention. To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
Detailed Description of the Embodiments
The present principles advantageously provide a method and system for look data definition and transmission on a High Definition Multimedia Interface (HDMI). Although the present principles are described primarily within the context of a transmission system involving a source device and a display device, the specific embodiments of the present invention should not be treated as limiting the scope of the invention.
The functions of the various elements shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM"), random access memory ("RAM") and non-volatile storage for storing software. Moreover, all statements herein reciting principles, aspects and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (that is, any elements developed that perform the same function, regardless of structure).
Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode and the like represent various processes which may be substantially represented in computer-readable media and so executed by a computer or a processor, whether or not such computer or processor is explicitly shown.
Reference in the specification to "one embodiment" or "an embodiment" of the present principles means that a particular feature, structure, characteristic and so forth described in connection with the embodiment is included in at least one embodiment of the present principles. Thus, the appearances of the phrase "in one embodiment" or "in an embodiment" in various places throughout the specification are not necessarily all referring to the same embodiment.
Furthermore, as used herein with respect to the transmission and reception of metadata, the phrase "in-band" refers to the metadata being sent and/or received together with the color-corrected picture content to be displayed by a consumer device. Conversely, the phrase "out-of-band" refers to the metadata being transmitted and/or received separately from the color-corrected picture content to be displayed by the consumer device.
Furthermore, as used herein, the term "scene" refers to a range of picture frames in a motion picture, typically originating from a single "shot", meaning a continuous sequence of pictures between scene cuts.
Furthermore, as used herein, the phrase "look data management" refers to the editing, transmission and application of look data.
Furthermore, as used herein, the phrase "disc player" refers to any one of a standard-definition digital video disc player, a BLU-RAY device for reproducing digital video discs, a high-definition digital video disc player, and the like.
Furthermore, as used herein, an "unused gamut profile" refers to a gamut profile that is not currently used by version 1.3 (or any previous version) of the HDMI standard.
Furthermore, as used herein, the phrase "look data" and the term "metadata" (when the term "metadata" relates to such look data) refer to data for and/or relating to color appearance, spatial filtering, motion behavior, film grain, noise, editing and tone mapping, such as integer values, non-integer values and/or Boolean values. Such look data and/or metadata can be used to control, enable or disable the respective mechanisms for implementing the aforementioned processing, and can be used to modify their functions. In addition, look data and/or metadata can include specifications of mapping tables.
For example, in an embodiment relating to color appearance, color mapping can be implemented using a 1D-LUT (one-dimensional look-up table), a 3D-LUT (three-dimensional look-up table) and/or a 3 x 3 matrix. As an example, in the case of a 3D-LUT, the LUT receives three input values, each representing one color component (red, green or blue), and generates a predefined three-dimensional output value (for example, red, green and blue) for each individual red, green and blue input triple. In this case, the metadata sent from the content source to the content consumer (for example, a display device) will include the LUT specification.
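As an informal illustration of the 3D-LUT case just described, the following Python sketch maps an 8-bit RGB triple through a predefined three-dimensional table. The 17-node grid size and the nearest-node lookup are assumptions made only for the example; they are not taken from the patent or the HDMI specification.

```python
# Minimal sketch of a 3D-LUT color transform, assuming an N x N x N grid of
# 8-bit RGB output triples and nearest-node lookup (real devices interpolate).
N = 17  # assumed node count per axis

# Identity-like LUT for demonstration: each node maps back to its own grid color.
lut = [[[(r * 255 // (N - 1), g * 255 // (N - 1), b * 255 // (N - 1))
         for b in range(N)] for g in range(N)] for r in range(N)]

def apply_3d_lut(rgb, lut, nodes=N):
    """Map an 8-bit (R, G, B) triple through the 3D-LUT (nearest node)."""
    r_i, g_i, b_i = (round(c * (nodes - 1) / 255) for c in rgb)
    return lut[r_i][g_i][b_i]

print(apply_3d_lut((200, 30, 120), lut))   # -> the predefined output triple
```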
Another embodiment can relate to the specification of a mapping function, such as circuitry performing "GOG" (Gain, Offset, Gamma), where GOG is defined as follows:
Vout = Gain * (Offset + Vin) ^ Gamma, for each color component.
In this case, the look data and/or metadata will include nine (9) values, namely one set of Gain, Offset and Gamma for each of the three color components.
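A minimal sketch of this GOG mapping is given below, applied independently per color component as the formula above states. The normalized 0..1 value range and the example coefficient values are assumptions for illustration only.

```python
# Sketch of the GOG mapping Vout = Gain * (Offset + Vin) ** Gamma, applied
# independently to each color component with its own (Gain, Offset, Gamma) triple.
def apply_gog(rgb, params):
    """rgb: (R, G, B) in the range 0..1; params: {channel: (gain, offset, gamma)}."""
    out = []
    for value, channel in zip(rgb, ("R", "G", "B")):
        gain, offset, gamma = params[channel]
        out.append(gain * (offset + value) ** gamma)
    return tuple(out)

# The nine metadata values: one (Gain, Offset, Gamma) triple per color component.
gog_params = {"R": (1.0, 0.0, 2.2), "G": (1.0, 0.02, 2.2), "B": (0.95, 0.0, 2.4)}
print(apply_gog((0.5, 0.5, 0.5), gog_params))
```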
The look data used herein are used to affect these mechanisms; several sets of look data can exist in order to enable the transmission/storage of more than one look.
Of course, the present principles are not limited to the above-described embodiments and, given the teachings of the principles provided herein, these and other implementations involving look data and/or metadata can be readily contemplated by those of ordinary skill in this and related arts while maintaining the spirit of the present principles. Look data are further described below, at least with reference to Fig. 5.
For example, Fig. 1 shows a high-level block diagram of a system 100 for sending look data over a High Definition Multimedia Interface connection according to an embodiment of the present invention. The system 100 of Fig. 1 illustratively comprises and/or involves a content source device 110, a High Definition Multimedia Interface (HDMI) connection device 120 and a display device 130. It should be understood that the content source device 110 can be, but is not limited to, a high-definition digital video disc player, a BLU-RAY player or a network access unit (such as, but not limited to, a set-top box (STB)). The content source device 110 provides content to be sent to the display device 130 via the HDMI connection device 120 for display. Metadata comprising, for example, look data can be supplied by the content source 110 to the display device 130, and by the display device 130 to the content source 110. It should be understood that the High Definition Multimedia Interface (HDMI) connection device 120 can include, but is not limited to, an HDMI cable.
The display device 130 (and/or one or more devices placed between the transmission medium 120 and the display device 130 and connected to those devices) can include a receiver 161, a storage device 162 and/or a metadata applier 162 for respectively receiving, storing and applying the metadata.
For example, Fig. 2 shows a more detailed high-level block diagram of the system 100 of Fig. 1 using sequential look data transmission according to an embodiment of the present invention. In the embodiment of Fig. 2, content 202 and a look data database 204 reside at the content creation portion 210 of the system 100. In the embodiment of Fig. 2, the look data database 204 stores look data. In one embodiment, the content creation portion 210 of the system 100 comprises a look data generator 288 for generating the look data 206 and a look data transmission preparer 299 for preparing the look data for transmission over HDMI, as described further below. The content 202 and the look data 206 are combined 207 at the content creation portion 210. Using one or more transmission and/or storage media 220, the content 202 and the corresponding look data 206 are sent sequentially to the content display portion 230 of the system 100, where the content 202 and the look data 206 are separated and processed. The content display portion 230 of the system 100 can comprise, for example, the display device 130 shown in Fig. 1. The look data 206 can then be stored in a look data database 232 located at the content display portion 230 of the system 100. It should be understood that the transmission and/or storage media 220 shown in Fig. 2 cover the sequential transmission and/or storage of the content 202 and/or the look data 206.
Fig. 3 shows a more detailed high-level block diagram of the system 100 of Fig. 1 implementing parallel look data transmission according to an alternative embodiment of the present invention. In the embodiment of Fig. 3, content 302 and a look data database 304 reside at the content creation portion 310 of the system 300. In the embodiment of Fig. 3, the look data database 304 stores look data 306. In one embodiment, the content creation portion 310 of the system 300 comprises a look data generator 388 for generating the look data 306 and a look data transmission preparer 399 for preparing the look data 306 for transmission over HDMI, as described below. The content 302 and the look data 306 are combined 307 at the content creation portion 310. Using one or more transmission and/or storage media 320, the content 302 and the corresponding look data 306 are transmitted in parallel to the content display portion 330 of the system 300, where the content 302 and the look data 306 are separated and processed. The content display portion 330 of the system 300 can comprise, for example, the display device 130 shown in Fig. 1. The look data 306 can then be stored in a look data database 332 located at the content display portion 330 of the system 300. It should be understood that the transmission and/or storage media 320 shown in Fig. 3 cover the parallel transmission and/or storage of the content 302 and/or the look data 306.
Fig. 4 shows a flow chart of a method for look data definition and transmission according to an embodiment of the present invention. The method 400 starts at step 402, in which look data are generated for video content. The look data relate to, but are not limited to, color appearance, spatial filtering, motion behavior, film grain, noise, editing, tone mapping and the like. The look data can be used to control, enable or disable the respective mechanisms for implementing the above processing, and are used to modify their functions. The method then proceeds to step 404.
At step 404, the look data are prepared for transmission, which includes, but is not limited to, generating one or more Look Data Elementary Messages from the look data generated previously at step 402, generating one or more look data packets each comprising one or more Look Data Elementary Messages, storing the look data on a disc, and so forth. The method then proceeds to step 406.
At step 406, the look data and the video content are sent to a display device using HDMI. This transmission can involve, but is not limited to, for example, the use of HDMI color metadata, CEA/HDMI vendor-specific info frames, the HDMI/CEC (Consumer Electronics Control) protocol, and the like. For transmission using HDMI color metadata, this can involve using the Gamut Boundary Description (GBD) metadata container. For transmission using CEA/HDMI vendor-specific info frames, this can involve applying the GBD stream protocol to the vendor-specific info frame. For transmission using the HDMI CEC protocol, this can involve adding a network abstraction layer on top of CEC, enabling quality of service (QoS), and timing CEC to the video. The method then proceeds to step 408.
At step 408, the video content is modified in accordance with the received and stored look data, and the modified video content is displayed on the display device. The method 400 can then be exited.
It should be understood that the order of the aforementioned receiving, storing and modifying can be changed according to the actual use. For example, the storing can correspond to providing the metadata on a storage medium and/or to temporarily storing it at the content rendition side for subsequent processing.
In one embodiment of the invention, the principles of the invention can be used for content created for high-definition digital video discs (HD-DVD) and/or BLU-RAY discs by the following process: the content is encoded according to the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) Moving Picture Experts Group-4 (MPEG-4) Part 10 Advanced Video Coding (AVC) standard/International Telecommunication Union, Telecommunication Sector (ITU-T) H.264 recommendation (hereinafter the "MPEG-4 AVC standard"), the content is stored on the disc, and a signal processing unit in the display is then controlled to modify the video content for display. In this application, the look data are stored on the disc. The look data are then sent to the display using the High Definition Multimedia Interface (HDMI). Various illustrative methods for sending look data using HDMI are described herein. Of course, it should be understood that the present principles are not limited to the described embodiments and, given the teachings of the principles provided herein, these and various other embodiments and variations thereof will be contemplated by those of ordinary skill in this and related arts while maintaining the spirit of the present principles.
It should be understood that, in various embodiments, the present principles can be used in professional or semi-professional environments, including but not limited to the processing of "Digital Dailies" in motion picture production.
Scene boundary data
Fig. 5 shows an exemplary representation of look data 500 according to an embodiment of the present principles. The look data 500 of Fig. 5 illustratively comprise look data packets 510, one look data packet per scene or scene sequence 515. It should be noted that the scene boundaries are generally defined by the content producer. As shown in Fig. 5, each look data packet (LDP) 510 can comprise one or more Look Data Elementary Messages 520. Each Look Data Elementary Message (LDEM) can comprise parameters 525 that control an information processing unit used for content rendering and/or display. More specifically, according to an embodiment of the invention, the look data packets 510 and the Look Data Elementary Messages 520, together with the parameters 525, are sent or delivered with the respective video content to a display system comprising a content rendering device. At the display system, the content rendering device (for example, a display or the decoder of a set-top box) applies the look data packet 510 to the respective video content according to the parameters in the Look Data Elementary Messages 520, thereby affecting or changing the display properties of the scene or scene sequence for which the look data were created.
In one embodiment, if the look data 500 are determined to be identical between scenes 515, the look data 500 can be shared between the scenes 515 by not updating the look data 500 at the intervening scene changes. The look data 500 thus remain valid until they are invalidated or updated. Invalidation can include setting a "data valid" flag in a Look Data Elementary Message to "FALSE", which disables the application of the LDEM metadata. An alternative is to send a new LDEM with the same tag identifier (Tag ID).
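This lifetime rule — look data stays in force until it is invalidated or replaced — can be pictured with the short Python sketch below. The dictionary-based store and the field names are assumptions made for the example, not part of the specification.

```python
# Sketch of scene-to-scene look data validity: an LDEM stays active until its
# "data valid" flag arrives as FALSE or a new LDEM with the same Tag ID replaces it.
active_ldems = {}  # Tag ID -> LDEM payload currently applied by the display

def on_ldem_received(tag_id, data_valid, payload):
    if not data_valid:
        active_ldems.pop(tag_id, None)   # invalidate: stop applying this LDEM
    else:
        active_ldems[tag_id] = payload   # new or updated look data for this Tag ID

on_ldem_received(0x11, True, b"...3D-LUT data...")   # a scene sets a look
on_ldem_received(0x11, False, b"")                   # a later scene disables it
print(active_ldems)
```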
Look data packet
In one embodiment of the invention, the "KLV" (key, length, value) metadata concept is implemented for look data packet (Look Data Packet) transmission, although other known look data packet transmission concepts can also be implemented. That is, although one or more embodiments are described with respect to the KLV metadata concept, it should be understood that the present principles are not limited to implementations of the KLV metadata concept only, and other methods of implementing look data packets can also be used according to various embodiments of the present invention while maintaining the spirit of the present principles.
The KLV concept is very useful for a transmitting device to understand when a data packet transmission terminates without having to parse the content. This is illustrated in Fig. 6 and Fig. 7. For example, Fig. 6 shows an exemplary KLV notation for metadata 600 used in a Look Data Elementary Message according to an embodiment of the present invention. Fig. 7 shows the KLV notation of the metadata 600 of Fig. 6 in greater detail according to an embodiment of the present invention.
More specifically, referring to Fig. 6 and Fig. 7, each data packet can comprise a "key" field 610 indicating the nature of the message (for example, that the message relates to "look data"). The "key" field can comprise a time stamp 617 or, alternatively, a "scene identifier (scene ID)", so that the receiving unit knows for which scene the data must be ready to be applied immediately. It should be understood that the time stamp 617 and/or the scene ID are optional and can be used, for example, in systems with time code tracking. In addition, each data packet can comprise a length field 620 indicating the number of words in the payload portion of the data packet. It should be understood that the length field 620 is also optional; its use can depend, for example, on the metadata tag.
In addition, each data packet can comprise a value field 630 for carrying the payload portion of the data packet. In one embodiment, the word size of the payload content can be determined by the metadata tag. In one embodiment of the invention, the payload can comprise, for example, individual "Look Data Elementary Messages", in which case another layer of KLV can be used or, alternatively, only KV (key and value).
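The KLV layout just described can be sketched as follows in Python. The one-byte key, the one-byte scene-ID field and the two-byte big-endian length are illustrative assumptions about field widths, chosen only to make the example concrete; the key value 0x4C and scene number are likewise invented for the example.

```python
import struct

# Sketch of KLV ("key, length, value") packing for a look data packet payload.
# Assumed field widths: 1-byte key, 1-byte scene ID, 2-byte big-endian length.
def pack_klv(key, scene_id, payload):
    return struct.pack(">BBH", key, scene_id, len(payload)) + payload

def unpack_klv(packet):
    key, scene_id, length = struct.unpack(">BBH", packet[:4])
    return key, scene_id, packet[4:4 + length]

ldem = b"\x11\x01..."                     # e.g. a Look Data Elementary Message
packet = pack_klv(0x4C, 7, ldem)          # assumed key ("look data") and scene 7
print(unpack_klv(packet))
```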
Look Data Elementary Messages
1. Color appearance
In one embodiment of the invention, color appearance can be defined in a Look Data Elementary Message. That is, color appearance can be implemented, for example, by one or more 3D-LUTs, one or more 1D-LUTs and/or one or more 3 x 3 matrices. Example definitions of such Look Data Elementary Messages are provided in Figs. 8 to 14.
More specifically, Fig. 8 shows an exemplary Look Data Elementary Message 800 implemented as a 3D-LUT with a bit depth of 8 bits according to an embodiment of the present invention. As shown in Fig. 8, the Look Data Elementary Message 800 comprises a Tag ID portion 810 and a value portion 820. The value portion 820 illustratively includes a validity portion, a color space definition portion, a length definition portion and a value portion. Each portion of the Look Data Elementary Message 800 of Fig. 8 comprises a respective description and name portion. The Tag ID portion 810 of Fig. 8 defines the 8-bit ID of the 3D-LUT, which is illustratively 0x11. In the value portion 820, the validity portion defines whether the data are valid, illustratively defined in Fig. 8 as a Boolean value. In the value portion 820, the color space portion defines the color space, illustratively defined in Fig. 8 as [00] = RGB, [01] = XYZ, [10] = YCrCb, and [11] = reserved.
The length definition portion in the value portion 820 of Fig. 8 defines the length of the payload in bytes (illustratively assuming 8-bit node data). In addition, the value portion defines various values such as the spacing of the LUT node data and the input data (illustratively assumed to be fixed intervals), and the word definition and order (illustratively "first word RED, CIE_X or Y", "second word GREEN, CIE_Y or Cr", "third word BLUE, CIE_Z or Cb"). In the Look Data Elementary Message 800 of Fig. 8, the value portion also illustratively defines the grid scan as "BLUE changes first, then GREEN, then RED".
Fig. 9 shows an exemplary Look Data Elementary Message 900 implemented as a 3D-LUT with a bit depth of 10 bits according to an embodiment of the present invention. The Look Data Elementary Message 900 of Fig. 9 is substantially similar to the Look Data Elementary Message of Fig. 8, except that in Fig. 9 the ID of the 3D-LUT has a bit depth of 10 bits and the value 0x12. In addition, in the Look Data Elementary Message 900 of Fig. 9, the length definition portion defines the length of the payload, illustratively assumed to be 10-bit data packed into 32-bit words. Furthermore, in the embodiment of Fig. 9, the value portion further defines the words "RED", "GREEN" and "BLUE" as follows:
Word = RED << 20 + GREEN << 10 + BLUE.
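The 32-bit word packing defined by this formula (and used again for the 10-bit 1D-LUT of Fig. 11) can be sketched as follows; this is an informal illustration, not normative code.

```python
# Sketch of packing three 10-bit color values into one 32-bit word:
# word = RED << 20 | GREEN << 10 | BLUE  (the top two bits stay unused).
def pack_10bit(red, green, blue):
    assert all(0 <= v < 1024 for v in (red, green, blue))
    return (red << 20) | (green << 10) | blue

def unpack_10bit(word):
    return (word >> 20) & 0x3FF, (word >> 10) & 0x3FF, word & 0x3FF

word = pack_10bit(1023, 512, 0)
print(hex(word), unpack_10bit(word))
```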
Fig. 10 shows an exemplary Look Data Elementary Message 1000 implemented as a 1D-LUT with a bit depth of 8 bits according to an embodiment of the present invention. In the Look Data Elementary Message 1000 of Fig. 10, the ID of the 1D-LUT has a bit depth of 8 bits and the value 0x13. Unlike the Look Data Elementary Messages of Figs. 8 and 9 above, in the Look Data Elementary Message 1000 of Fig. 10 a color definition portion defines the color, i.e. whether the LUT is to be applied to the red channel, the green channel, the blue channel, or to all channels. In Fig. 10, the color values are illustratively defined as [00] = red or CIE_X or Y, [01] = green or CIE_Y or Cr, [10] = blue or CIE_Z or Cb, and [11] = all channels. In addition, in the Look Data Elementary Message 1000, the value portion defines the LUT output data, which are expected to be 256 8-bit values starting with the output value for the lowest input value.
Fig. 11 shows an exemplary Look Data Elementary Message 1100 implemented as a 1D-LUT with a bit depth of 10 bits according to an embodiment of the present invention. The Look Data Elementary Message 1100 of Fig. 11 is substantially similar to the Look Data Elementary Message 1000 of Fig. 10, except that in the embodiment of Fig. 11 the Look Data Elementary Message 1100 comprises an ID with a bit depth of 10 bits and the value 0x14. In addition, in the Look Data Elementary Message 1100, the value portion defines the LUT output data, which are expected to be 1024 10-bit values starting with the output value for the lowest input value, with three 10-bit values packed into a 32-bit word as follows:
Word = LUT[0] << 20 + LUT[1] << 10 + LUT[2].
Fig. 12 shows an exemplary Look Data Elementary Message 1200 implemented as a 3 x 3 matrix with a bit depth of 10 bits according to an embodiment of the present invention. In the Look Data Elementary Message 1200, a color definition portion defines the matrix application, with the values [00] = RGB to RGB (gamma), [01] = RGB to RGB (linear) and [11] = XYZ to XYZ. In addition, in the Look Data Elementary Message 1200 of Fig. 12, the value portion defines the expected coefficient values as nine 10-bit values of the following form:
[B1]   [C1 C2 C3]   [A1]
[B2] = [C4 C5 C6] x [A2]
[B3]   [C7 C8 C9]   [A3]
where A1 and B1 are red or CIE_X, A2 and B2 are green or CIE_Y, A3 and B3 are blue or CIE_Z, and the coefficient order is C1-C2-C3. In the Look Data Elementary Message 1200 of Fig. 12, the value portion defines that three coefficients are packed into one 32-bit word, so that the total payload is 3 x 32 bits = 96 bits, with the following value:
Word = C1 << 20 + C2 << 10 + C3.
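A small sketch of the 3 x 3 matrix transform B = C * A and of the three-coefficients-per-word packing described above follows; the integer coefficient values are arbitrary examples, not taken from the patent.

```python
# Sketch of the 3x3 matrix color transform B = C * A and of packing the nine
# 10-bit coefficients three per 32-bit word (payload = 3 x 32 bits = 96 bits).
def apply_matrix(c, a):
    """c: nine coefficients C1..C9 in row order; a: input triple (A1, A2, A3)."""
    return tuple(c[3 * row] * a[0] + c[3 * row + 1] * a[1] + c[3 * row + 2] * a[2]
                 for row in range(3))

def pack_coefficients(c):
    """Pack C1..C9 (10-bit values) into three 32-bit words: C1<<20 | C2<<10 | C3, ..."""
    return [(c[i] << 20) | (c[i + 1] << 10) | c[i + 2] for i in range(0, 9, 3)]

coeffs = [512, 0, 0, 0, 512, 0, 0, 0, 512]      # assumed example: a scaled identity
print(apply_matrix(coeffs, (100, 200, 300)))
print([hex(w) for w in pack_coefficients(coeffs)])
```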
Fig. 13 shows an exemplary Look Data Elementary Message 1300 implemented as a 3 x 3 matrix with a bit depth of 8 bits according to an embodiment of the present invention. The Look Data Elementary Message 1300 of Fig. 13 is substantially similar to the Look Data Elementary Message 1200 of Fig. 12, except that in the embodiment of Fig. 13 the Look Data Elementary Message 1300 comprises an ID with a bit depth of 8 bits and the value 0x16. In addition, in the Look Data Elementary Message 1300 of Fig. 13, the total payload is 9 x 8 bits = 72 bits.
Fig. 14 shows an exemplary Look Data Elementary Message 1400 implemented as a 3 x 3 matrix with a bit depth of 16 bits according to an embodiment of the present invention. The Look Data Elementary Message 1400 of Fig. 14 is substantially similar to the Look Data Elementary Message 1200 of Fig. 12 and the Look Data Elementary Message 1300 of Fig. 13, except that in the embodiment of Fig. 14 the Look Data Elementary Message 1400 comprises an ID with a bit depth of 16 bits and the value 0x17. In addition, in the Look Data Elementary Message 1400 of Fig. 14, the total payload is 9 x 16 bits = 144 bits.
2. Spatial filter
In an embodiment of the present invention, spatial filtering control can be specified in a Look Data Elementary Message. For example, spatial filters can be used to alter the spatial response or the frequency response. One illustrative method of changing the spatial frequency response is to use a bank of finite impulse response (FIR) filters, each of which is tuned to a specific center frequency. Fig. 15 shows an exemplary filter bank 1500 for frequency response modification according to an embodiment of the present invention. The filter bank 1500 of Fig. 15 illustratively comprises a plurality of filters 1510, at least one multiplier 1520 and at least one combiner 1530.
In one embodiment, the frequency response of the image is processed by changing the filter coefficients (C0 ... CN) in order to enhance or attenuate frequency details. For example, Fig. 16 shows exemplary discrete frequencies 1600 for frequency equalization according to an embodiment of the present invention. As shown in Fig. 16, the filter coefficients (C0 ... CN) can be specified by a Look Data Elementary Message for the frequency response.
For example, Fig. 17 shows an exemplary Look Data Elementary Message 1700 for 8-bit frequency equalization according to an embodiment of the present invention. As shown in the embodiment of Fig. 17, the Look Data Elementary Message 1700 defines the number of coefficients of the frequency equalizer (for example, up to 16, 4 bits), and defines that each coefficient controls one frequency band multiplier.
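The filter-bank idea of Figs. 15 to 17 — one FIR filter per center frequency, each output scaled by its band multiplier and the results summed — can be sketched as follows. The filter kernels and coefficient values are invented for the example and are not taken from the figures.

```python
# Sketch of a frequency-equalization filter bank: each band FIR filter output is
# scaled by its coefficient C0..CN and the results are summed (see Figs. 15-17).
def fir(signal, kernel):
    pad = len(kernel) // 2
    padded = [0] * pad + list(signal) + [0] * pad
    return [sum(padded[i + j] * kernel[j] for j in range(len(kernel)))
            for i in range(len(signal))]

def filter_bank(signal, kernels, coefficients):
    out = [0.0] * len(signal)
    for kernel, c in zip(kernels, coefficients):
        for i, v in enumerate(fir(signal, kernel)):
            out[i] += c * v
    return out

kernels = [[0.25, 0.5, 0.25],   # assumed low-band kernel
           [-0.5, 1.0, -0.5]]   # assumed high-band kernel
coeffs = [1.0, 0.3]             # band multipliers carried in the LDEM
print(filter_bank([0, 0, 1, 0, 0], kernels, coeffs))
```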
3. Motion behavior
In one embodiment, motion behavior control can be specified in a Look Data Elementary Message by using a message that includes information with which the motion behavior of the display can be calibrated to a desired motion characteristic. This information carries a specification of the desired motion characteristic and can carry helper data from the content processing that simplify the processing in the display. For example, Fig. 18 shows an exemplary Look Data Elementary Message 1800 for motion behavior according to an embodiment of the present invention. The Look Data Elementary Message 1800 of the embodiment of Fig. 18 illustratively defines the input frame rate in Hz (U8), the field repetition (U8), the desired display behavior (U16) and the eye motion trajectory in the x/y direction (2 x U32). In addition, the Look Data Elementary Message 1800 of the embodiment of Fig. 18 defines whether motion estimation or pre-processing is present.
4. Film grain
In an embodiment, film grain control can be specified in a Look Data Elementary Message. In one embodiment of the invention, the film grain message can be derived from the MPEG-4 AVC standard, payload type = 19. Fig. 19 shows an exemplary Look Data Elementary Message 1900 for film grain according to an embodiment of the present invention.
5. Noise
In an embodiment, noise control can be specified in a Look Data Elementary Message. That is, in a Look Data Elementary Message for noise, white noise of a defined level can be added to all color channels, or a specific level/characteristic can be added to each channel. Furthermore, in an embodiment, noise can be removed from one or more color channels. In one embodiment, the noise characteristic can be altered by modifying the frequency response in the same way as for the spatial response described above. Fig. 20 shows an exemplary Look Data Elementary Message 2000 for noise according to an embodiment of the present invention.
6. Editing
In an embodiment, the editing of one or more scenes can be specified in a Look Data Elementary Message. For example, according to a Look Data Elementary Message of the present invention, one or more segments of a group of scenes or of a scene can be cut. In this way, the edited scenes can subsequently be displayed using the updated editing data. Thus, in an embodiment, an "edit list" of IN and OUT time codes within a particular scene can be sent. In one embodiment, the first frame of a scene has the time code 00:00:00:00 (HH:MM:SS:FF). Fig. 21 shows an exemplary Look Data Elementary Message 2100 for time editing, which can be used for editing control, according to an embodiment of the present invention.
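A sketch of how such an edit list of IN/OUT time codes, with the first frame of the scene at 00:00:00:00, might be applied to keep or drop frames is given below. The 24 fps frame rate and the list format are assumptions for the example only.

```python
# Sketch of applying an edit list of (IN, OUT) time codes (HH:MM:SS:FF) to a scene;
# frames outside every IN..OUT range are dropped. A 24 fps frame rate is assumed.
FPS = 24

def timecode_to_frame(tc):
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

def keep_frame(frame_index, edit_list):
    return any(timecode_to_frame(tc_in) <= frame_index <= timecode_to_frame(tc_out)
               for tc_in, tc_out in edit_list)

edits = [("00:00:00:00", "00:00:02:23"), ("00:00:05:00", "00:00:06:11")]
print(keep_frame(50, edits), keep_frame(100, edits))   # True, False
```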
7. Tone mapping
In one embodiment, tone mapping is defined in a Look Data Elementary Message. Tone mapping can be used, for example, when a high dynamic range image is converted to a low dynamic range image. As an example, a common application is the conversion of a 10-bit encoded image to an 8-bit or 7-bit image. It should be understood that the present principles are not limited to any particular tone-mapping algorithm, and any tone-mapping method can be used while maintaining the spirit of the present principles. As an example, tone mapping can be specified in a Supplemental Enhancement Information (SEI) message in the MPEG-4 AVC standard. For example, Fig. 22 shows an exemplary Look Data Elementary Message 2200 for tone mapping according to an embodiment of the present invention. The Look Data Elementary Message 2200 of Fig. 22 can specify parameters that can also be specified in an SEI message.
Look data transmission
In HDMI, there are different methods for sending look data. Some illustrative methods for sending look data include, but are not limited to, the use of the "gamut metadata packet" for data other than gamut metadata, the use of "vendor-specific info frames", and the use of Consumer Electronics Control (CEC) vendor-specific commands.
1. HDMI color metadata
Note that the current version of the HDMI specification is version 1.3a; since version 1.3 of the HDMI specification there has been a new method for transmitting color metadata via HDMI. In one embodiment of the invention, not only is color metadata sent, but this transmission possibility is used to send "look data packets". It is therefore proposed to use a gamut profile (Gamut Profile) that is not used by the current HDMI specification, version 1.3a, for example GBD_profile = 7. HDMI specification version 1.3 allows for up to 800 HDMI data packets in a single transmission, although future versions of the specification may provide a different total number of packets. The time for this can last up to 10 video fields, which may also change with future versions of the interface specification. At 28 bytes per HDMI data packet, this amounts to 21.8 kilobytes.
Thus, in embodiments of the invention relating to this look data transmission, it should be ensured that a look data packet is not larger than the maximum size of the HDMI gamut metadata packets. Furthermore, because the look data packets may need to be adapted per scene, and a scene is defined as the range of video fields sharing the look data packet data, the scene preceding such an update instance should not be shorter than the time it takes to transmit the "look data packet" (LDP) of the current scene.
In order to use HDMI color metadata according to specification version 1.3, the length of the data packet is calculated and filled into the two bytes GBD_Length_H (high byte) and GBD_Length_L (low byte) of the gamut metadata packet, as shown in Fig. 23. That is, Fig. 23 shows an exemplary HDMI gamut metadata packet 2300 to which an embodiment of the present invention can be applied.
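Splitting the calculated packet length into GBD_Length_H and GBD_Length_L amounts to taking the high and low bytes of a 16-bit count, as in this sketch (the example length is an arbitrary value).

```python
# Sketch of filling GBD_Length_H / GBD_Length_L from the calculated packet length.
def gbd_length_bytes(length):
    assert 0 <= length <= 0xFFFF
    return (length >> 8) & 0xFF, length & 0xFF   # (high byte, low byte)

gbd_length_h, gbd_length_l = gbd_length_bytes(1200)   # assumed example length in bytes
print(gbd_length_h, gbd_length_l)                     # 4, 176
```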
In one embodiment, an optional checksum can be computed over the entire data packet, comprising the GBD header, the look data packet and any padding data (if present). Fig. 24 shows an exemplary HDMI gamut metadata header 2400 to which an embodiment of the present invention can be applied. Table 1 describes the semantics of the syntax elements shown in Fig. 24 according to an embodiment of the present invention. Portions of Fig. 24 and Table 1 are extracted from the "High Definition Multimedia Interface Specification Version 1.3a", where they are referred to as Table 5-30, "Gamut Metadata Packet Header".
Table 1
As mentioned above, the data can be divided into individual HDMI data packets for transmission, with 22 bytes used for the first GBD data packet and 28 bytes for all remaining data packets. If the last data packet cannot be completely filled with "look data packet" data, it needs to be filled with the aforementioned "padding data", which, according to one embodiment of the present invention, can comprise one or more zeros. For the data flow, the HDMI GBD data flow mechanism is used, with "Next_Field", "Affected_Gamut_Seq_Num", "Current_Gamut_Seq_Num" and "Packet_Seq" (see Fig. 24). When no look data are to be applied, a single transmission of "No_Current_GBD" with the corresponding "GBD_Profile" = 7 is sufficient to signal this request. All look data modifications of the video signal are then disabled until a new look data packet is sent.
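The segmentation rule just described — 22 bytes of look data in the first GBD packet, 28 bytes in each further packet, zero padding in the last one — can be sketched as follows; the byte counts come from the text, everything else is illustrative.

```python
# Sketch of splitting a look data packet into HDMI gamut metadata packets:
# 22 payload bytes in the first packet, 28 in each subsequent packet, and the
# last packet zero-padded to its full size.
def split_into_hdmi_packets(look_data_packet, first_size=22, next_size=28):
    packets = [bytearray(look_data_packet[:first_size]).ljust(first_size, b"\x00")]
    rest = look_data_packet[first_size:]
    for i in range(0, len(rest), next_size):
        packets.append(bytearray(rest[i:i + next_size]).ljust(next_size, b"\x00"))
    return packets

packets = split_into_hdmi_packets(bytes(range(60)))
print([len(p) for p in packets])   # [22, 28, 28] with the last one zero-padded
```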
Alternatively, the communication method described below for the "HDMI CEC protocol" can be used, but the GBD method is preferred because it features a built-in frame synchronization method.
Fig. 25 shows an exemplary gamut metadata packet 2500 of HDMI version 1.3 to which an embodiment of the present invention can be applied. The gamut metadata packet 2500 comprises an actual video portion 2510, a GBD message portion 2520 at the source, a VSYNC portion 2530, a gamut metadata packet portion 2540 and a conversion table portion 2550 at the receiving end. Portions of Fig. 25 are extracted from the "High Definition Multimedia Interface Specification Version 1.3a", where they are referred to as Figure 5-6, "Example P0 Transmission Sequence".
2. CEA/HDMI vendor-specific info frame
Instead of implementing HDMI GBD metadata as described above, in an alternative embodiment of the present invention a "vendor-specific info frame" can be used according to the invention. Vendor-specific info frames are described, for example, in chapter 6.1 of the CEA-861-D specification. The HDMI specification allows the use of CEA-861-D info frames, as described in its section 5.3.5. An info frame data packet is in fact 28 bytes in length. The only difference with respect to the gamut metadata packet is that the data packet size is limited to a single info frame. In one embodiment of the invention, it is also proposed to use the GBD metadata stream control for the vendor-specific info frame. That is, in one embodiment, the following modification is applied: because the vendor-specific info frame described above is limited to only one data packet, the length field is only 5 bits in size. This means that the length information and the cyclic redundancy code (CRC) information should be placed in a "GBD-like" header (the look data packet header, see Fig. 26), which thereby grows from 3 bytes to 6 bytes. That is, Fig. 26 shows an exemplary look data packet header 2600 for a vendor-specific info frame to be used together with HDMI version 1.3 according to an embodiment of the present invention. Table 2 below describes the semantics of the syntax elements shown in Fig. 26 according to an embodiment of the present invention.
Table 2
Fig. 27 shows an exemplary vendor-specific info frame 2700, for example of CEA-861-D, to which an embodiment of the present invention can be applied. The vendor-specific info frame 2700 of Fig. 27 illustratively defines byte numbers, field names and contents. As shown in Fig. 27, the vendor-specific info frame 2700 illustratively defines byte n as the vendor-specific info frame type code with the value 0x01, byte n+1 as the vendor-specific info frame version with the value 0x01, byte n+2 as the info frame length Lv, equal to the total number of bytes in the info frame payload including the IEEE registration ID, bytes n+3, n+4 and n+5 as the 24-bit IEEE Registration Identifier (least significant byte first), and the remaining bytes up to n+Lv-1 as the vendor-specific payload.
Fig. 28 shows exemplary vendor-specific CEC commands 2800 for sending look data according to an embodiment of the present invention. The vendor-specific CEC commands 2800 of Fig. 28 illustratively define the names, descriptions and values of the various CEC commands. More specifically, in the embodiment of Fig. 28, a specific start bit is defined. In addition, the vendor ID (illustratively the IEEE vendor address) is defined as the 3 bytes assigned by the IEEE OUI. The vendor-specific CEC commands 2800 of Fig. 28 also define the first data block as an opcode carrying the tag number (Tag Nr.) as a 1-byte value. In the operand block that starts with the second data block, the length, the packet sequence and the operands are defined. More specifically, the length of the following data is defined in bytes. The packet sequence is illustratively defined in the embodiment of Fig. 28 as 2 bits indicating whether the data packet is the only packet, the first packet or the last packet in a sequence of gamut data packets. The packet sequence is illustratively defined in Fig. 28 as follows:
= 0 (0b00): intermediate data packet in a sequence
= 1 (0b01): first data packet in a sequence
= 2 (0b10): last data packet in a sequence
= 3 (0b11): only data packet in a sequence.
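The two-bit packet sequence values listed above can be captured with simple constants, as in this sketch; the helper that assigns a value to each packet of a sequence is illustrative only and not part of the CEC command set.

```python
# Sketch of the 2-bit packet_sequence values from Fig. 28 and of tagging each
# packet of a look data packet sequence with the appropriate value.
SEQ_INTERMEDIATE = 0b00   # intermediate data packet in a sequence
SEQ_FIRST        = 0b01   # first data packet in a sequence
SEQ_LAST         = 0b10   # last data packet in a sequence
SEQ_ONLY         = 0b11   # only data packet in a sequence

def sequence_value(index, total):
    if total == 1:
        return SEQ_ONLY
    if index == 0:
        return SEQ_FIRST
    return SEQ_LAST if index == total - 1 else SEQ_INTERMEDIATE

print([sequence_value(i, 4) for i in range(4)])   # [1, 0, 0, 2]
```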
3. HDMI CEC protocol
Consumer Electronics Control (CEC) is a bidirectional control bus in HDMI. It is a shared medium used by the multiple audio/video (A/V) devices connected to the bus.
It is inherently very slow, with a raw data rate in the range of 100 to 200 bits per second. According to HDMI specification version 1.3a, a single vendor-specific CEC message has a raw size of at most 16 x 10 bits and a raw payload of at most 11 x 8 bits. Taking into account the protocol and data flow overhead of 100 % to 200 %, the transmission of one CEC message takes several seconds. This means that sending the same amount of data as the maximum possible in the two methods above, i.e. 21.8 kilobytes, would take many minutes. This is only true if no other device uses the bus during that time (otherwise the transmission time increases further).
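The "many minutes" figure quoted above can be reproduced with a rough back-of-the-envelope calculation; the 150 bit/s midpoint and the overall overhead factor are assumptions chosen within the ranges given in the text.

```python
# Rough sketch of the CEC transfer-time estimate: 21.8 kilobytes of look data at
# 100-200 bit/s raw speed with 100-200 % protocol overhead (assumed midpoints).
payload_bits = 21.8 * 1024 * 8        # ~21.8 kB expressed in bits
raw_bit_rate = 150.0                  # assumed midpoint of 100-200 bit/s
overhead_factor = 2.5                 # assumed midpoint of 100-200 % overhead

seconds = payload_bits * overhead_factor / raw_bit_rate
print(f"~{seconds / 60:.0f} minutes")   # on the order of tens of minutes
```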
It is therefore suggested to limit the size of the Look Data Packet. Given this rate, the use of some look data elementary messages is impractical in routine use (in particular LUT downloads, see Figures 8 to 11).
However, considering the payload size of a CEC frame, it is practically unavoidable that a Look Data Packet will be longer than one CEC frame. Because CEC was designed for simple applications, an abstraction layer may be implemented on top of CEC, according to an embodiment of the present invention, in order to make the communication more robust.
More specifically, with respect to the International Organization for Standardization Open Systems Interconnection (ISO/OSI) reference model, the CEC functionality implements the physical layer and parts of the data link and network layers. Quality of service (QoS) is not provided by the network layer. Therefore, in an embodiment of the present invention, QoS is addressed in a layer implemented on top of the CEC protocol.
Figure 29 shows an exemplary network layer model 2900 for use with the Consumer Electronics Control (CEC) bus according to an embodiment of the present invention. The network layer model 2900 comprises an application layer 2905, a TL_CEC layer 2910, a CEC layer 2915 and a CEC physical layer (CEC-phys) 2920, wherein the communication between a first device (device 1) 2981 and a second device (device 2) 2982 takes place at the CEC physical layer (CEC-phys) 2920.
Figure 30 shows a high-level diagram of an exemplary CEC process 3000 for providing application-to-application communication using CEC vendor-specific commands according to an embodiment of the present invention. That is, with respect to the CEC process 3000, a look data elementary message is generated at content creation/authoring by a generating application 3005. The look data elementary message is then assembled into a Look Data Packet by a packet assembly block 3010. A cyclic redundancy code (CRC) is calculated from this packet by a CRC block 3015 in the manner defined by HDMI. Here the CRC is implemented as a checksum, which is defined such that the byte-wise sum of all packet data, including the header and the checksum data, equals zero.
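A minimal sketch of the checksum rule just described, assuming a straightforward byte-wise two's-complement sum; the function names are illustrative and not taken from any specification text.

#include <stdint.h>
#include <stddef.h>

/* Checksum chosen so that the byte-wise sum of header, payload and
 * checksum equals 0 modulo 256, as described for the CRC block 3015. */
static uint8_t ldp_checksum(const uint8_t *packet, size_t len) {
    uint8_t sum = 0;
    for (size_t i = 0; i < len; ++i)
        sum = (uint8_t)(sum + packet[i]);
    return (uint8_t)(0x100u - sum);
}

/* At the receiver: summing all bytes including the checksum gives 0. */
static int ldp_checksum_ok(const uint8_t *packet_with_csum, size_t len) {
    uint8_t sum = 0;
    for (size_t i = 0; i < len; ++i)
        sum = (uint8_t)(sum + packet_with_csum[i]);
    return sum == 0;
}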
The CRC and the Look Data Packet are then split by a packet-to-frame block 3020 into frames of suitable size for communication. In an embodiment of the present invention, a data packet size of 88 bytes is used as an example. In this embodiment, one CEC message carries the 8-bit CRC data, and the subsequent messages each carry 8 bits more payload data, because the CRC data only needs to be transmitted once per Look Data Packet.
Alternatively, as shown in Figure 31, the CRC data may also be sent at the end of the transmission, in frame N. That is, Figure 31 shows a high-level block diagram illustrating an exemplary conversion 3100 of a Look Data Packet into frames, wherein the Look Data Packet is converted into frames for transmission over HDMI. Referring back to Figure 30, the data are then ready to be transmitted via the CEC 3030.
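The segmentation of Figures 30 and 31 could be sketched as below. The 88-byte packet size is the example from the text; the per-frame payload size, the placement of the CRC in the first frame, and all names are assumptions made for illustration.

#include <stdint.h>
#include <string.h>

#define LDP_SIZE      88   /* example Look Data Packet size from the text */
#define FRAME_PAYLOAD 10   /* assumed usable payload bytes per CEC frame  */

/* Split a Look Data Packet plus its 1-byte CRC into CEC-sized frames,
 * with the CRC sent once in the first frame (Figure 30). Sending it in
 * the last frame instead (Figure 31) is a one-line change.             */
static int ldp_to_frames(const uint8_t ldp[LDP_SIZE], uint8_t crc,
                         uint8_t frames[][FRAME_PAYLOAD], int max_frames) {
    uint8_t buf[1 + LDP_SIZE];
    int n = 0;

    buf[0] = crc;                       /* CRC transmitted only once     */
    memcpy(&buf[1], ldp, LDP_SIZE);

    for (int off = 0; off < (int)sizeof buf; off += FRAME_PAYLOAD, ++n) {
        int chunk = (int)sizeof buf - off;
        if (n >= max_frames) return -1; /* not enough frame buffers      */
        if (chunk > FRAME_PAYLOAD) chunk = FRAME_PAYLOAD;
        memset(frames[n], 0, FRAME_PAYLOAD);
        memcpy(frames[n], &buf[off], (size_t)chunk);
    }
    return n;                           /* number of CEC frames produced */
}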
In an embodiment of the present invention, the reverse takes place at the receiver side. That is, the frames are reassembled into a Look Data Packet, the checksum is calculated, and the look data elementary messages are then extracted from the Look Data Packet and supplied to the receiver-side application (typically a display that modifies the presentation according to the intent specified by the content creator).
It should be noted that the CRC block 3015 is part of the network layer of the model 2900. In case of a CRC error, either of the two illustrative methods below may be performed, although it should be noted that the principles of the present invention are not limited to the methods described below. In the first method, the Look Data Packet may be discarded. In the second method, a repeat request may be issued. In the case of the first method, the previously sent packet remains valid.
Referring back to Figure 31, Figure 31 shows a high-level block diagram of an exemplary conversion 3100 of a Look Data Packet into frames for transmission over HDMI according to an embodiment of the present invention. The conversion 3100 involves applying a CRC calculation 3120 and a packet-to-frame conversion 3130 to a given Look Data Packet 3110 to obtain the resulting frames 3140, wherein the first frame carries the CRC information 3145.
In contrast to some of the methods described above, CEC has no common time base with the video signal. Therefore, in an embodiment of the present invention, in order to synchronize the Look Data Packets, a validate signal at the end of the LDP transmission is used to time the loading of the transmitted LDP parameters and data into the video processing block at the receiving end. In this way, a Look Data Packet is sent over CEC and remains invalid until a specific "validate" CEC command is sent.
However, the validation cannot be timed precisely. Therefore, in one embodiment of the present invention, one possibility is to estimate the uncertainty of the application time and to ensure that the change in the video processing is not disturbed. Scene change blanking may be used. The "validate" signal can be as short as 1 byte, but with the added CEC bit overhead it amounts to a minimum of 60 bits plus an additional start bit (as shown in Figure 32). That is, Figure 32 shows an exemplary look data elementary message 3200 for look data validation according to an embodiment of the present invention.
The transmission time can thus be calculated as the CEC start time + 60 × the nominal CEC data bit period. HDMI specification version 1.3a suggests a nominal CEC start time of 4.5 milliseconds and a nominal CEC data bit period of 2.4 milliseconds. This results in 4.5 ms + 60 × 2.4 ms = 148.5 ms. The "validate" signal is therefore delayed by 9 fields for 60 Hz, by 8 fields for 50 Hz, or by 4 fields for 24 Hz. However, this may change with updated versions of the CEC specification within the HDMI specification.
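The field delays quoted above follow from the stated timing constants. The small sketch below, purely illustrative, reproduces the 9/8/4-field figures by rounding the 148.5 ms transmission time up to whole fields at each field rate.

#include <math.h>
#include <stdio.h>

/* Delay (in whole fields) needed to cover the validate transmission:
 * CEC start time + 60 x nominal data bit period = 4.5 + 60 * 2.4 ms. */
static int validate_delay_fields(double field_rate_hz) {
    const double tx_ms = 4.5 + 60.0 * 2.4;            /* 148.5 ms */
    return (int)ceil(tx_ms * field_rate_hz / 1000.0);
}

int main(void) {
    printf("60 Hz: %d fields\n", validate_delay_fields(60.0)); /* 9 */
    printf("50 Hz: %d fields\n", validate_delay_fields(50.0)); /* 8 */
    printf("24 Hz: %d fields\n", validate_delay_fields(24.0)); /* 4 */
    return 0;
}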
Thus, for an application with a field rate of 60 Hz, the "validate" command must be requested from the CEC generator at least 9 fields in advance. Because the HDMI receiving end has control over both the video and the CEC, in order to avoid erroneous LDP data being applied to given picture content, it is proposed to blank the picture content during the transition phase or to otherwise keep it unaffected by the LDP change. The transition phase is determined as the bounded time span between the fastest possible CEC transition assumed under the HDMI specification and the slowest CEC transmission, plus the possible processing delay in the receiving end.
To overcome the synchronization problem of scene-change-based Look Data Packets, the following illustrative method according to an embodiment of the present invention is proposed. That is, in order to synchronize CEC with the video, a physical-level CEC action is performed. This can be realized, for example, by the apparatus shown in Figure 33.
More specifically, Figure 33 shows an exemplary apparatus 3300 for generating a CEC validate signal according to an embodiment of the present invention. The apparatus 3300 of Figure 33 illustratively comprises a CEC generator 3310, a transmitter side 3320, a receiver side 3330 and a CEC decoder 3340. A physical-level CEC action is performed by the apparatus 3300, which on the transmitter side 3320 synchronizes the PHY-layer action of the CEC with the VSYNC (vertical synchronization signal) of the video portion. On the receiver side 3330, the apparatus synchronizes the application of the Look Data Packet data with the CEC physical-layer action, so that the Look Data Packet data is applied to the picture processing in a frame-synchronous manner. The apparatus 3300 achieves this by using VSYNC to time the CEC validate command and by waiting for the EOM (end of message) of the last byte of the Look Data Packet validate command.
Figure 34 shows a flow chart of a method for validate signal transmission over CEC according to an embodiment of the present invention. The method 3400 begins at step 3410. At step 3410, an LDP is sent to the receiving end. The method 3400 then proceeds to step 3420.
At step 3420, a request to transmit the "validate" signal is sent to the CEC generator in the source device. The method 3400 then proceeds to step 3430.
At step 3430, the CEC generator sends the CEC "validate" command starting from the next VSYNC event. The method 3400 then proceeds to step 3440.
At step 3440, the receiving end receives the "validate" signal. The method 3400 then proceeds to step 3450.
At step 3450, the receiving end waits until the end of the transmission and, when the next VSYNC signal event occurs in the receiving-end device, validates the LDP data. The method 3400 then proceeds to step 3460.
At step 3460, the receiving-end device applies the LDP data content to the video processing block. The method 3400 may then be exited.
As described above, the transmission time of the "validate" signal will be approximately 52.5 milliseconds. The "validate" signal is therefore delayed by 4 fields for 60 Hz and by 3 fields for 50 Hz and 24 Hz. It is therefore necessary to complete the LDP transmission and start the transmission of the "validate" signal about 4 frames before its application. The application of the LDP is then not subject to uncertainty.
In an alternative embodiment of the present invention, another transmission method may include the possibility of using a novel future networking approach for the LDP transmission. For example, HDMI may in the future adopt a new networking channel on top of the existing ones, or one that replaces some of the existing HDMI-specific data transmission options. Such a new transmission method may be based on known networking technologies and may be asynchronous with respect to the video. Such a networking technology can be used to send LDP packets in the same manner, together with the packet control described in connection with the various embodiments herein and a CEC transmission method for, e.g., synchronization with the video.
Having described preferred embodiments of a method and system for look data definition and transmission over HDMI (which embodiments are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons of ordinary skill in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments of the invention disclosed which are within the scope and spirit of the invention as outlined by the appended claims. While the foregoing is directed to various embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof.

Claims (28)

1. A method, comprising:
generating metadata identifying visual characteristics of video content, said metadata for use in modifying said video content before display of said video content by controlling a rendering mechanism of a display device to adjust for variations between different display devices and between different creative intents of a content creator; and
preparing said video content and said metadata for transmission over a high definition multimedia interface by identifying a portion of said video content to be associated with said metadata.
2. The method according to claim 1, further comprising transmitting said metadata and said video content over said high definition multimedia interface, such that said metadata is available to modify said video content before display of said video content.
3. The method according to claim 1, wherein said metadata is prepared for transmission in at least one of an in-band mode and an out-of-band mode with respect to said video content.
4. The method according to claim 1, wherein said metadata is prepared for transmission using at least one of a new gamut profile and an unused gamut profile of a gamut boundary description metadata container of at least one of version 1.3A and any later version of the high definition multimedia interface specification.
5. The method according to claim 1, wherein said metadata is prepared for transmission using a vendor-specific InfoFrame.
6. The method according to claim 5, wherein said preparing comprises applying gamut boundary description flow control to said vendor-specific InfoFrame.
7. The method according to claim 1, wherein said metadata is prepared for transmission using a high definition multimedia interface consumer electronics control protocol.
8. The method according to claim 7, wherein said preparing comprises at least one of adding a network abstraction layer on top of said consumer electronics control protocol and enabling quality of service.
9. The method according to claim 7, wherein said preparing comprises timing said consumer electronics control protocol with respect to said video content, including coupling said consumer electronics control protocol with a vertical synchronization signal.
10. The method according to claim 1, wherein said metadata is used for at least one of: controlling the color appearance of said video content, controlling spatial filtering of said video content to alter a modulation transfer function associated therewith, controlling motion behavior of said video content, controlling film grain aspects of said video content, adding noise to said video content, controlling editing of scenes in said video content, and tone mapping with respect to said video content.
11. The method according to claim 10, wherein said metadata corresponds to at least one of one or more look-up tables and one or more color transformation matrices.
12. The method according to claim 1, wherein said preparing comprises organizing said metadata into data packets.
13. The method according to claim 12, further comprising generating at least one message for insertion into said data packets, said at least one message relating to at least one of: controlling the color appearance of said video content, controlling spatial filtering of said video content to alter a modulation transfer function associated therewith, controlling motion behavior of said video content, controlling film grain aspects of said video content, adding noise to said video content, controlling editing of scenes in said video content, and tone mapping with respect to said video content.
14. The method according to claim 1, further comprising storing said video content and said metadata on a disc, for subsequent display of the video content as modified according to said metadata.
15. A system, comprising:
a generator for generating metadata identifying visual characteristics of video content, said metadata for use in modifying said video content before display of said video content by controlling a rendering mechanism of a display device to adjust for variations between different display devices and between different creative intents of a content creator; and
a metadata transmission preparation device for preparing said video content and said metadata for transmission over a high definition multimedia interface by identifying a portion of said video content to be associated with said metadata.
16. The system according to claim 15, further comprising a high definition multimedia interface transmitting device for transmitting said metadata and said video content over said high definition multimedia interface, such that said metadata is available to modify said video content before display of said video content.
17. The system according to claim 15, wherein said metadata is prepared for transmission in at least one of an in-band mode and an out-of-band mode with respect to said video content.
18. The system according to claim 15, wherein said metadata is prepared for transmission using at least one of a new gamut profile and an unused gamut profile of a gamut boundary description metadata container of at least one of version 1.3A and any later version of the high definition multimedia interface specification.
19. The system according to claim 15, wherein said metadata is prepared for transmission using a vendor-specific InfoFrame.
20. The system according to claim 19, wherein gamut boundary description flow control is applied to said vendor-specific InfoFrame.
21. The system according to claim 15, wherein said metadata is prepared for transmission using a high definition multimedia interface consumer electronics control protocol.
22. The system according to claim 21, wherein said metadata transmission preparation device performs at least one of: adding a network abstraction layer on top of said consumer electronics control protocol, and enabling quality of service.
23. The system according to claim 22, wherein said metadata transmission preparation device times said consumer electronics control protocol with respect to said video content, including coupling said consumer electronics control protocol with a vertical synchronization signal.
24. The system according to claim 15, wherein said metadata is used for at least one of: controlling the color appearance of said video content, controlling spatial filtering of said video content to alter a modulation transfer function associated therewith, controlling motion behavior of said video content, controlling film grain aspects of said video content, adding noise to said video content, controlling editing of scenes in said video content, and tone mapping with respect to said video content.
25. The system according to claim 24, wherein said metadata corresponds to at least one of one or more look-up tables and one or more color transformation matrices.
26. The system according to claim 15, wherein said metadata is organized into data packets.
27. The system according to claim 26, wherein said data packets comprise at least one message relating to at least one of: controlling the color appearance of said video content, controlling spatial filtering of said video content to alter a modulation transfer function associated therewith, controlling motion behavior of said video content, controlling film grain aspects of said video content, adding noise to said video content, controlling editing of scenes in said video content, and tone mapping with respect to said video content.
28. The system according to claim 15, further comprising a storage device for storing said video content and said metadata on a disc, for subsequent display of the video content as modified according to said metadata.
CN201510266221.8A 2008-01-31 2008-01-31 Method for look data definition and transmission on high definition multimedia interface Pending CN104954831A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200880126062.XA CN101933094A (en) 2008-01-31 2008-01-31 Method and system for look data definition and transmission over a high definition multimedia interface

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN200880126062.XA Division CN101933094A (en) 2008-01-31 2008-01-31 Method and system for look data definition and transmission over a high definition multimedia interface

Publications (1)

Publication Number Publication Date
CN104954831A true CN104954831A (en) 2015-09-30

Family

ID=54198877

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510266221.8A Pending CN104954831A (en) 2008-01-31 2008-01-31 Method for look data definition and transmission on high definition multimedia interface

Country Status (1)

Country Link
CN (1) CN104954831A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114073089A (en) * 2019-06-28 2022-02-18 杜比实验室特许公司 High dynamic range video content type metadata
US11743550B2 (en) 2019-06-28 2023-08-29 Dolby Laboratories Licensing Corporation Video content type metadata for high dynamic range
CN114073089B (en) * 2019-06-28 2024-02-09 杜比实验室特许公司 Method and medium for generating digital video bit stream and playing back video content

Similar Documents

Publication Publication Date Title
CN101933094A (en) Method and system for look data definition and transmission over a high definition multimedia interface
KR100593581B1 (en) How to encapsulate data into transport packets of constant size
EP3053335B1 (en) Transmitting display management metadata over hdmi
US5535216A (en) Multiplexed gapped constant bit rate data transmission
CN101952892B (en) Method and system for look data definition and transmission
JP6282357B2 (en) Broadcast signal transmission / reception method and apparatus based on color gamut resampling
CN102648629B (en) Provision of supplemental processing information
JP2018532294A (en) Broadcast signal transmitting apparatus, broadcast signal receiving apparatus, broadcast signal transmitting method, and broadcast signal receiving method
US8332898B2 (en) Apparatus, systems and methods to synchronize communication of content to a presentation device and a mobile device
CN105981391A (en) Transmission device, transmission method, reception device, reception method, display device, and display method
CN108370450A (en) Broadcast singal sending device, broadcasting signal receiving, broadcast singal sending method and broadcast signal received method
CN104509139B (en) Method for providing multimedia message service
JP2005229153A (en) Dimmer system and dimmer method, distributor and distribution method, receiver and reception method, recorder and recording method, and reproducing apparatus and reproducing method
KR20110111251A (en) Method and apparatus for providing metadata for sensory effect, computer readable record medium on which metadata for sensory effect is recorded, method and apparatus for representating sensory effect
US20050025460A1 (en) Information-processing apparatus, information-processing method, program-recording medium, and program
CN104954831A (en) Method for look data definition and transmission on high definition multimedia interface
US7398543B2 (en) Method for broadcasting multimedia signals towards a plurality of terminals
CN105163169A (en) Data package format method and system suitable for transmission
EP1435738A1 (en) Method and system for generating input file using meta language regarding graphic data compression
CN103004220B (en) The system and method for content is added at data stream during obtaining
Reitmeier Distribution to the Viewer

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20190516

Address after: France

Applicant after: Interactive Digital CE Patent Holding Company

Address before: France's Nigeria - Billancourt City

Applicant before: Thomson Licensing SA

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20150930