WO2017094645A1 - Systems and methods for signalling application accessibility

Systems and methods for signalling application accessibility

Info

Publication number
WO2017094645A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
level
conformance
accessibility
signalling
Prior art date
Application number
PCT/JP2016/085118
Other languages
French (fr)
Inventor
Kiran Mukesh MISRA
Sachin G. Deshpande
Original Assignee
Sharp Kabushiki Kaisha
Priority date
Filing date
Publication date
Application filed by Sharp Kabushiki Kaisha
Publication of WO2017094645A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N21/63 Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/633 Control signals issued by server directed to the network components or client
    • H04N21/6332 Control signals issued by server directed to the network components or client directed to client
    • H04N21/6334 Control signals issued by server directed to the network components or client directed to client for authorisation, e.g. by transmitting a key
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/61 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/612 Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8545 Content authoring for generating interactive applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/858 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot
    • H04N21/8586 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot by using a URL

Definitions

  • the present disclosure relates to the field of interactive television.
  • Digital media playback capabilities may be incorporated into a wide range of devices, including digital televisions, including so-called “smart” televisions, set-top boxes, laptop or desktop computers, tablet computers, digital recording devices, digital media players, video gaming devices, cellular phones, including so-called “smart” phones, dedicated video streaming devices, and the like.
  • Digital media content (e.g., video and audio programming) may originate from a plurality of sources including, for example, over-the-air television providers, satellite television providers, cable television providers, online media service providers, including, so-called streaming service providers, and the like.
  • Digital media content may be delivered over packet-switched networks, including bidirectional networks, such as Internet Protocol (IP) networks and unidirectional networks, such as digital broadcast networks.
  • IP Internet Protocol
  • Digital media content may be transmitted from a source to a receiver device (e.g., a digital television or a smart phone) according to a transmission standard.
  • transmission standards include Digital Video Broadcasting (DVB) standards, Integrated Services Digital Broadcasting (ISDB) standards, and standards developed by the Advanced Television Systems Committee (ATSC), including, for example, the ATSC 2.0 standard.
  • the ATSC is currently developing the so-called ATSC 3.0 suite of standards.
  • the ATSC 3.0 suite of standards seeks to support a wide range of diverse video based services through diverse delivery mechanisms.
  • the ATSC 3.0 suite of standards seeks to support broadcast video delivery, so-called broadcast streaming/file download video delivery, so-called broadband streaming/file download video delivery, and combinations thereof (i.e., “hybrid services”).
  • An example of a hybrid video service contemplated for the ATSC 3.0 suite of standards includes a receiver device receiving an over-the-air video broadcast (e.g., through a unidirectional transport) and receiving a synchronized closed caption presentation from an online media service provider through a packet switched network (e.g., through a bidirectional transport).
  • a synchronized closed caption presentation may represent an example of an application based enhancement, which may be referred to as an app-based enhancement or feature.
  • Other examples of application based enhancements include targeted advertising and enhancements providing interactive viewing experiences. Current techniques for signalling information associated with, and/or properties of, application based enhancements may be less than ideal.
  • this disclosure describes techniques for signalling (or signaling) information associated with application based enhancements.
  • this disclosure describes techniques for signalling accessibility characteristics associated with an application associated with a video and/or audio presentation.
  • accessibility characteristics may include levels of accessibility based on an established accessibility guideline. It should be noted that although in some examples the techniques of this disclosure are described with respect to ATSC standards, including those currently under development, the techniques described herein are generally applicable to any transmission standard.
  • the techniques described herein are generally applicable to any of DVB standards, ISDB standards, ATSC Standards, Digital Terrestrial Multimedia Broadcast (DTMB) standards, Digital Multimedia Broadcast (DMB) standards, Hybrid Broadcast and Broadband Television (HbbTV) standard, World Wide Web Consortium (W3C) standards, Universal Plug and Play (UPnP) standards, and other video encoding standards.
  • DTMB Digital Terrestrial Multimedia Broadcast
  • DMB Digital Multimedia Broadcast
  • HbbTV Hybrid Broadcast and Broadband Television
  • W3C World Wide Web Consortium
  • UPnP Universal Plug and Play
  • a method for signalling information associated with an application associated with a video or audio service comprises signalling a syntax element indicating a level of conformance associated with the application, and signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
  • a method for signalling information associated with an application associated with a video or audio service comprises, for one or more defined groups, signalling a respective syntax element indicating accessibility information of the application with respect to each defined group.
  • a method for parsing information associated with an application associated with a video or audio service comprises parsing a syntax element indicating accessibility of the application, and performing an action based on the parsed syntax element.
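  • By way of a non-normative illustration, the following sketch shows the example methods above in miniature: signalling a level of conformance together with zero or more unsatisfied success criteria, and parsing those syntax elements to perform an action. The element and attribute names (applicationAccessibility, @Level, Exception) follow the Table 6 example described later in this disclosure; the exact schema and values shown are assumptions.

```python
import xml.etree.ElementTree as ET

# Sender side: signal a syntax element indicating a level of conformance and
# zero or more syntax elements identifying unsatisfied success criteria.
acc = ET.Element("applicationAccessibility", {"name": "WCAG 2.0", "Level": "AA"})
for criterion in ("2.1.1", "2.1.2"):  # hypothetical unsatisfied criteria
    ET.SubElement(acc, "Exception").text = criterion
document = ET.tostring(acc, encoding="unicode")

# Receiver side: parse the syntax elements and perform an action based on them.
parsed = ET.fromstring(document)
level = parsed.get("Level")
exceptions = [e.text for e in parsed.findall("Exception")]
print(level, exceptions)  # e.g., decide whether to run the application
```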
  • FIG. 1 is a conceptual diagram illustrating an example of content delivery protocol model according to one or more techniques of this disclosure.
  • FIG. 2 is a block diagram illustrating an example of a system that may implement one or more techniques of this disclosure.
  • FIG. 3 is a block diagram illustrating an example of a service distribution engine that may implement one or more techniques of this disclosure.
  • FIG. 4A is a computer program listing illustrating an example of signalling accessibility information according to one or more techniques of this disclosure.
  • FIG. 4B is a computer program listing illustrating an example of signalling accessibility information according to one or more techniques of this disclosure.
  • FIG. 5 is a block diagram illustrating an example of a receiver device that may implement one or more techniques of this disclosure.
  • Zero or more application-based features or enhancements may be associated with digital media content transmitted from a source to a receiver device. It should be noted that in some examples digital media content transmitted from a source to a receiver device and/or enhancements associated therewith may be referred to as a service.
  • An application may enable enhanced presentations for a primary media service (e.g., a television program, a movie, an event, or the like). For example, an application may enable an enhanced audio presentation (e.g., alternative language, commentary etc.), an enhanced video presentation (e.g., secondary views, etc.), and caption services (e.g., subtitles) to be presented in conjunction with an over-the-air television broadcast. Further, applications may enable a programmatically controlled user experience.
  • an application may enable targeted advertisements based on a user’s viewing history to be presented, enable interactive gaming, and the like.
  • an application may enable a so-called second screen enhancement. That is, for example, a primary media service may be rendered on a primary display device (e.g., a digital television) and a corresponding application (e.g., an interactive game) may be executed on a secondary computing device (e.g., a tablet computing device).
  • An application may include a downloaded application or a native application.
  • a downloaded application may refer to a collection of downloaded documents enabling a self-contained function.
  • Documents may include multimedia files and files defined according to a particular programming language. Examples of programming languages include, Hypertext Markup Language (HTML), Dynamic HTML, eXtensible Markup Language (XML), Java, JavaScript, JavaScript Object Notation (JSON), and Cascading Style Sheets (CSS).
  • a native application may refer to software stored on a receiver device configured to perform a function using downloaded data.
  • an electronic programming guide application stored on a set-top box may be configured to display television listings using electronic service guide data received from a server.
  • each application and/or respective application data may be separately signaled.
  • creators of respective diverse applications, including applications associated with the same primary media service, may independently develop their applications.
  • applications and/or respective application data may be signaled using diverse delivery mechanisms. For example, applications, application data, and/or components thereof may be delivered over bidirectional networks or unidirectional networks. In this manner, for a particular primary media service, a plurality of diverse independently developed applications may be available through a plurality of diverse sources (e.g., a broadcaster or various servers).
  • Computing devices and/or transmission systems may be based on models including one or more abstraction layers, where data at each abstraction layer is represented according to particular structures, e.g., packet structures, modulation schemes, etc.
  • An example of a model including defined abstraction layers is the so-called Open Systems Interconnection (OSI) model illustrated in FIG. 1.
  • the OSI model defines a 7-layer stack model, including an application layer, a presentation layer, a session layer, a transport layer, a network layer, a data link layer, and a physical layer.
  • a physical layer may generally refer to a layer at which electrical signals form digital data.
  • a physical layer may refer to a layer that defines how modulated radio frequency (RF) symbols form a frame of digital data.
  • RF radio frequency
  • a data link layer which may also be referred to as link layer, may refer to an abstraction used prior to physical layer processing at a sending side and after physical layer reception at a receiving side. It should be noted that a sending side and a receiving side are logical roles and a single device may operate as both a sending side in one instance and as a receiving side in another instance.
  • Each of an application layer, a presentation layer, a session layer, a transport layer, and a network layer may define how data is delivered for use by a user application.
  • Transmission standards may include a content delivery protocol model specifying supported protocols for each layer and further defining one or more specific layer implementations.
  • WD Working Drafts
  • the proposed ATSC 3.0 unidirectional physical layer includes a physical layer frame structure including a defined bootstrap, preamble, and data payload structure including one or more physical layer pipes (PLPs).
  • PLP physical layer pipes
  • a PLP may generally refer to a logical structure including all or portions of a data stream.
  • one or more Layered Coding Transport (LCT) channels may be included in a PLP.
  • An LCT channel may carry individual components of a service (e.g., audio, video, or closed caption component streams) or file-based items of content associated with a service (e.g., documents included in an application enhancement).
  • the proposed link layer abstracts various types of data encapsulated in particular packet types (e.g., MPEG transport stream (MPEG-TS) packets, Internet Protocol (IP) packets, signalling packets, extension packets, etc.) into a single generic format for processing by the physical layer.
  • MPEG-TS may be defined as a standard container format for transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data.
  • PSIP Program and System Information Protocol
  • the proposed ATSC 3.0 suite of standards also supports so-called broadband physical layers and data link layers to enable support for hybrid video services. For example, it may be desirable for a primary presentation of a sporting event to be received by a receiving device through an over-the-air broadcast and an application enhancement associated with the sporting event (e.g., updated statistics) to be received from an online media service provider.
  • ATSC 3.0 uses the term “broadcast” to refer to a unidirectional over-the-air transmission physical layer
  • the so-called ATSC 3.0 broadcast physical layer supports video delivery through streaming or file download.
  • the term broadcast as used herein should not be used to limit the manner in which video and associated data may be transported according to one or more techniques of this disclosure.
  • content delivery protocol model 100 is “aligned” with the 7-layer OSI model for illustration purposes. It should be noted that such an illustration should not be construed to limit implementations of the content delivery protocol model 100 or the techniques described herein.
  • Content delivery protocol model 100 may generally correspond to the currently proposed content delivery protocol model for the ATSC 3.0 suite of standards. As described in detail below the techniques described herein may be incorporated into a system implementation of content delivery protocol model 100, or a similar content delivery protocol model, in order to enable and/or enhance functionality in an interactive video distribution environment.
  • documents including applications, application data, and/or components thereof may be delivered over bidirectional networks or unidirectional networks.
  • Such documents may be referred to as signaling object files or non-real time (NRT) content files and may further be referred to as metadata fragments or service signaling fragments, and the like.
  • content delivery protocol model 100 includes two mechanisms for supporting delivery of signaling object or NRT content files through the ATSC Broadcast Physical layer: (1) through MPEG Media Transport Protocol (MMTP) over User Datagram Protocol (UDP) and Internet Protocol (IP); and (2) through Real-time Object delivery over Unidirectional Transport (ROUTE) over UDP and IP.
  • MMTP MPEG Media Transport Protocol
  • UDP User Datagram Protocol
  • IP Internet Protocol
  • ROUTE Real-time Object delivery over Unidirectional Transport
  • MMTP is described in ISO/IEC: ISO/IEC 23008-1, “Information technology-High efficiency coding and media delivery in heterogeneous environments-Part 1: MPEG media transport (MMT),” which is incorporated by reference herein in its entirety.
  • a ROUTE session may comprise one or more LCT channels which carry as a whole, or in part, the content components that make up a service session.
  • content delivery protocol model 100 supports delivery of signaling object or NRT content files through the broadband physical layer, i.e., through Hypertext Transfer Protocol (HTTP), including, for example, through HTTP Requests.
  • HTTP Hypertext Transfer Protocol
  • a transmission standard may define data elements that describe applications.
  • Data elements that describe applications may be referred to as application data elements and/or items of metadata associated with applications and may include documents (e.g., XML documents).
  • data elements that describe applications may be delivered through MMTP over UDP and IP, through ROUTE over UDP and IP, and through HTTP.
  • Application data elements may enable a receiver device to determine how to access a particular application (e.g., application data elements may include a Universal Resource Locator (URL)) and execute the particular application in conjunction with a primary media service (e.g., application data elements may include application control codes or activation states). Further, application data elements may enable a receiver device to determine if a particular application is supported by the receiver device. In some examples, application data elements may be signaled based on so-called application information tables (AIT) or application signaling tables (AST).
  • AIT application information tables
  • AST application signaling tables
  • European Telecommunications Standards Institute Technical Specification (ETSI TS) 102 809 V1.1.1 (2010-01), Digital Video Broadcasting (DVB); Signalling and carriage of interactive applications and services in Hybrid broadcast/broadband environments (hereinafter “ETSI TS 102 809”), which is incorporated by reference herein in its entirety, defines an application information table which may be used to generate an XML document including application data elements. That is, for example, a receiver device may receive an XML document formatted according to the schema provided in ETSI TS 102 809 and parse the XML document to determine how to access a particular application and execute the particular application.
  • ETSI TS 102 809 provides that an application can be completely described by: an application name which can be multilingual (element appName); a unique identification (element applicationIdentifier); a generic descriptor which is common and mandatory for all types of application (element applicationDescriptor); an application specific descriptor which will depend upon the type of the signalled application (element applicationSpecificDescriptor); an application usage descriptor which is optional (element applicationUsageDescriptor); an application boundary descriptor which is optional (element applicationBoundaryDescriptor); one or more application transport descriptors (element applicationTransport); and a simple application location descriptor (element applicationLocation).
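  • The following is an illustrative, non-normative sketch of an ApplicationList document using the element names listed above. Namespaces, cardinalities, and many mandatory child elements from the ETSI TS 102 809 schema are omitted, and the orgId/appId children and all values shown are assumptions for illustration.

```python
import xml.etree.ElementTree as ET

app_list = ET.Element("ApplicationList")
app = ET.SubElement(app_list, "Application")
ET.SubElement(app, "appName").text = "Example Stats App"  # hypothetical name

ident = ET.SubElement(app, "applicationIdentifier")  # unique identification
ET.SubElement(ident, "orgId").text = "1"   # assumed identifier shape
ET.SubElement(ident, "appId").text = "10"

desc = ET.SubElement(app, "applicationDescriptor")  # generic, mandatory descriptor
ET.SubElement(desc, "controlCode").text = "AUTOSTART"  # hypothetical control code

ET.SubElement(app, "applicationTransport")  # transport descriptor(s)
ET.SubElement(app, "applicationLocation").text = "https://example.com/app/index.html"

print(ET.tostring(app_list, encoding="unicode"))
```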
  • Table 1 provides a summary of an ApplicationList element and elements describing an application as provided in ETSI TS 102 809. It should be noted that for the sake of brevity Table 1 does not include all of the child elements and attributes for an ApplicationList as defined according to ETSI TS 102 809.
  • Table 1 and other tables described herein may generally follow XML conventions.
  • XML conventions may apply: Elements in an XML document may be identified by an upper-case first letter and in bold face as Element. To express that an element Element1 is contained in another element Element2, one may write Element2.Element1. If an element’s name consists of two or more combined words, camel-casing is typically used, e.g. ImportantElement. Elements may be present either exactly once, or the minimum and maximum occurrence is defined by <minOccurs> ... <maxOccurs> (as indicated by Cardinality in Tables 1-4 and 6-7).
  • Attributes in an XML document may be identified by a lower-case first letter as well as being preceded by a ‘@’-sign, e.g. @attribute. To point to a specific attribute @attribute contained in an element Element, one may write Element@attribute. If an attribute's name consists of two or more combined words, camel-casing is typically used after the first word, e.g. @veryImportantAttribute. Attributes may be assigned a status in the XML as mandatory (M), required (R), optional (O), optional with default value (OD) and conditionally mandatory (CM) (as indicated by Cardinality in Tables 1-4 and 6-7).
  • M mandatory
  • R required
  • O optional
  • OD optional with default value
  • CM conditionally mandatory
  • Namespace qualification of elements and attributes may be used as per XML standards, in the form of namespace:Element or @namespace:attribute
  • the fully qualified namespace may be provided in the schema fragment associated with the declaration.
  • External specifications extending the namespace of DASH may be expected to document the element name in the semantic table with an extension namespace prefix.
  • Variables defined in the context of Tables in this disclosure may be specifically highlighted with italics, e.g. InternalVariable. Structures that are defined as part of the hierarchical data model may be identified by an upper-case first letter. It should be noted that other notational conventions may be used and the techniques described herein should not be limited based on example notational conventions described herein.
  • An application signaling table based on the application information table defined in ETSI TS 102 809 is currently proposed for inclusion in the ATSC 3.0 suite of standards for generating documents including application data elements.
  • a collection of applications that constitute an application-based feature are signaled using an XML document called an application signaling table, where an application signaling table includes as its root element an ApplicationList element, as defined in ETSI TS 102 809, further extended and constrained.
  • an application signaling table is included for each application-based enhancement.
  • Table 2 provides an example of an application signaling table containing an ApplicationList element, as defined in ETSI TS 102 809, as its root element and further including currently proposed extensions and constraints.
  • the following datatypes are specified in the DataType field: string, unsignedByte, unsignedShort, unsignedInt, unsignedLong, boolean, anyURI, dateTime, mpeg7:XXX, TransportProtocolDescriptorType, mhp:xxx, and ipi:xxx.
  • each of string, unsignedByte, unsignedShort, unsignedInt, unsignedLong, boolean, anyURI, dateTime may be defined according to XML specifications.
  • string may include a string of characters
  • unsignedByte may include an unsigned 8-bit integer
  • unsignedShort may include an unsigned 16-bit integer
  • unsignedInt may include an unsigned 32-bit integer
  • unsignedLong may include an unsigned 64-bit integer
  • boolean may be used to specify a boolean value (i.e., true or false)
  • anyURI may be used to specify a universal resource identifier (URI)
  • dateTime may be used to specify a date and a time according to the following form “YYYY-MM-DDThh:mm:ss” where, YYYY indicates the year, MM indicates the month, DD indicates the day, T indicates the start of the required time section, hh indicates the hour, mm indicates the minute, and ss indicates the second.
  • each of mpeg7:XXX, TransportProtocolDescriptorType, mhp:xxx and ipi:xxx may be used to specify an enumerated data type, as provided in ETSI TS 102 809.
  • the prefix, (i.e., mhp, ipi, and mpeg7) describes a namespace for the corresponding element or attribute.
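  • As a small illustration, the dateTime form described above can be parsed as follows (the value shown is hypothetical):

```python
from datetime import datetime

value = "2016-11-28T09:30:00"  # hypothetical dateTime value
parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%S")
print(parsed.year, parsed.month, parsed.day, parsed.hour, parsed.minute, parsed.second)
```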
  • a Request for Comments (RFC) refers to a publication from the Internet Engineering Task Force (IETF), where each of the respective RFCs included in Table 2 is incorporated by reference in its respective entirety.
  • RFC Request for Comments
  • Dotted-IPv4 refers to a dotted decimal notation of an Internet Protocol Version 4 address
  • EFDT refers to an electronic file delivery table.
  • An electronic file delivery table (EFDT) may be used to deliver files over a unidirectional transport.
  • HTTP Secure (HTTPS) may be a protocol for secure communication.
  • HTTPS may use a secure sockets layer (SSL) method.
  • HTTPS may use a transport layer security (TLS) method.
  • Table 3 represents a generalized version of Table 2. That is, Table 3 does not explicitly specify each of the child elements and attributes for each of appName, applicationIdentifier, applicationDescriptor, applicationUsageDescriptor, applicationBoundaryDescriptor, applicationTransport, applicationLocation, and applicationSpecificDescriptor included in Table 2. It should be noted that Table 3 includes a first level of child elements under applicationDescriptor, i.e., type, controlCode, visibility, serviceBound, priority, version, mhpVersion, and storageCapabilities.
  • application data elements may enable a receiver device to determine if a particular application is supported by the receiver device.
  • an application providing an alternative audio presentation may not be useful to a user with a hearing impairment or a user that is not fluent in the language of the alternative audio presentation.
  • a user with a disability may request that only applications that provide an adequate level of accessibility to the user are executed (e.g., only applications that are accessible to the visually impaired).
  • a service provider may classify applications based on accessibility.
  • the systems and techniques described herein enable accessibility information associated with an application associated with a video and/or audio presentation to be signalled. It should be noted that although the techniques described herein are described with respect to accessibility levels associated with disabilities, in some examples, the techniques described herein may be generally applicable to user preferences.
  • FIG. 2 is a block diagram illustrating an example of a system that may implement one or more techniques described in this disclosure.
  • System 200 may be configured to communicate data in accordance with the techniques described herein.
  • system 200 includes one or more receiver devices 202A-202N, television service network 204, television service provider site 206, wide area network 212, one or more content provider sites 214A-214N, and one or more data provider sites 216A-216N.
  • System 200 may include software modules. Software modules may be stored in a memory and executed by a processor.
  • System 200 may include one or more processors and a plurality of internal and/or external memory devices.
  • Examples of memory devices include file servers, file transfer protocol (FTP) servers, network attached storage (NAS) devices, local disk drives, or any other type of device or storage medium capable of storing data.
  • Storage media may include Blu-ray discs, DVDs, CD-ROMs, magnetic disks, flash memory, or any other suitable digital storage media.
  • System 200 represents an example of a system that may be configured to allow digital media content, such as, for example, a movie, a live sporting event, etc., and data and applications and multimedia presentations associated therewith (e.g., caption services), to be distributed to and accessed by a plurality of computing devices, such as receiver devices 202A-202N.
  • receiver devices 202A-202N may include any device configured to receive data from television service provider site 206.
  • receiver devices 202A-202N may be equipped for wired and/or wireless communications and may include televisions, including so-called smart televisions, set top boxes, and digital video recorders.
  • receiver devices 202A-202N may include desktop, laptop, or tablet computers, gaming consoles, mobile devices, including, for example, “smart” phones, cellular telephones, and personal gaming devices configured to receive data from television service provider site 206.
  • system 200 is illustrated as having distinct sites, such an illustration is for descriptive purposes and does not limit system 200 to a particular physical architecture. Functions of system 200 and sites included therein may be realized using any combination of hardware, firmware and/or software implementations.
  • Television service network 204 is an example of a network configured to enable digital media content, which may include television services, to be distributed.
  • television service network 204 may include public over-the-air television networks, public or subscription-based satellite television service provider networks, and public or subscription-based cable television provider networks and/or over the top or Internet service providers.
  • although television service network 204 may primarily be used to enable television services to be provided, television service network 204 may also enable other types of data and services to be provided according to any combination of the telecommunication protocols described herein.
  • television service network 204 may enable two-way communications between television service provider site 206 and one or more of receiver devices 202A-202N.
  • Television service network 204 may comprise any combination of wireless and/or wired communication media.
  • Television service network 204 may include coaxial cables, fiber optic cables, twisted pair cables, wireless transmitters and receivers, routers, switches, repeaters, base stations, or any other equipment that may be useful to facilitate communications between various devices and sites.
  • Television service network 204 may operate according to a combination of one or more telecommunication protocols.
  • Telecommunications protocols may include proprietary aspects and/or may include standardized telecommunication protocols. Examples of standardized telecommunications protocols include DVB standards, ATSC standards, ISDB standards, DTMB standards, DMB standards, Data Over Cable Service Interface Specification (DOCSIS) standards, HbbTV standards, W3C standards, and UPnP standards.
  • DOCSIS Data Over Cable Service Interface Specification
  • television service provider site 206 may be configured to distribute television service via television service network 204.
  • television service provider site 206 may include one or more broadcast stations, a cable television provider, a satellite television provider, or an Internet-based television provider.
  • television service provider site 206 includes service distribution engine 208 and database 210.
  • Service distribution engine 208 may be configured to receive data, including, for example, multimedia content, interactive applications, and messages, and distribute data to receiver devices 202A-202N through television service network 204.
  • service distribution engine 208 may be configured to transmit television services according to aspects of one or more of the transmission standards described above (e.g., an ATSC standard).
  • service distribution engine 208 may be configured to receive data from one or more sources.
  • television service provider site 206 may be configured to receive a transmission including television programming through a satellite uplink/downlink. Further, as illustrated in FIG. 2, television service provider site 206 may be in communication with wide area network 212 and may be configured to receive data from content provider sites 214A-214N and further receive data from data provider sites 216A-216N. It should be noted that in some examples, television service provider site 206 may include a television studio and content may originate therefrom.
  • Database 210 may include storage devices configured to store data including, for example, multimedia content and data associated therewith, including for example, descriptive data and executable interactive applications. For example, a sporting event may be associated with an interactive application that provides statistical updates.
  • Data associated with multimedia content may be formatted according to a defined data format, such as, for example, HTML, Dynamic HTML, XML, and JSON, and may include URLs and URIs enabling receiver devices 202A-202N to access data, e.g., from one of data provider sites 216A-216N.
  • television service provider site 206 may be configured to provide access to stored multimedia content and distribute multimedia content to one or more of receiver devices 202A-202N through television service network 204.
  • multimedia content (e.g., music, movies, and television (TV) shows) stored in database 210 may be provided to a user via television service network 204 on a so-called on demand basis.
  • Wide area network 212 may include a packet based network and operate according to a combination of one or more telecommunication protocols.
  • Telecommunications protocols may include proprietary aspects and/or may include standardized telecommunication protocols. Examples of standardized telecommunications protocols include Global System Mobile Communications (GSM) standards, code division multiple access (CDMA) standards, 3rd Generation Partnership Project (3GPP) standards, European Telecommunications Standards Institute (ETSI) standards, European standards (EN), IP standards, Wireless Application Protocol (WAP) standards, and Institute of Electrical and Electronics Engineers (IEEE) standards, such as, for example, one or more of the IEEE 802 standards (e.g., Wi-Fi).
  • GSM Global System Mobile Communications
  • CDMA code division multiple access
  • 3GPP 3rd Generation Partnership Project
  • ETSI European Telecommunications Standards Institute
  • EN European standards
  • WAP Wireless Application Protocol
  • IEEE Institute of Electrical and Electronics Engineers
  • Wide area network 212 may comprise any combination of wireless and/or wired communication media.
  • Wide area network 212 may include coaxial cables, fiber optic cables, twisted pair cables, Ethernet cables, wireless transmitters and receivers, routers, switches, repeaters, base stations, or any other equipment that may be useful to facilitate communications between various devices and sites.
  • wide area network 212 may include the Internet.
  • content provider sites 214A-214N represent examples of sites that may provide multimedia content to television service provider site 206 and/or receiver devices 202A-202N.
  • a content provider site may include a studio having one or more studio content servers configured to provide multimedia files and/or streams to television service provider site 206.
  • content provider sites 214A-214N may be configured to provide multimedia content using the IP suite.
  • a content provider site may be configured to provide multimedia content to a receiver device according to Real Time Streaming Protocol (RTSP), or HTTP.
  • RTSP Real Time Streaming Protocol
  • Data provider sites 216A-216N may be configured to provide data, including hypertext based content, and the like, to one or more of receiver devices 202A-202N and/or television service provider site 206 through wide area network 212.
  • a data provider site 216A-216N may include one or more web servers.
  • Data provided by data provider site 216A-216N may be defined according to data formats, such as, for example, HTML, Dynamic HTML, XML, and JSON.
  • An example of a data provider site includes the United States Patent and Trademark Office website. It should be noted that in some examples, data provided by data provider sites 216A-216N may be utilized for so-called second screen applications.
  • companion device(s) in communication with a receiver device may display a website in conjunction with television programming being presented on the receiver device.
  • data provided by data provider sites 216A-216N may include audio and video content.
  • data elements that describe applications may be delivered through HTTP.
  • data provider sites 216A-216N may be configured to generate data or documents including applications and/or data elements that describe applications according to one or more of the techniques described herein.
  • FIG. 3 is a block diagram illustrating an example of a service distribution engine that may implement one or more techniques of this disclosure.
  • Service distribution engine 300 may be configured to receive data and output a signal representing that data for distribution over a communication network, e.g., television service network 204.
  • service distribution engine 300 may be configured to receive one or more data streams and output a signal that may be transmitted using a single radio frequency band (e.g., a 6 MHz channel, an 8 MHz channel, etc.) or a bonded channel (e.g., two separate 6 MHz channels).
  • a data stream may generally refer to data encapsulated in a set of one or more data packets.
  • service distribution engine 300 includes transport package generator 302, transport/network packet generator 304, link layer packet generator 306, frame builder and waveform generator 308, and system memory 310.
  • transport package generator 302, transport/network packet generator 304, link layer packet generator 306, frame builder and waveform generator 308, and system memory 310 may be interconnected (physically, communicatively, and/or operatively) for inter-component communications and may be implemented as any of a variety of suitable circuitry, such as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware or any combinations thereof.
  • DSPs digital signal processors
  • ASICs application specific integrated circuits
  • FPGAs field programmable gate arrays
  • service distribution engine 300 is illustrated as having distinct functional blocks, such an illustration is for descriptive purposes and does not limit service distribution engine 300 to a particular hardware architecture. Functions of service distribution engine 300 may be realized using any combination of hardware, firmware and/or software implementations.
  • System memory 310 may be described as a non-transitory or tangible computer-readable storage medium. In some examples, system memory 310 may provide temporary and/or long-term storage. In some examples, system memory 310 or portions thereof may be described as non-volatile memory and in other examples portions of system memory 310 may be described as volatile memory. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), and static random access memories (SRAM). Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. System memory 310 may be configured to store information that may be used by service distribution engine 300 during operation.
  • RAM random access memories
  • DRAM dynamic random access memories
  • SRAM static random access memories
  • EPROM electrically programmable memories
  • EEPROM electrically erasable and programmable
  • system memory 310 may include individual memory elements included within each of transport package generator 302, transport/network packet generator 304, link layer packet generator 306, and frame builder and waveform generator 308.
  • system memory 310 may include one or more buffers (e.g., First-in First-out (FIFO) buffers) configured to store data for processing by a component of service distribution engine 300.
  • FIFO First-in First-out
  • Transport package generator 302 may be configured to receive data and generate a transport package according to a defined application transport package structure.
  • transport package generator 302 may be configured to receive one or more segments of encoded video data and generate a package based on MMTP.
  • Transport/network packet generator 304 may be configured to receive a transport package and encapsulate the transport package into corresponding transport layer packets (e.g., UDP, Transport Control Protocol (TCP), etc.) and network layer packets (e.g., IPv4, IPv6, compressed IP packets, etc.).
  • transport package generator 302 may be configured to receive data or documents including applications and/or data elements that describe applications and generate a package or a similar data structure based on the received data according to one or more techniques of this disclosure.
  • Link layer packet generator 306 may be configured to receive network packets and generate packets according to a defined link layer packet structure (e.g., an ATSC 3.0 link layer packet structure).
  • Frame builder and waveform generator 308 may be configured to receive one or more link layer packets and output symbols (e.g., OFDM symbols) arranged in a frame structure.
  • a frame structure may include a bootstrap, a preamble, and a data payload including one or more PLPs.
  • a frame may be referred to as a physical layer frame (PHY-Layer frame).
  • a bootstrap may act as a universal entry point for a waveform.
  • a preamble may include so-called Layer-1 signaling (L1-signaling).
  • L1-signaling may provide the necessary information to configure physical layer parameters.
  • Frame builder and waveform generator 308 may be configured to produce a signal for transmission within one or more types of RF channels: a single 6 MHz channel, a single 7 MHz channel, a single 8 MHz channel, a single 11 MHz channel, and bonded channels including any two or more separate single channels (e.g., a 14 MHz channel including a 6 MHz channel and an 8 MHz channel).
  • Frame builder and waveform generator 308 may be configured to insert pilots and reserved tones for channel estimation and/or synchronization. In one example, pilots and reserved tones may be defined according to an orthogonal frequency division multiplexing (OFDM) symbol and sub-carrier frequency map.
  • OFDM orthogonal frequency division multiplexing
  • Frame builder and waveform generator 308 may be configured to generate an OFDM waveform by mapping OFDM symbols to sub-carriers. It should be noted that in some examples, frame builder and waveform generator 308 may be configured to support layer division multiplexing. Layer division multiplexing may refer to super-imposing multiple layers of data on the same RF channel (e.g., a 6 MHz channel). Typically, an upper layer refers to a core (e.g., more robust) layer supporting a primary media service and a lower layer refers to a high data rate layer supporting enhanced services. For example, an upper layer could support basic High Definition video content and a lower layer could support enhanced Ultra-High Definition video content.
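  • The following is a conceptual sketch of the FIG. 3 processing chain; the packetization shown is placeholder logic only, not ATSC 3.0-conformant framing, and all function names are hypothetical.

```python
from typing import List

def generate_transport_package(segments: List[bytes]) -> bytes:
    # Transport package generator 302: e.g., package media segments per MMTP.
    return b"".join(segments)

def generate_transport_network_packets(package: bytes, mtu: int = 1500) -> List[bytes]:
    # Transport/network packet generator 304: encapsulate into transport
    # (e.g., UDP) and network (e.g., IP) sized units.
    return [package[i:i + mtu] for i in range(0, len(package), mtu)]

def generate_link_layer_packets(packets: List[bytes]) -> List[bytes]:
    # Link layer packet generator 306: abstract packets into a generic format.
    return [b"\x00" + p for p in packets]  # placeholder 1-byte header

def build_frame(link_packets: List[bytes]) -> bytes:
    # Frame builder and waveform generator 308: arrange the payload into a
    # physical layer frame (bootstrap, preamble, and PLPs omitted here).
    return b"".join(link_packets)

frame = build_frame(
    generate_link_layer_packets(
        generate_transport_network_packets(
            generate_transport_package([b"segment-1", b"segment-2"]))))
print(len(frame))
```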
  • Table 4 provides an example of an application service table including data elements that describe accessibility of an application. It should be noted that similar to Table 3 above, Table 4 is a generalized application service table and includes a first level of child elements under applicationDescriptor, i.e., type, controlCode, visibility, serviceBound, priority, version, applicationAccessibility, mhpVersion, and storageCapabilities. Thus, Table 4 represents an example of an application service table including data elements that describe accessibility of an application as part of the general properties of an application.
  • applicationAccessibility may be included as an element at the same level as appName, applicationIdentifier, applicationDescriptor, applicationUsageDescriptor, applicationBoundaryDescriptor, applicationTransport, applicationLocation, and applicationSpecificDescriptor.
  • @language attribute includes a language identifier as defined by Internet Engineering Task Force (IETF) Best Current Practice (BCP) 47.
  • BCP is a persistent name for a series of IETF RFCs whose numbers change as they are updated. The latest RFC describing language tag syntax is RFC 5646, Tags for the Identification of Languages, which is incorporated by reference herein, and it obsoletes the older RFCs 4646, 3066 and 1766.
  • in some examples, a specific version of IETF BCP 47, identified by month and year of publication, may be used.
  • in some examples, a particular RFC, such as RFC 5646, may be used instead of IETF BCP 47 for defining the values that the @language attribute may take on, to facilitate lower receiver complexity.
  • the length of the value of xml:lang is variable.
  • IETF BCP 47 may be used to represent a language of a caption service in a more efficient manner than ISO 639.2/B and ISO 8859-1.
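  • As an illustrative sketch only: a BCP 47 (RFC 5646) language tag such as "en-US" could be carried in an @language attribute as shown below; where the attribute actually sits in the schema is not reproduced here, and attaching it to applicationAccessibility is an assumption.

```python
import xml.etree.ElementTree as ET

# Hypothetical placement of an @language attribute carrying a BCP 47 tag.
acc = ET.Element("applicationAccessibility", {"language": "en-US", "Level": "AA"})
print(ET.tostring(acc, encoding="unicode"))
```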
  • Table 6 represents an example where applicationAccessibility is based on defined levels of conformance and Table 7 represents an example where applicationAccessibility is based on defined groups. Accessibility levels of an application may be based on an established accessibility guideline.
  • An example of an established accessibility guideline includes the “Web Content Accessibility Guidelines (WCAG) 2.0”, 11 December 2008, published by the W3C (hereinafter “WCAG 2.0”), which is incorporated by reference in its entirety.
  • WCAG 2.0 includes a wide range of recommendations for making Web content more accessible.
  • WCAG 2.0 anticipates that, if authors of Web content follow the guidelines included therein, content will be made more accessible to a wider range of people with disabilities, including blindness and low vision, deafness and hearing loss, learning disabilities, cognitive limitations, limited movement, speech disabilities, photosensitivity, and combinations thereof, and further that content may be made more usable for users in general.
  • WCAG 2.0 provides the following 12 basic guidelines:
  • Guideline 1.1 Text Alternatives: Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols or simpler language.
  • Guideline 1.2 Time-based Media: Provide alternatives for time-based media.
  • Guideline 1.3 Adaptable: Create content that can be presented in different ways (for example simpler layout) without losing information or structure.
  • Guideline 1.4 Distinguishable: Make it easier for users to see and hear content including separating foreground from background.
  • Guideline 2.1 Keyboard Accessible: Make all functionality available from a keyboard.
  • Guideline 2.2 Enough Time: Provide users enough time to read and use content.
  • Guideline 2.3 Seizures: Do not design content in a way that is known to cause seizures.
  • Guideline 2.4 Navigable: Provide ways to help users navigate, find content, and determine where they are.
  • Guideline 3.1 Readable: Make text content readable and understandable.
  • Guideline 3.2 Predictable: Make Web pages appear and operate in predictable ways.
  • Guideline 3.3 Input Assistance: Help users avoid and correct mistakes.
  • Guideline 4.1 Compatible: Maximize compatibility with current and future user agents, including assistive technologies.
  • WCAG 2.0 includes success criteria for each guideline.
  • a guideline may be considered met when success criteria is satisfied.
  • WCAG 2.0 defines the following conformance levels: A (lowest), AA, AAA (highest) for success criteria.
  • a level A conformance may occur when a web page satisfies a success criterion by having titles that describe topic or purpose
  • a level AA conformance may occur when a web page, in addition to satisfying success criteria associated with level A, satisfies a success criterion by having headings and labels that describe topic or purpose
  • a level AAA conformance may occur when a web page, in addition to satisfying success criteria associated with level A and level AA, satisfies a success criterion by having section headings used to organize the content.
  • the working group has also documented a wide variety of informative techniques.
  • the informative techniques fall into two categories: (1) those that are sufficient for meeting the success criteria and (2) those that are advisory.
  • the advisory techniques typically go beyond what is required by the individual success criteria and allow authors to better address the guidelines.
  • Some advisory techniques address accessibility barriers that are not covered by testable success criteria.
  • common known failures have also been documented by the W3C, for example in “Sufficient and Advisory Techniques in Understanding WCAG 2.0.” It should be noted that although the techniques described here are described with respect to WCAG 2.0, the techniques described herein are generally applicable to future versions of the Web Content Accessibility Guidelines published by the W3C. For example, the techniques described here are applicable to future Web Content Accessibility Guidelines including any combination of additional and/or redacted guidelines, success criteria, and conformance levels.
  • Table 5 is a table summarizing success criteria and levels of conformance for WCAG 2.0. It should be noted that the success criteria listed under the Level A, Level AA, and Level AAA columns are general descriptions; specific details of each success criterion can be found in the WCAG 2.0 document. In the example in Table 5, for Level A conformance all of the goals/success criteria identified under the Level A column must be met. For Level AA conformance all of the goals/success criteria identified under the Level A column and the Level AA column must be met. For Level AAA conformance all of the goals/success criteria identified under the Level A column, the Level AA column, and the Level AAA column must be met. In yet another example, one may consider conformance for a guideline. For example, for Guideline 2.4, respective success criteria are provided for each level of conformance. For example, Level AAA conformance occurs if the 2.4.X success criteria in the Level A and Level AA columns are satisfied as well as the 2.4.X success criteria in the Level AAA column.
  • Content including an application enhancement may be classified based on which of the properties listed in Table 5 are included in the content.
  • an application may include the 25 properties included in the Level A column of Table 5, another application may additionally include the 13 properties included in the Level AA column of Table 5, and another application may additionally include the 23 properties included in the Level AAA column.
  • each application may be respectively described as having a Level A level of conformance (i.e., the application satisfies all of the Level A success criteria), a Level AA level of conformance (i.e., the application satisfies all of the Level A and Level AA success criteria), and a Level AAA level of conformance (i.e., the application satisfies all of the Level A, Level AA, and Level AAA success criteria).
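  • A sketch of the cumulative conformance logic described above follows: a level is met only if its success criteria and all lower-level success criteria are satisfied. The criterion identifiers used are a small hypothetical subset standing in for the full Table 5 sets.

```python
LEVEL_A = {"1.1.1", "2.1.1", "2.1.2"}   # stand-ins for the 25 Level A criteria
LEVEL_AA = {"2.4.6"}                    # stand-ins for the 13 Level AA criteria
LEVEL_AAA = {"2.4.10"}                  # stand-ins for the 23 Level AAA criteria

def conformance_level(satisfied: set) -> str:
    if not LEVEL_A <= satisfied:        # Level A criteria must all be met
        return "none"
    if not LEVEL_AA <= satisfied:       # AA additionally requires the AA set
        return "A"
    if not LEVEL_AAA <= satisfied:      # AAA additionally requires the AAA set
        return "AA"
    return "AAA"

print(conformance_level({"1.1.1", "2.1.1", "2.1.2", "2.4.6"}))  # -> "AA"
```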
  • levels of conformance may be provided on a respective guideline basis.
  • guidelines may be grouped and conformance may be provided for each group. For example, one group may be related to media and another group may be related to text. In another example, one group may be related to the visually impaired, another group may be related to the hearing impaired, and another group may be related to the mobility impaired.
  • Table 5 may be modified to include fewer or more success criteria. Further, it should be noted that in some examples, success criterion included in one column in Table 5 may be moved to another column. Such modifications may be useful in cases where applications providing enhancements associated with a primary service have more particular accessibility requirements than web content not associated with another service.
  • the media may have associated accessibility metadata information that may be signaled along with the media. For example, an audio track may be identified as intended for descriptive video service, a subtitle track may be identified as captions, a video track may be identified as being for sign language, a video track may be identified as including burnt-in captions, or a video track may be identified as including sign language video (e.g., using picture-in-picture). In such an event, related guidelines in Table 5 may be ignored/removed when evaluating conformance.
  • applicationAccessibility may signal accessibility information based on the example levels of conformance illustrated in Table 5.
  • Table 6 represents an example of an XML representation of applicationAccessibility that may be used to signal a level of accessibility based on the example levels of conformance illustrated in Table 5.
  • attribute @name indicates the name of an established accessibility guideline. It should be noted that in some examples, an established accessibility guideline may include a guideline other than WCAG 2.0. As further illustrated in Table 6, the @Level attribute may be a string specifying one of Level A, Level AA, or Level AAA conformance. Further, in one example, the @Level attribute may signal a level of conformance using an unsignedByte. It should be noted that in the case where an unsignedByte is used, values may be provided for partial levels of conformance. For example:
  • a value of 0 may indicate “A”
  • a value of 1 may indicate that 6 of the 13 criteria in the Level AA column are satisfied, i.e., “partial AA”
  • a value of 2 may indicate “AA”
  • a value of 3 may indicate “AAA”.
  • in the case where @Level is an unsignedByte, other partial levels of conformance may be indicated using the 256 values available using the unsignedByte dataType. That is, any number of combinations of the 61 success criteria summarized in Table 5 may be signalled as satisfied (a sketch of this coding is provided below). As illustrated in Table 6, an Exception element enables success criteria that are not satisfied to be explicitly signalled.
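  • The following is a minimal sketch of the unsignedByte coding described above, assuming the applicationAccessibility element and the @name and @Level attribute names from Table 6 (the table itself is not reproduced here), so the exact spellings are illustrative:
    <!-- Hypothetical instance: Level AA conformance coded as the unsignedByte value 2
         (0 = "A", 1 = "partial AA", 2 = "AA", 3 = "AAA") -->
    <applicationAccessibility name="WCAG 2.0" Level="2"/>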
  • Table 6 provides multiple possible ways to signal whether success criteria included in Table 5 are satisfied.
  • in some of these signalling variants, the Exception element may be redundant and optional.
  • in the case where @Level is a string, other partial levels of conformance may be indicated using a “0” or “1” corresponding to each of the 61 success criteria summarized in Table 5, concatenated together, where “0” may indicate that it is unknown whether the corresponding success criterion is met and “1” may indicate that the corresponding success criterion is met.
  • in another example, “0” may instead indicate that the corresponding success criterion is not met.
  • in one example, a value of “2” may also be used to form the concatenated string and indicate that the corresponding success criterion is not met (a sketch of the concatenated string follows below).
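  • The following is a minimal sketch of the concatenated-string coding described above, again assuming the element and attribute names from Table 6; the 61-character value is fabricated purely to illustrate the format (one character per success criterion of Table 5):
    <!-- Hypothetical instance: the first 36 criteria are signalled as satisfied ("1");
         the remaining 25 are unknown ("0") -->
    <applicationAccessibility name="WCAG 2.0"
        Level="1111111111111111111111111111111111110000000000000000000000000"/>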
  • an application may include multiple pages (e.g., multiple HTML pages).
  • PagesWithUnkownAccessibiltyLevel element in Table 6 enables pages with unknown levels of accessibility to be signaled. It should be noted that in other examples, any suitable data structure (e.g., combination of binary values) may be used to communicate the accessibility information for one or more pages associated with an application.
  • FIG. 4A is a computer program listing illustrating an example of signalling a level of accessibility according to one or more techniques of this disclosure.
  • in the example illustrated in FIG. 4A, an application signaling table specifies that a corresponding application has a level of accessibility corresponding to a Level AA level of conformance, as provided in Table 5, with the exception that the application does not satisfy the 2.1.1 Keyboard criterion or the 2.1.2 No Keyboard Trap criterion, as provided in WCAG 2.0. A hypothetical fragment consistent with this description is sketched below.
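  • The following is a hypothetical fragment consistent with the description of FIG. 4A above (FIG. 4A itself is not reproduced here); the spelling of the Exception element and the form of its content are assumptions:
    <!-- Hypothetical instance: Level AA conformance with two explicit exceptions -->
    <applicationAccessibility name="WCAG 2.0" Level="AA">
      <Exception>2.1.1 Keyboard</Exception>
      <Exception>2.1.2 No Keyboard Trap</Exception>
    </applicationAccessibility>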
  • a receiver device may be configured to compare the accessibility metadata (e.g., level of accessibility and exceptions) with preferences set by the user and take some action (e.g., block an application from running, allow an application to run, etc.).
  • FIG. 4B is a computer program listing illustrating an example of signalling a level of accessibility according to one or more techniques of this disclosure. In the example illustrated in FIG. 4B, an @Guideline attribute is included in the document.
  • it may be useful for a service provider to classify an application as: accessible to visually impaired, accessible to color blind, accessible to blind, accessible to low vision, accessible to hearing impaired, accessible to deaf, accessible to hard of hearing, accessible to mobility impaired, and/or as enabling assistive technologies.
  • Each of visually impaired, color blind, blind, low vision, hearing impaired, deaf, hard of hearing, mobility impaired, and requiring assistive technologies may be referred to as a defined group.
  • accessibility information may be signalled based on defined groups.
  • Table 7 represents an example of an XML representation of applicationAccessibility that may be used to signal accessibility information based on defined groups. It should be noted that in other examples, Table 7 may include fewer or more defined groups.
  • VisuallyImpairedAccessible, HearingImpairedAccessible, and MobilityImpairedAccessible may be associated with an unsignedByte value indicating one of the following: 0: the application is not accessible to the defined group; 1: the application is accessible to the defined group; or 2: it is unknown whether the application is accessible to the defined group.
  • the default value inferred by a receiver when an element is not present may be 0. In another example, the default value inferred by a receiver when an element is not present may be 1. In yet another example, the default value inferred by a receiver when an element is not present may be 2.
  • accessibility for a defined group may be based on success criteria. For example, for VisuallyImpairedAccessible to indicate an application is accessible, one or more success criteria provided in Guideline 1.1 Text Alternatives in WCAG 2.0 may be required to be satisfied. It should be noted that accessibility with respect to a particular group may be based on any number of combinations of success criteria, including success criteria from multiple guidelines. A sketch of this defined-group signalling is provided below.
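  • The following is a minimal sketch of defined-group signalling, assuming the element names mentioned above for Table 7 and the 0/1/2 coding described for each defined group (Table 7 itself is not reproduced here):
    <!-- Hypothetical instance: accessible to the visually impaired (1),
         not accessible to the hearing impaired (0),
         unknown for the mobility impaired (2) -->
    <applicationAccessibility>
      <VisuallyImpairedAccessible>1</VisuallyImpairedAccessible>
      <HearingImpairedAccessible>0</HearingImpairedAccessible>
      <MobilityImpairedAccessible>2</MobilityImpairedAccessible>
    </applicationAccessibility>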
  • a device configured to transmit an application signaling table including one or more elements included in either Table 6 or Table 7 represents an example of a device configured to signal accessibility characteristics associated with an application associated with a video and/or audio presentation.
  • the different application accessibility signaling (e.g., Table 6, Table 7) and the variants described herein may be included in metadata describing another file containing an application.
  • an application may be included in a future version of the International Organization for Standardization (ISO) Base Media File Format (BMFF) file (ISO/IEC 14496-12, ISO Base Media File Format) and the application accessibility signaling may be included in a Media Presentation Description (MPD).
  • the MPD is described in ISO/IEC 23009-1:2014, “Information technology -- Dynamic adaptive streaming over HTTP (DASH) -- Part 1: Media presentation description and segment formats”, which is incorporated by reference herein in its entirety. One hypothetical placement within an MPD is sketched below.
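  • The following is a rough sketch of how such signaling might be carried in an MPD, reusing DASH’s generic Accessibility descriptor; the schemeIdUri shown is a placeholder invented for illustration, not a registered scheme, and this placement is an assumption rather than anything fixed by this disclosure:
    <!-- Hypothetical: application accessibility carried as a DASH Accessibility descriptor -->
    <AdaptationSet mimeType="application/octet-stream">
      <Accessibility schemeIdUri="urn:example:app-accessibility:wcag20" value="AA"/>
    </AdaptationSet>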
  • FIG. 5 is a block diagram illustrating an example of a receiver device that may implement one or more techniques of this disclosure. That is, receiver device 500 may be configured to parse a signal based on the semantics described above with respect to Table 6 or Table 7 and execute applications based on the parsed signal. Receiver device 500 is an example of a computing device that may be configured to receive data from a communications network and allow a user to access multimedia content. In the example illustrated in FIG. 5, receiver device 500 is configured to receive data via a television network, such as, for example, television service network 204 described above. Further, in the example illustrated in FIG. 5, receiver device 500 is configured to send and receive data via a wide area network. It should be noted that in other examples, receiver device 500 may be configured to simply receive data through a television service network 204. The techniques described herein may be utilized by devices configured to communicate using any and all combinations of communications networks.
  • receiver device 500 includes central processing unit(s) 502, system memory 504, system interface 510, data extractor 512, audio decoder 514, audio output system 516, video decoder 518, display system 520, I/O device(s) 522, and network interface 524.
  • system memory 504 includes operating system 506 and applications 508.
  • Each of central processing unit(s) 502, system memory 504, system interface 510, data extractor 512, audio decoder 514, audio output system 516, video decoder 518, display system 520, I/O device(s) 522, and network interface 524 may be interconnected (physically, communicatively, and/or operatively) for inter-component communications and may be implemented as any of a variety of suitable circuitry, such as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware or any combinations thereof.
  • although receiver device 500 is illustrated as having distinct functional blocks, such an illustration is for descriptive purposes and does not limit receiver device 500 to a particular hardware architecture. Functions of receiver device 500 may be realized using any combination of hardware, firmware and/or software implementations.
  • CPU(s) 502 may be configured to implement functionality and/or process instructions for execution in receiver device 500.
  • CPU(s) 502 may include single and/or multi-core central processing units.
  • CPU(s) 502 may be capable of retrieving and processing instructions, code, and/or data structures for implementing one or more of the techniques described herein. Instructions may be stored on a computer readable medium, such as system memory 504.
  • System memory 504 may be described as a non-transitory or tangible computer-readable storage medium. In some examples, system memory 504 may provide temporary and/or long-term storage. In some examples, system memory 504 or portions thereof may be described as non-volatile memory and in other examples portions of system memory 504 may be described as volatile memory. System memory 504 may be configured to store information that may be used by receiver device 500 during operation, including for example user preferences. System memory 504 may be used to store program instructions for execution by CPU(s) 502 and may be used by programs running on receiver device 500 to temporarily store information during program execution. Further, in the example where receiver device 500 is included as part of a digital video recorder, system memory 504 may be configured to store numerous video files.
  • Applications 508 may include applications implemented within or executed by receiver device 500 and may be implemented or contained within, operable by, executed by, and/or be operatively/communicatively coupled to components of receiver device 500. Applications 508 may include instructions that may cause CPU(s) 502 of receiver device 500 to perform particular functions. Applications 508 may include algorithms which are expressed in computer programming statements, such as for-loops, while-loops, if-statements, do-loops, etc. Applications 508 may be developed using a specified programming language. Examples of programming languages include Java™, Jini™, C, C++, Objective-C, Swift, Perl, Python, PHP, UNIX Shell, Visual Basic, and Visual Basic Script.
  • in one example, where receiver device 500 includes a smart television, applications may be developed by a television manufacturer or a broadcaster.
  • applications 508 may execute in conjunction with operating system 506. That is, operating system 506 may be configured to facilitate the interaction of applications 508 with CPU(s) 502 and other hardware components of receiver device 500.
  • Operating system 506 may be an operating system designed to be installed on set-top boxes, digital video recorders, televisions, and the like. It should be noted that techniques described herein may be utilized by devices configured to operate using any and all combinations of software architectures.
  • System interface 510 may be configured to enable communications between components of receiver device 500.
  • system interface 510 comprises structures that enable data to be transferred from one peer device to another peer device or to a storage medium.
  • system interface 510 may include a chipset supporting Accelerated Graphics Port (AGP) based protocols, Peripheral Component Interconnect (PCI) bus based protocols, such as, for example, the PCI Express™ (PCIe) bus specification, which is maintained by the Peripheral Component Interconnect Special Interest Group, or any other form of structure that may be used to interconnect peer devices (e.g., proprietary bus protocols).
  • receiver device 500 is configured to receive and, optionally, send data via a television service network.
  • a television service network may operate according to a telecommunications standard.
  • a telecommunications standard may define communication properties (e.g., protocol layers), such as, for example, physical signaling, addressing, channel access control, packet properties, and data processing.
  • data extractor 512 may be configured to extract video, audio, and data from a signal.
  • a signal may be defined according to, for example, aspects of DVB standards, ATSC standards, ISDB standards, DTMB standards, DMB standards, and DOCSIS standards.
  • Data extractor 512 may be configured to extract video, audio, and data, from a signal generated by service distribution engine 300 described above. That is, data extractor 512 may operate in a reciprocal manner to service distribution engine 300. Further, data extractor 512 may be configured to parse link layer packets based on any combination of one or more of the structures described above.
  • Audio decoder 514 may be configured to receive and process audio packets.
  • audio decoder 514 may include a combination of hardware and software configured to implement aspects of an audio codec. That is, audio decoder 514 may be configured to receive audio packets and provide audio data to audio output system 516 for rendering.
  • Audio data may be coded using multi-channel formats such as those developed by Dolby and Digital Theater Systems. Audio data may be coded using an audio compression format. Examples of audio compression formats include Motion Picture Experts Group (MPEG) formats, Advanced Audio Coding (AAC) formats, DTS-HD formats, and Dolby Digital (AC-3) formats.
  • Audio output system 516 may be configured to render audio data.
  • audio output system 516 may include an audio processor, a digital-to-analog converter, an amplifier, and a speaker system.
  • a speaker system may include any of a variety of speaker systems, such as headphones, an integrated stereo speaker system, a multi-speaker system, or a surround sound system.
  • Video decoder 518 may be configured to receive and process video packets.
  • video decoder 518 may include a combination of hardware and software used to implement aspects of a video codec.
  • video decoder 518 may be configured to decode video data encoded according to any number of video compression standards, such as ITU-T H.262 or ISO/IEC MPEG-2 Visual, ISO/IEC MPEG-4 Visual, ITU-T H.264 (also known as ISO/IEC MPEG-4 Advanced Video Coding (AVC)), and High-Efficiency Video Coding (HEVC).
  • Display system 520 may be configured to retrieve and process video data for display. For example, display system 520 may receive pixel data from video decoder 518 and output data for visual presentation.
  • display system 520 may be configured to output graphics in conjunction with video data, e.g., graphical user interfaces.
  • Display system 520 may comprise one of a variety of display devices such as a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device capable of presenting video data to a user.
  • a display device may be configured to display standard definition content, high definition content, or ultra-high definition content.
  • I/O device(s) 522 may be configured to receive input and provide output during operation of receiver device 500. That is, I/O device(s) 522 may enable a user to select multimedia content to be rendered. Input may be generated from an input device, such as, for example, a push-button remote control, a device including a touch-sensitive screen, a motion-based input device, an audio-based input device, or any other type of device configured to receive user input. I/O device(s) 522 may be operatively coupled to receiver device 500 using a standardized communication protocol, such as, for example, Universal Serial Bus (USB) protocol, Bluetooth, or ZigBee, or a proprietary communications protocol, such as, for example, a proprietary infrared communications protocol.
  • Network interface 524 may be configured to enable receiver device 500 to send and receive data via a local area network and/or a wide area network.
  • Network interface 524 may include a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device configured to send and receive information.
  • Network interface 524 may be configured to perform physical signaling, addressing, and channel access control according to the physical and Media Access Control (MAC) layers utilized in a network.
  • Receiver device 500 may be configured to parse a signal generated according to any of the techniques described above.
  • Receiver device 500 may be configured to receive and parse any of the application signaling tables described above and further store user preferences, including for example accessibility preferences.
  • Receiver device 500 may further cause an action associated with an application to occur based on accessibility information included in an application signaling table and user preference information.
  • receiver device 500 represents an example of a device configured to parse a syntax element indicating accessibility of the application, and perform an action based on the parsed syntax element.
  • Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol.
  • Computer-readable media generally may correspond to (1) tangible computer-readable storage media which is non-transitory or (2) a communication medium such as a signal or carrier wave.
  • Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure.
  • a computer program product may include a computer-readable medium.
  • such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • any connection is properly termed a computer-readable medium.
  • For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • Accordingly, the term “processor,” as used herein, may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described herein.
  • the functionality described herein may be provided within dedicated hardware and/or software modules configured for encoding and decoding, or incorporated in a combined codec. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • the techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set).
  • Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a codec hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
  • each functional block or various features of the base station device and the terminal device used in each of the aforementioned embodiments may be implemented or executed by circuitry, which is typically an integrated circuit or a plurality of integrated circuits.
  • the circuitry designed to execute the functions described in the present specification may comprise a general-purpose processor, a digital signal processor (DSP), an application specific or general application integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic devices, discrete gates or transistor logic, or a discrete hardware component, or a combination thereof.
  • the general-purpose processor may be a microprocessor, or alternatively, the processor may be a conventional processor, a controller, a microcontroller or a state machine.
  • the general-purpose processor or each circuit described above may be configured by a digital circuit or may be configured by an analogue circuit. Further, if a technology for making integrated circuits that supersedes present integrated circuits emerges due to advancement of semiconductor technology, an integrated circuit produced by that technology may also be used.
  • a method for signalling information associated with an application associated with a video or audio service comprises signalling a syntax element indicating a level of conformance associated with the application, and signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
  • a method for signalling information associated with an application associated with a video or audio service comprises, for one or more defined groups, signalling a respective syntax element indicating accessibility information of the application with respect to the defined groups.
  • a device for signalling information associated with an application associated with a video or audio service comprises one or more processors configured to signal a syntax element indicating a level of conformance associated with the application, and signal zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
  • a device for signalling information associated with an application associated with a video or audio service comprises one or more processors configured to signal, for one or more defined groups, a respective syntax element indicating accessibility information of the application with respect to the defined groups.
  • an apparatus for signalling information associated with an application associated with a video or audio service comprises means for signalling a syntax element indicating a level of conformance associated with the application, and means for signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
  • an apparatus for signalling information associated with an application associated with a video or audio service comprises, for one or more defined groups, means for signalling a respective syntax element indicating accessibility information of the application with respect to the defined groups.
  • a non-transitory computer-readable storage medium comprises instructions stored thereon that upon execution cause one or more processors of a device to signal a syntax element indicating a level of conformance associated with an application, and signal zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
  • a non-transitory computer-readable storage medium comprises instructions stored thereon that upon execution cause one or more processors of a device to signal, for one or more defined groups, a respective syntax element indicating accessibility information of an application with respect to the defined groups.
  • a method for parsing information associated with an application associated with a video or audio service comprises parsing a syntax element indicating accessibility of the application, and performing an action based on the parsed syntax element.
  • a device for parsing information associated with an application associated with a video or audio service comprises one or more processors configured to parse a syntax element indicating accessibility of the application, and perform an action based on the parsed syntax element.
  • an apparatus for parsing information associated with an application associated with a video or audio service comprises means for parsing a syntax element indicating accessibility of the application, and means for performing an action based on the parsed syntax element.
  • a non-transitory computer-readable storage medium comprises instructions stored thereon that upon execution cause one or more processors of a device to parse a syntax element indicating accessibility of the application, and perform an action based on the parsed syntax element.


Abstract

A device may be configured to signal accessibility characteristics associated with an application associated with a video and/or audio presentation. In some examples, accessibility characteristics may include levels of accessibility based on an established accessibility guideline. A device may signal a level of conformance. A device may signal accessibility with respect to a defined group. A receiver device may parse a syntax element indicating accessibility of an application and perform an action based on the accessibility of an application.

Description

SYSTEMS AND METHODS FOR SIGNALLING APPLICATION ACCESSIBILITY
The present disclosure relates to the field of interactive television.
<Cross Reference to related applications>
This Non-provisional application claims priority under 35 U.S.C. § 119 on U.S. Provisional Patent Application No. 62/262,326, filed on December 2, 2015, the entire contents of which are hereby incorporated by reference.
Digital media playback capabilities may be incorporated into a wide range of devices, including digital televisions, including so-called “smart” televisions, set-top boxes, laptop or desktop computers, tablet computers, digital recording devices, digital media players, video gaming devices, cellular phones, including so-called “smart” phones, dedicated video streaming devices, and the like. Digital media content (e.g., video and audio programming) may originate from a plurality of sources including, for example, over-the-air television providers, satellite television providers, cable television providers, online media service providers, including, so-called streaming service providers, and the like. Digital media content may be delivered over packet-switched networks, including bidirectional networks, such as Internet Protocol (IP) networks and unidirectional networks, such as digital broadcast networks.
Digital media content may be transmitted from a source to a receiver device (e.g., a digital television or a smart phone) according to a transmission standard. Examples of transmission standards include Digital Video Broadcasting (DVB) standards, Integrated Services Digital Broadcasting (ISDB) standards, and standards developed by the Advanced Television Systems Committee (ATSC), including, for example, the ATSC 2.0 standard. The ATSC is currently developing the so-called ATSC 3.0 suite of standards. The ATSC 3.0 suite of standards seeks to support a wide range of diverse video based services through diverse delivery mechanisms. For example, the ATSC 3.0 suite of standards seeks to support broadcast video delivery, so-called broadcast streaming/file download video delivery, so-called broadband streaming/file download video delivery, and combinations thereof (i.e., “hybrid services”). An example of a hybrid video service contemplated for the ATSC 3.0 suite of standards includes a receiver device receiving an over-the-air video broadcast (e.g., through a unidirectional transport) and receiving a synchronized closed caption presentation from an online media service provider through a packet switched network (e.g., through a bidirectional transport). A synchronized closed caption presentation may represent an example of an application based enhancement, which may be referred to as an app-based enhancement or feature. Other examples of application based enhancements include targeted advertising and enhancements providing interactive viewing experiences. Current techniques for signalling information associated with and/or properties of application based enhancements may be less than ideal.
In general, this disclosure describes techniques for signalling (or signaling) information associated with application based enhancements. In particular, this disclosure describes techniques for signalling accessibility characteristics associated with an application associated with a video and/or audio presentation. In some examples, accessibility characteristics may include levels of accessibility based on an established accessibility guideline. It should be noted that although in some examples the techniques of this disclosure are described with respect to ATSC standards, including those currently under development, the techniques described herein are generally applicable to any transmission standard. For example, the techniques described herein are generally applicable to any of DVB standards, ISDB standards, ATSC Standards, Digital Terrestrial Multimedia Broadcast (DTMB) standards, Digital Multimedia Broadcast (DMB) standards, Hybrid Broadcast and Broadband Television (HbbTV) standard, World Wide Web Consortium (W3C) standards, Universal Plug and Play (UPnP) standards, and other video encoding standards.
According to one example of the disclosure, a method for signalling information associated with an application associated with a video or audio service, comprises signalling a syntax element indicating a level of conformance associated with the application, and signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
A method for signalling information associated with an application associated with a video or audio service, comprises for one or more defined groups, signalling a respective syntax element indicating accessibility information of the application with respect to defined groups.
According to one example of the disclosure, a method for parsing information associated with an application associated with a video or audio service, comprises parsing a syntax element indicating accessibility of the application, and performing an action based on the parsed syntax element.
The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
FIG. 1 is a conceptual diagram illustrating an example of content delivery protocol model according to one or more techniques of this disclosure. FIG. 2 is a block diagram illustrating an example of a system that may implement one or more techniques of this disclosure. FIG. 3 is a block diagram illustrating an example of a service distribution engine that may implement one or more techniques of this disclosure. FIG. 4A is a computer program listing illustrating an example of signalling accessibility information according to one or more techniques of this disclosure. FIG. 4B is a computer program listing illustrating an example of signalling accessibility information according to one or more techniques of this disclosure. FIG. 5 is a block diagram illustrating an example of a receiver device that may implement one or more techniques of this disclosure.
Zero or more application-based features or enhancements may be associated with digital media content transmitted from a source to a receiver device. It should be noted that in some examples digital media content transmitted from a source to a receiver device and/or enhancements associated therewith may be referred to as a service. An application may enable enhanced presentations for a primary media service (e.g., a television program, a movie, an event, or the like). For example, an application may enable an enhanced audio presentation (e.g., alternative language, commentary etc.), an enhanced video presentation (e.g., secondary views, etc.), and caption services (e.g., subtitles) to be presented in conjunction with an over-the-air television broadcast. Further, applications may enable a programmatically controlled user experience. For example, an application may enable targeted advertisements based on a user’s viewing history to be presented, enable interactive gaming, and the like. It should be noted that an application may enable a so-called second screen enhancement. That is, for example, a primary media service may be rendered on a primary display device (e.g., a digital television) and a corresponding application (e.g., an interactive game) may be executed on a secondary computing device (e.g., a tablet computing device).
An application may include a downloaded application or a native application. In some examples, a downloaded application may refer to a collection of downloaded documents enabling a self-contained function. Documents may include multimedia files and files defined according to a particular programming language. Examples of programming languages include, Hypertext Markup Language (HTML), Dynamic HTML, eXtensible Markup Language (XML), Java, JavaScript, JavaScript Object Notation (JSON), and Cascading Style Sheets (CSS). In one example, a native application may refer to software stored on a receiver device configured to perform a function using downloaded data. For example, an electronic programming guide application stored on a set-top box may be configured to display television listings using electronic service guide data received from a server.
In some cases, each application and/or respective application data may be separately signaled. In this manner, creators of respective diverse applications, including applications associated with the same primary media service, may independently develop their applications. Further, applications and/or respective application data may be signaled using diverse delivery mechanisms. For example, applications, application data, and/or components thereof may be delivered over bidirectional networks or unidirectional networks. In this manner, for a particular primary media service, a plurality of diverse independently developed applications may be available through a plurality of diverse sources (e.g., a broadcaster or various servers).
Computing devices and/or transmission systems may be based on models including one or more abstraction layers, where data at each abstraction layer is represented according to particular structures, e.g., packet structures, modulation schemes, etc. An example of a model including defined abstraction layers is the so-called Open Systems Interconnection (OSI) model illustrated in FIG. 1. The OSI model defines a 7-layer stack model, including an application layer, a presentation layer, a session layer, a transport layer, a network layer, a data link layer, and a physical layer. A physical layer may generally refer to a layer at which electrical signals form digital data. For example, a physical layer may refer to a layer that defines how modulated radio frequency (RF) symbols form a frame of digital data. A data link layer, which may also be referred to as link layer, may refer to an abstraction used prior to physical layer processing at a sending side and after physical layer reception at a receiving side. It should be noted that a sending side and a receiving side are logical roles and a single device may operate as both a sending side in one instance and as a receiving side in another instance. Each of an application layer, a presentation layer, a session layer, a transport layer, and a network layer may define how data is delivered for use by a user application.
Transmission standards, including transmission standards currently under development, may include a content delivery protocol model specifying supported protocols for each layer and further defining one or more specific layer implementations. It should be noted that aspects of the ATSC 3.0 suite of standards currently under development are described in Working Drafts (WD), which may include proposed aspects for inclusion in a published (“final”) version of an ATSC 3.0 standard. For example, ATSC Candidate Standard: System Discovery and Signaling (Doc. A/321 Part 1), Doc. S32-231r4, 06 May 2015 (hereinafter “A/321”), which is incorporated by reference in its entirety, describes specific proposed aspects of an ATSC 3.0 unidirectional physical layer implementation. The proposed ATSC 3.0 unidirectional physical layer includes a physical layer frame structure including a defined bootstrap, preamble, and data payload structure including one or more physical layer pipes (PLPs). In one example, a PLP may generally refer to a logical structure including all or portions of a data stream. In one example, one or more Layer Coding Transport (LCT) channels may be included in a PLP. An LCT channel may carry individual components of a service (e.g., audio, video, or closed caption component streams) or file-based items of content associated with a service (e.g., documents included in an application enhancement).
Further, a corresponding link layer for the ATSC 3.0 unidirectional physical layer implementation is currently under development. The proposed link layer abstracts various types of data encapsulated in particular packet types (e.g., MPEG transport stream (MPEG-TS) packets, Internet Protocol (IP) packets, signalling packets, extension packets, etc.) into a single generic format for processing by the physical layer. It should be noted that in one example, an MPEG-TS may be defined as a standard container format for transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data. The ATSC 3.0 proposed link layer supports segmentation of a single upper layer packet into multiple link layer packets and concatenation of multiple upper layer packets into a single link layer packet. In addition to the unidirectional physical layer and link layer, the proposed ATSC 3.0 suite of standards also support so-called broadband physical layers and data link layers to enable support for hybrid video services. For example, it may be desirable for a primary presentation of a sporting event to be received by a receiving device through an over-the-air broadcast and an application enhancement associated with the sporting event (e.g., updated statistics) to be received from an online media service provider. It should be noted that although ATSC 3.0 uses the term “broadcast” to refer to a unidirectional over-the-air transmission physical layer, the so-called ATSC 3.0 broadcast physical layer supports video delivery through streaming or file download. As such, the term broadcast as used herein should not be used to limit the manner in which video and associated data may be transported according to one or more techniques of this disclosure.
Referring again to FIG. 1, an example content delivery protocol model is illustrated. In the example illustrated in FIG. 1, content delivery protocol model 100 is “aligned” with the 7-layer OSI model for illustration purposes. It should be noted that such an illustration should not be construed to limit implementations of the content delivery protocol model 100 or the techniques described herein. Content delivery protocol model 100 may generally correspond to the currently proposed content delivery protocol model for the ATSC 3.0 suite of standards. As described in detail below the techniques described herein may be incorporated into a system implementation of content delivery protocol model 100, or a similar content delivery protocol model, in order to enable and/or enhance functionality in an interactive video distribution environment.
As described above, documents (e.g., XML Documents) including applications, application data, and/or components thereof may be delivered over bidirectional networks or unidirectional networks. Such documents may be referred to as signaling object files or non-real time (NRT) content files and may further be referred to as metadata fragments or service signaling fragments, and the like. Referring to FIG. 1, content delivery protocol model 100 includes two mechanisms for supporting delivery of signaling object or NRT content files through the ATSC Broadcast Physical layer: (1) through MPEG Media Transport Protocol (MMTP) over User Datagram Protocol (UDP) and Internet Protocol (IP); and (2) through Real-time Object delivery over Unidirectional Transport (ROUTE) over UDP and IP. MMTP is described in ISO/IEC: ISO/IEC 23008-1, “Information technology-High efficiency coding and media delivery in heterogeneous environments-Part 1: MPEG media transport (MMT),” which is incorporated by reference herein in its entirety. It should be noted that a ROUTE session may comprise one or more LCT channels which carry as a whole, or in part, the content components that make up a service session. As further illustrated in FIG. 1, content delivery protocol model 100 supports delivery of signaling object or NRT content files through the broadband physical layer, i.e., through Hypertext Transfer Protocol (HTTP), including, for example, through HTTP Requests. In order to enable receiver devices to distinguish various diverse applications, a transmission standard may define data elements that describe applications. Data elements that describe applications may be referred to as application data elements and/or items of metadata associated with applications and may include documents (e.g., XML documents). Thus, in the example of content delivery protocol model 100, data elements that describe applications may be delivered through MMTP over UDP and IP, through ROUTE over UDP and IP, and through HTTP.
Application data elements may enable a receiver device to determine how to access a particular application (e.g., application data elements may include a Universal Resource Locator (URL)) and execute the particular application in conjunction with a primary media service (e.g., application data elements may include application control codes or activation states). Further, application data elements may enable a receiver device to determine if a particular application is supported by the receiver device. In some examples, application data elements may be signaled based on so-called application information tables (AIT) or application signaling tables (AST). European Telecommunications Standards Institute Technical Specification (ETSI TS) 102 809 V1.1.1 (2010-01), Digital Video Broadcasting (DVB); Signalling and carriage of interactive applications and services in Hybrid broadcast/broadband environments (hereinafter “ETSI TS 102 809”), which is incorporated by reference herein in its entirety, defines an application information table which may be used to generate an XML document including application data elements. That is, for example, a receiver device may receive an XML document formatted according to the schema provided in ETSI TS 102 809 and parse the XML document to determine how to access a particular application and execute the particular application.
ETSI TS 102 809 provides that an application can be completely described by: an application name which can be multilingual (element appName); a unique identification (element applicationIdentifier); a generic descriptor which is common and mandatory for all types of application (element applicationDescriptor); an application specific descriptor which will depend upon the type of the signalled application (element applicationSpecificDescriptor); an application usage descriptor which is optional (element applicationUsageDescriptor); an application boundary descriptor which is optional (element applicationBoundaryDescriptor); one or more application transport descriptors (element applicationTransport); and a simple application location descriptor (element applicationLocation). Table 1 provides a summary of an ApplicationList element and elements describing an application as provided in ETSI TS 102 809. It should be noted that for the sake of brevity Table 1 does not include all of the child elements and attributes for an ApplicationList as defined according to ETSI TS 102 809.
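The following is a rough, hypothetical skeleton of that structure (Table 1 and the full ETSI TS 102 809 schema are not reproduced here, so the containment shown, including the Application wrapper element, is an assumption based on the element names listed above):
    <!-- Hypothetical skeleton of an ETSI TS 102 809 style application description -->
    <ApplicationList>
      <Application>
        <appName>Example enhancement</appName>
        <applicationIdentifier/>
        <applicationDescriptor/>
        <applicationSpecificDescriptor/>
        <applicationUsageDescriptor/>
        <applicationBoundaryDescriptor/>
        <applicationTransport/>
        <applicationLocation/>
      </Application>
    </ApplicationList>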
Figure JPOXMLDOC01-appb-I000001
Further, it should be noted that Table 1 and other tables described herein may generally follow XML conventions. For example, one or more of the following XML conventions may apply:
Elements in an XML document may be identified by an upper-case first letter and in bold face as Element. To express that an element Element1 is contained in another element Element2, one may write Element2.Element1. If an element’s name consists of two or more combined words, camel-casing is typically used, e.g. ImportantElement. Elements may be present either exactly once, or with the minimum and maximum occurrence defined by <minOccurs> ... <maxOccurs> (as indicated by Cardinality in Tables 1-4 and Tables 6-7).
Attributes in an XML document may be identified by a lower-case first letter and are preceded by a ‘@’-sign, e.g. @attribute. To point to a specific attribute @attribute contained in an element Element, one may write Element@attribute. If an attribute’s name consists of two or more combined words, camel-casing is typically used after the first word, e.g. @veryImportantAttribute. Attributes may be assigned a status in the XML as mandatory (M), required (R), optional (O), optional with default value (OD), or conditionally mandatory (CM) (as indicated by Cardinality in Tables 1-4 and Tables 6-7).
Namespace qualification of elements and attributes may be used as per XML standards, in the form of namespace:Element or @namespace:attribute. The fully qualified namespace may be provided in the schema fragment associated with the declaration. External specifications extending the namespace of DASH (Dynamic Adaptive Streaming over HTTP) may be expected to document the element name in the semantic table with an extension namespace prefix.
Variables defined in the context of tables in this disclosure may be specifically highlighted with italics, e.g. InternalVariable.
Structures that are defined as part of the hierarchical data model may be identified by an upper-case first letter.
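For instance, using the placeholder names from the conventions above (these names are purely illustrative and do not belong to any table in this disclosure), the notations Element2.Element1 and Element1@attribute correspond to the following instance fragment:
    <!-- Element1 is contained in Element2; @attribute is an attribute of Element1 -->
    <Element2>
      <Element1 attribute="value"/>
    </Element2>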
It should be noted that other notational conventions may be used and the techniques described herein should not be limited based on example notational conventions described herein.
An application signaling table based on the application information table defined in ETSI TS 102 809 is currently proposed for inclusion in the ATSC 3.0 suite of standards for generating documents including application data elements. In particular, it is proposed that a collection of applications that constitute an application-based feature are signaled using an XML document called an application signaling table, where an application signaling table includes as its root element an ApplicationList element, as defined in ETSI TS 102 809, further extended and constrained. Further, it is proposed that for a primary service an application signaling table is included for each application-based enhancement.
Table 2 provides an example of an application signaling table containing an ApplicationList element, as defined in ETSI TS 102 809, as its root element and further including currently proposed extensions and constraints. Referring to Table 2, the following datatypes are specified in the DataType field: string, unsignedByte, unsignedShort, unsignedInt, unsignedLong, boolean, anyURI, dateTime, mpeg7:XXX, TransportProtocolDescriptorType, mhp:xxx, and ipi:xxx. In one example, each of string, unsignedByte, unsignedShort, unsignedInt, unsignedLong, boolean, anyURI, and dateTime may be defined according to XML specifications. For example, string may include a string of characters, unsignedByte may include an unsigned 8-bit integer, unsignedShort may include an unsigned 16-bit integer, unsignedInt may include an unsigned 32-bit integer, unsignedLong may include an unsigned 64-bit integer, boolean may be used to specify a boolean value (i.e., true or false), anyURI may be used to specify a universal resource identifier (URI), and dateTime may be used to specify a date and a time according to the following form “YYYY-MM-DDThh:mm:ss”, where YYYY indicates the year, MM indicates the month, DD indicates the day, T indicates the start of the required time section, hh indicates the hour, mm indicates the minute, and ss indicates the second. In one example, each of mpeg7:XXX, TransportProtocolDescriptorType, mhp:xxx and ipi:xxx may be used to specify an enumerated data type, as provided in ETSI TS 102 809. In these examples, the prefix (i.e., mhp, ipi, or mpeg7) describes a namespace for the corresponding element or attribute. A brief illustration of several of these datatypes follows below.
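The following is a brief illustration of several of the XML datatypes described above; the element and attribute names are hypothetical, chosen only to show the value formats:
    <!-- Hypothetical attribute values illustrating the datatypes above -->
    <Example
        stringAttr="caption service"
        unsignedByteAttr="255"
        booleanAttr="true"
        anyURIAttr="https://example.com/app/index.html"
        dateTimeAttr="2015-12-02T14:30:00"/>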
Further, in Table 2, a Request for Comments (RFC) refers to an RFC publication from the Internet Engineering Task Force (IETF), where each of the respective RFCs included in Table 2 is incorporated by reference in its respective entirety. In Table 2, Dotted-IPv4 refers to a dotted decimal notation of an Internet Protocol Version 4 address and EFDT refers to an electronic file delivery table. An electronic file delivery table (EFDT) may be used to deliver files over a unidirectional transport. HTTP Secure (HTTPS) is a protocol for secure communication. In one example, HTTPS may use a secure sockets layer (SSL) method. In another example, HTTPS may use a transport layer security (TLS) method.
Figure JPOXMLDOC01-appb-I000002
Figure JPOXMLDOC01-appb-I000003
Figure JPOXMLDOC01-appb-I000004
Figure JPOXMLDOC01-appb-I000005
Figure JPOXMLDOC01-appb-I000006
Figure JPOXMLDOC01-appb-I000007
Figure JPOXMLDOC01-appb-I000008
Figure JPOXMLDOC01-appb-I000009
Figure JPOXMLDOC01-appb-I000010
Figure JPOXMLDOC01-appb-I000011
Table 3 represents a generalized version of Table 2. That is, Table 3 does not explicitly specify each of the child elements and attributes for each of appName, applicationIdentifier, applicationDescriptor, applicationUsageDescriptor, applicationBoundaryDescriptor, applicationTransport, applicationLocation, and applicationSpecificDescriptor included in Table 2. It should be noted that Table 3 includes a first level of child elements under applicationDescriptor, i.e., type, controlCode, visibility, serviceBound, priority, version, mhpVersion, and storageCapabilities.
Figure JPOXMLDOC01-appb-I000012
Figure JPOXMLDOC01-appb-I000013
It should be noted that in the instances where techniques described herein are described with respect to a generalized application signaling table, such descriptions are provided for the sake of brevity and should not be construed to limit the techniques described herein. Further, it should be noted that the techniques described herein may be generally applicable to an application signaling table having additional extensions and constraints. That is, the techniques described herein may be generally applicable regardless of particular elements and attributes included in an application signaling table.
As described above, application data elements may enable a receiver device to determine if a particular application is supported by the receiver device. In some cases, in addition to determining if a particular application is supported by the receiver device, it may be useful for a receiver device to determine if a particular application is useful to and/or desired by a particular user operating a receiver device. For example, an application providing an alternative audio presentation may not be useful to a user with a hearing impairment or a user that is not fluent in the language of the alternative audio presentation. In some cases, it may be useful for a user to be able to set preferences (e.g., at a receiver device or server) with respect to applications in order to enable a receiver device to ensure that only applications that meet certain criteria are requested and/or executed. For example, a user with a disability may request that only applications that provide an adequate level of accessibility to the user are executed (e.g., only applications that are accessible to the visually impaired). Further, it may be useful for a service provider to classify applications based on accessibility. For example, it may be useful for a service provider to classify an application as: accessible to visually impaired, accessible to color blind, accessible to blind, accessible to low vision, accessible to hearing impaired, accessible to deaf, accessible to hard of hearing, accessible to mobility impaired, as enabling assistive technologies, and the like. The systems and techniques described herein enable accessibility information associated with an application associated with a video and/or audio presentation to be signalled. It should be noted that although the techniques described herein are described with respect to accessibility levels associated with disabilities, in some examples, the techniques described herein may be generally applicable to user preferences.
FIG. 2 is a block diagram illustrating an example of a system that may implement one or more techniques described in this disclosure. System 200 may be configured to communicate data in accordance with the techniques described herein. In the example illustrated in FIG. 2, system 200 includes one or more receiver devices 202A-202N, television service network 204, television service provider site 206, wide area network 212, one or more content provider sites 214A-214N, and one or more data provider sites 216A-216N. System 200 may include software modules. Software modules may be stored in a memory and executed by a processor. System 200 may include one or more processors and a plurality of internal and/or external memory devices. Examples of memory devices include file servers, file transfer protocol (FTP) servers, network attached storage (NAS) devices, local disk drives, or any other type of device or storage medium capable of storing data. Storage media may include Blu-ray discs, DVDs, CD-ROMs, magnetic disks, flash memory, or any other suitable digital storage media. When the techniques described herein are implemented partially in software, a device may store instructions for the software in a suitable, non-transitory computer-readable medium and execute the instructions in hardware using one or more processors.
System 200 represents an example of a system that may be configured to allow digital media content, such as, for example, a movie, a live sporting event, etc., and data and applications and multimedia presentations associated therewith (e.g., caption services), to be distributed to and accessed by a plurality of computing devices, such as receiver devices 202A-202N. In the example illustrated in FIG. 2, receiver devices 202A-202N may include any device configured to receive data from television service provider site 206. For example, receiver devices 202A-202N may be equipped for wired and/or wireless communications and may include televisions, including so-called smart televisions, set top boxes, and digital video recorders. Further, receiver devices 202A-202N may include desktop, laptop, or tablet computers, gaming consoles, mobile devices, including, for example, “smart” phones, cellular telephones, and personal gaming devices configured to receive data from television service provider site 206. It should be noted that although system 200 is illustrated as having distinct sites, such an illustration is for descriptive purposes and does not limit system 200 to a particular physical architecture. Functions of system 200 and sites included therein may be realized using any combination of hardware, firmware and/or software implementations.
Television service network 204 is an example of a network configured to enable digital media content, which may include television services, to be distributed. For example, television service network 204 may include public over-the-air television networks, public or subscription-based satellite television service provider networks, and public or subscription-based cable television provider networks and/or over the top or Internet service providers. It should be noted that although in some examples television service network 204 may primarily be used to enable television services to be provided, television service network 204 may also enable other types of data and services to be provided according to any combination of the telecommunication protocols described herein. Further, it should be noted that in some examples, television service network 204 may enable two-way communications between television service provider site 206 and one or more of receiver devices 202A-202N. Television service network 204 may comprise any combination of wireless and/or wired communication media. Television service network 204 may include coaxial cables, fiber optic cables, twisted pair cables, wireless transmitters and receivers, routers, switches, repeaters, base stations, or any other equipment that may be useful to facilitate communications between various devices and sites. Television service network 204 may operate according to a combination of one or more telecommunication protocols. Telecommunications protocols may include proprietary aspects and/or may include standardized telecommunication protocols. Examples of standardized telecommunications protocols include DVB standards, ATSC standards, ISDB standards, DTMB standards, DMB standards, Data Over Cable Service Interface Specification (DOCSIS) standards, HbbTV standards, W3C standards, and UPnP standards.
Referring again to FIG. 2, television service provider site 206 may be configured to distribute television service via television service network 204. For example, television service provider site 206 may include one or more broadcast stations, a cable television provider, a satellite television provider, or an Internet-based television provider. In the example illustrated in FIG. 2, television service provider site 206 includes service distribution engine 208 and database 210. Service distribution engine 208 may be configured to receive data, including, for example, multimedia content, interactive applications, and messages, and distribute data to receiver devices 202A-202N through television service network 204. For example, service distribution engine 208 may be configured to transmit television services according to aspects of one or more of the transmission standards described above (e.g., an ATSC standard). In one example, service distribution engine 208 may be configured to receive data from one or more sources. For example, television service provider site 206 may be configured to receive a transmission including television programming through a satellite uplink/downlink. Further, as illustrated in FIG. 2, television service provider site 206 may be in communication with wide area network 212 and may be configured to receive data from content provider sites 214A-214N and further receive data from data provider sites 216A-216N. It should be noted that in some examples, television service provider site 206 may include a television studio and content may originate therefrom.
Database 210 may include storage devices configured to store data including, for example, multimedia content and data associated therewith, including for example, descriptive data and executable interactive applications. For example, a sporting event may be associated with an interactive application that provides statistical updates. Data associated with multimedia content may be formatted according to a defined data format, such as, for example, HTML, Dynamic HTML, XML, and JSON, and may include URLs and URIs enabling receiver devices 202A-202N to access data, e.g., from one of data provider sites 216A-216N. In some examples, television service provider site 206 may be configured to provide access to stored multimedia content and distribute multimedia content to one or more of receiver devices 202A-202N through television service network 204. For example, multimedia content (e.g., music, movies, and television (TV) shows) stored in database 210 may be provided to a user via television service network 204 on a so-called on demand basis.
Wide area network 212 may include a packet based network and operate according to a combination of one or more telecommunication protocols. Telecommunications protocols may include proprietary aspects and/or may include standardized telecommunication protocols. Examples of standardized telecommunications protocols include Global System Mobile Communications (GSM) standards, code division multiple access (CDMA) standards, 3rd Generation Partnership Project (3GPP) standards, European Telecommunications Standards Institute (ETSI) standards, European standards (EN), IP standards, Wireless Application Protocol (WAP) standards, and Institute of Electrical and Electronics Engineers (IEEE) standards, such as, for example, one or more of the IEEE 802 standards (e.g., Wi-Fi). Wide area network 212 may comprise any combination of wireless and/or wired communication media. Wide area network 212 may include coaxial cables, fiber optic cables, twisted pair cables, Ethernet cables, wireless transmitters and receivers, routers, switches, repeaters, base stations, or any other equipment that may be useful to facilitate communications between various devices and sites. In one example, wide area network 212 may include the Internet.
Referring again to FIG. 2, content provider sites 214A-214N represent examples of sites that may provide multimedia content to television service provider site 206 and/or receiver devices 202A-202N. For example, a content provider site may include a studio having one or more studio content servers configured to provide multimedia files and/or streams to television service provider site 206. In one example, content provider sites 214A-214N may be configured to provide multimedia content using the IP suite. For example, a content provider site may be configured to provide multimedia content to a receiver device according to Real Time Streaming Protocol (RTSP), or HTTP.
Data provider sites 216A-216N may be configured to provide data, including hypertext based content, and the like, to one or more of receiver devices 202A-202N and/or television service provider site 206 through wide area network 212. A data provider site 216A-216N may include one or more web servers. Data provided by data provider sites 216A-216N may be defined according to data formats, such as, for example, HTML, Dynamic HTML, XML, and JSON. An example of a data provider site includes the United States Patent and Trademark Office website. It should be noted that in some examples, data provided by data provider sites 216A-216N may be utilized for so-called second screen applications. For example, companion device(s) in communication with a receiver device may display a website in conjunction with television programming being presented on the receiver device. It should be noted that data provided by data provider sites 216A-216N may include audio and video content. As described above, with respect to content delivery protocol model 100, data elements that describe applications may be delivered through HTTP. Thus, in one example, data provider sites 216A-216N may be configured to generate data or documents including applications and/or data elements that describe applications according to one or more of the techniques described herein.
As described above, service distribution engine 208 may be configured to receive data, including, for example, multimedia content, interactive applications, and messages, and distribute data to receiver devices 202A-202N through television service network 204. FIG. 3 is a block diagram illustrating an example of a service distribution engine that may implement one or more techniques of this disclosure. Service distribution engine 300 may be configured to receive data and output a signal representing that data for distribution over a communication network, e.g., television service network 204. For example, service distribution engine 300 may be configured to receive one or more data streams and output a signal that may be transmitted using a single radio frequency band (e.g., a 6 MHz channel, an 8 MHz channel, etc.) or a bonded channel (e.g., two separate 6 MHz channels). A data stream may generally refer to data encapsulated in a set of one or more data packets.
As illustrated in FIG. 3, service distribution engine 300 includes transport package generator 302, transport/network packet generator 304, link layer packet generator 306, frame builder and waveform generator 308, and system memory 310. Each of transport package generator 302, transport/network packet generator 304, link layer packet generator 306, frame builder and waveform generator 308, and system memory 310 may be interconnected (physically, communicatively, and/or operatively) for inter-component communications and may be implemented as any of a variety of suitable circuitry, such as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware or any combinations thereof. It should be noted that although service distribution engine 300 is illustrated as having distinct functional blocks, such an illustration is for descriptive purposes and does not limit service distribution engine 300 to a particular hardware architecture. Functions of service distribution engine 300 may be realized using any combination of hardware, firmware and/or software implementations.
System memory 310 may be described as a non-transitory or tangible computer-readable storage medium. In some examples, system memory 310 may provide temporary and/or long-term storage. In some examples, system memory 310 or portions thereof may be described as non-volatile memory and in other examples portions of system memory 310 may be described as volatile memory. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), and static random access memories (SRAM). Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. System memory 310 may be configured to store information that may be used by service distribution engine 300 during operation. It should be noted that system memory 310 may include individual memory elements included within each of transport package generator 302, transport/network packet generator 304, link layer packet generator 306, and frame builder and waveform generator 308. For example, system memory 310 may include one or more buffers (e.g., First-in First-out (FIFO) buffers) configured to store data for processing by a component of service distribution engine 300.
Transport package generator 302 may be configured to receive data and generate a transport package according to a defined application transport package structure. For example, transport package generator 302 may be configured to receive one or more segments of encoded video data and generate a package based on MMTP. Transport/network packet generator 304 may be configured to receive a transport package and encapsulate the transport package into corresponding transport layer packets (e.g., UDP, Transport Control Protocol (TCP), etc.) and network layer packets (e.g., IPv4, IPv6, compressed IP packets, etc.). As described above, with respect to content delivery protocol model 100, data elements that describe applications may be delivered through MMTP over UDP and IP, or through ROUTE over UDP and IP. Thus, in one example, transport package generator 302 may be configured to receive data or documents including applications and/or data elements that describe applications and generate a package or a similar data structure based on the received data according to one or more techniques of this disclosure.
Link layer packet generator 306 may be configured to receive network packets and generate packets according to a defined link layer packet structure (e.g., an ATSC 3.0 link layer packet structure). Frame builder and waveform generator 308 may be configured to receive one or more link layer packets and output symbols (e.g., OFDM symbols) arranged in a frame structure. As described above, a frame structure may include a bootstrap, a preamble, and a data payload including one or more PLPs. A frame may be referred to as a physical layer frame (PHY-Layer frame). A bootstrap may act as a universal entry point for a waveform. A preamble may include so-called Layer-1 signaling (L1-signaling). L1-signaling may provide the necessary information to configure physical layer parameters. Frame builder and waveform generator 308 may be configured to produce a signal for transmission within one or more types of RF channels: a single 6 MHz channel, a single 7 MHz channel, a single 8 MHz channel, a single 11 MHz channel, and bonded channels including any two or more separate single channels (e.g., a 14 MHz channel including a 6 MHz channel and an 8 MHz channel). Frame builder and waveform generator 308 may be configured to insert pilots and reserved tones for channel estimation and/or synchronization. In one example, pilots and reserved tones may be defined according to an orthogonal frequency division multiplexing (OFDM) symbol and sub-carrier frequency map. Frame builder and waveform generator 308 may be configured to generate an OFDM waveform by mapping OFDM symbols to sub-carriers. It should be noted that in some examples, frame builder and waveform generator 308 may be configured to support layer division multiplexing. Layer division multiplexing may refer to super-imposing multiple layers of data on the same RF channel (e.g., a 6 MHz channel). Typically, an upper layer refers to a core (e.g., more robust) layer supporting a primary media service and a lower layer refers to a high data rate layer supporting enhanced services. For example, an upper layer could support basic High Definition video content and a lower layer could support enhanced Ultra-High Definition video content.
As described above, data elements that describe applications may be included in a document defined according to an application service table. Table 4 provides an example of an application service table including data elements that describe accessibility of an application. It should be noted that similar to Table 3 above, Table 4 is a generalized application service table and includes a first layer of children elements under applicationDescriptor, i.e., type, controlCode, visibility, serviceBound, priority, version, applicationAccessibility, mhpVersion, and storageCapabilities. Thus, Table 4 represents an example of an application service table including data elements that describe accessibility of an application as part of the general properties of an application. It should be noted that in other examples, applicationAccessibility may be included as an element at the same level as appName, applicationIdentifier, applicationDescriptor, applicationUsageDescriptor, applicationBoundaryDescriptor, applicationTransport, applicationLocation, and applicationSpecificDescriptor.
Further, it should be noted that in Table 4, the @language attribute includes a language identifier as defined by Internet Engineering Task Force (IETF) Best Current Practice (BCP) 47. It should be noted that BCP is a persistent name for a series of IETF RFCs whose numbers change as they are updated. The latest RFC describing language tag syntax is RFC 5646, Tags for the Identification of Languages, which is incorporated by reference herein, and it obsoletes the older RFCs 4646, 3066, and 1766. In another example, a particular version of IETF BCP 47 (identified by month and year of publication) may be used for defining the values that the @language attribute may take on to facilitate lower receiver complexity. In yet another example, a particular RFC, such as RFC 5646, may be used instead of IETF BCP 47 for defining the values that the @language attribute may take on to facilitate lower receiver complexity. In RFC 5646, the length of the value of xml:lang is variable. As such, IETF BCP 47 may be used to represent a language of a caption service in a more efficient manner than ISO 639.2/B and ISO 8859-1.
(Table 4 is reproduced as images in the original publication.)
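For illustration, the following is a minimal sketch, in Python, of how a receiver might split a BCP 47 (RFC 5646) language tag, such as a value of the @language attribute described above, into its primary language, script, and region subtags. The helper name and the simplified subtag rules are assumptions for illustration only; a complete receiver would use a full RFC 5646 parser validated against the IANA language subtag registry.

# Sketch: split a BCP 47 / RFC 5646 language tag into subtags.
# Simplified rules: a 4-letter subtag is a script, a 2-letter or
# 3-digit subtag is a region; extensions and variants are ignored.
def split_language_tag(tag):
    subtags = tag.split("-")
    parsed = {"language": subtags[0].lower()}
    for subtag in subtags[1:]:
        if len(subtag) == 4 and subtag.isalpha():
            parsed["script"] = subtag.title()   # e.g., "Hant"
        elif len(subtag) == 2 and subtag.isalpha():
            parsed["region"] = subtag.upper()   # e.g., "US"
        elif len(subtag) == 3 and subtag.isdigit():
            parsed["region"] = subtag           # e.g., "419"
    return parsed

print(split_language_tag("en-US"))    # {'language': 'en', 'region': 'US'}
print(split_language_tag("zh-Hant"))  # {'language': 'zh', 'script': 'Hant'}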
As illustrated in Table 4, with respect to the description of applicationAccessibility, reference is made to Table 6 and Table 7. As described in further detail below, Table 6 represents an example where applicationAccessibility is based on defined levels of conformance and Table 7 represents an example where applicationAccessibility is based on defined groups. Accessibility levels of an application may be based on an established accessibility guideline. An example of an established accessibility guideline includes the “Web Content Accessibility Guidelines (WCAG) 2.0”, 11 December 2008, published by the W3C (hereinafter “WCAG 2.0”), which is incorporated by reference in its entirety. WCAG 2.0 includes a wide range of recommendations for making Web content more accessible. WCAG 2.0 anticipates that, when authors of Web content follow the guidelines included therein, content will be made more accessible to a wider range of people with disabilities, including blindness and low vision, deafness and hearing loss, learning disabilities, cognitive limitations, limited movement, speech disabilities, photosensitivity, and combinations thereof, and may further make content more usable to users in general.
WCAG 2.0 provides the following 12 basic guidelines:
Guideline 1.1 Text Alternatives: Provide text alternatives for any non-text content so that it can be changed into other forms people need, such as large print, braille, speech, symbols or simpler language.
Guideline 1.2 Time-based Media: Provide alternatives for time-based media.
Guideline 1.3 Adaptable: Create content that can be presented in different ways (for example simpler layout) without losing information or structure.
Guideline 1.4 Distinguishable: Make it easier for users to see and hear content including separating foreground from background.
Guideline 2.1 Keyboard Accessible: Make all functionality available from a keyboard.
Guideline 2.2 Enough Time: Provide users enough time to read and use content.
Guideline 2.3 Seizures: Do not design content in a way that is known to cause seizures.
Guideline 2.4 Navigable: Provide ways to help users navigate, find content, and determine where they are.
Guideline 3.1 Readable: Make text content readable and understandable.
Guideline 3.2 Predictable: Make Web pages appear and operate in predictable ways.
Guideline 3.3 Input Assistance: Help users avoid and correct mistakes.
Guideline 4.1 Compatible: Maximize compatibility with current and future user agents, including assistive technologies.
Further, WCAG 2.0 includes success criteria for each guideline. A guideline may be considered met when its success criteria are satisfied. WCAG 2.0 defines the following conformance levels for success criteria: A (lowest), AA, and AAA (highest). For example, for Guideline 2.4, a Level A conformance may occur when a web page satisfies a success criterion by having titles that describe topic or purpose, a Level AA conformance may occur when a web page, in addition to satisfying the success criteria associated with Level A, satisfies a success criterion by having headings and labels that describe topic or purpose, and a Level AAA conformance may occur when a web page, in addition to satisfying the success criteria associated with Level A and Level AA, satisfies a success criterion by using section headings to organize the content. It should be noted that for each of the guidelines and success criteria in WCAG 2.0, the working group has also documented a wide variety of informative techniques. The informative techniques fall into two categories: (1) those that are sufficient for meeting the success criteria and (2) those that are advisory. The advisory techniques typically go beyond what is required by the individual success criteria and allow authors to better address the guidelines. Some advisory techniques address accessibility barriers that are not covered by testable success criteria. Further, common known failures have also been documented by the W3C, for example in “Sufficient and Advisory Techniques in Understanding WCAG 2.0.” It should be noted that although the techniques described here are described with respect to WCAG 2.0, the techniques described herein are generally applicable to future versions of the Web Content Accessibility Guidelines published by the W3C. For example, the techniques described here are applicable to future Web Content Accessibility Guidelines including any combination of additional and/or redacted guidelines, success criteria, and conformance levels.
Table 5 is a table summarizing success criteria and levels of conformance for WCAG 2.0. It should be noted that the entries listed under the Level A, Level AA, and Level AAA columns only generally describe the success criteria; specific details of each success criterion can be found in the WCAG 2.0 document. In the example in Table 5, for Level A conformance, all the goals/success criteria identified under the Level A column must be met. For Level AA conformance, all the goals/success criteria identified under the Level A column and the Level AA column must be met. For Level AAA conformance, all the goals/success criteria identified under the Level A column, the Level AA column, and the Level AAA column must be met. In yet another example, conformance may be considered on a per-guideline basis. For example, for Guideline 2.4, respective success criteria are provided for each level of conformance. For example, Level AAA conformance occurs if the 2.4.X success criteria in the Level A and Level AA columns are satisfied as well as the 2.4.X success criteria in the Level AAA column.
(Table 5 is reproduced as an image in the original publication.)
Content, including an application enhancement, may be classified based on properties included in the content that are identified in Table 5. For example, an application may include the 25 properties included in the Level A column of Table 5, another application may additionally include the 13 properties included in the Level AA column of Table 5, and another application may additionally include the 23 properties included in the Level AAA column. In this manner, each application may be respectively described as having a Level A level of conformance (i.e., the application satisfies all the Level A success criteria), a Level AA level of conformance (i.e., the application satisfies all the Level A and Level AA success criteria), and a Level AAA level of conformance (i.e., the application satisfies all the Level A, Level AA, and Level AAA success criteria). Further, levels of conformance may be provided on a respective guideline basis. Further, in one example, guidelines may be grouped and conformance may be provided for each group. For example, one group may be related to media and another group may be related to text. In another example, one group may be related to the visually impaired, another group may be related to the hearing impaired, and another group may be related to the mobility impaired.
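For illustration, the following is a minimal sketch, in Python, of the cumulative conformance logic described above. The abbreviated criterion sets and the function name are hypothetical stand-ins; the full sets would contain the 25, 13, and 23 success criteria of the respective columns of Table 5.

# Sketch: derive a conformance level from the set of success criteria
# an application satisfies. The identifiers below are placeholders
# standing in for the full per-column criterion sets of Table 5.
LEVEL_A = {"1.1.1", "1.2.1", "2.1.1", "2.4.2"}    # 25 criteria in full
LEVEL_AA = {"1.2.4", "2.4.6"}                      # 13 criteria in full
LEVEL_AAA = {"1.2.6", "2.4.10"}                    # 23 criteria in full

def conformance_level(satisfied):
    # Conformance is cumulative: Level AA requires all Level A criteria,
    # and Level AAA requires all Level A and Level AA criteria.
    if not LEVEL_A <= satisfied:
        return None           # no conformance level reached
    if not LEVEL_AA <= satisfied:
        return "A"
    if not LEVEL_AAA <= satisfied:
        return "AA"
    return "AAA"

print(conformance_level(LEVEL_A | LEVEL_AA))       # AA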
It should be noted that in some examples, Table 5 may be modified to include fewer or more success criteria. Further, it should be noted that in some examples, a success criterion included in one column in Table 5 may be moved to another column. Such modifications may be useful in cases where applications providing enhancements associated with a primary service have more particular accessibility requirements than web content not associated with another service. Further, in one example, media may have associated accessibility metadata that may be signaled along with the media. For example, an audio track may be identified as intended for a descriptive video service, a subtitle track may be identified as captions, a video track may be identified as being for sign language, a video track may be identified as including burnt-in captions, or a video track may be identified as including sign language video (e.g., using picture-in-picture). In such an event, related guidelines in Table 5 may be ignored/removed when evaluating conformance.
In one example, applicationAccessibility may signal accessibility information based on the example levels of conformance illustrated in Table 5. Table 6 represents an example of an XML representation of applicationAccessibility that may be used to signal a level of accessibility based on the example levels of conformance illustrated in Table 5.
(Table 6 is reproduced as images in the original publication.)
Referring to Table 6, the attribute @name indicates the name of an established accessibility guideline. It should be noted that in some examples, an established accessibility guideline may include a guideline other than WCAG 2.0. As further illustrated in Table 6, the @Level attribute may be a string specifying one of Level A, Level AA, or Level AAA conformance. Further, in one example, the @Level attribute may signal a level of conformance using an unsignedByte. It should be noted that in the case where an unsignedByte is used, values may be provided for partial levels of conformance. For example, a value of 0 may indicate “A,” a value of 1 may indicate that 6 of the 13 criteria in Level AA are satisfied, i.e., “partial AA,” a value of 2 may indicate “AA,” and a value of 3 may indicate “AAA”. In the example where @Level is an unsignedByte, other partial levels of conformance may be indicated using the 256 values available to the unsignedByte data type. That is, any number of combinations of the 61 success criteria illustrated in Table 5 may be signalled as satisfied. As illustrated in Table 6, the Exception element enables success criteria that are not satisfied to be explicitly signalled. Thus, Table 6 provides multiple possible ways to signal whether success criteria included in Table 5 are satisfied. It should be noted that in examples where partial levels of conformance are signalled using the @Level attribute, the Exception element may be redundant and optional. In the example where @Level is a string, other partial levels of conformance may be indicated using a “0” or “1” corresponding to each of the 61 success criteria illustrated in Table 5, concatenated together, where “0” may indicate that it is unknown if the corresponding success criterion is met and “1” may indicate that the corresponding success criterion is met. In an alternative example, “0” may indicate the corresponding success criterion is not met. In an alternative example, a value “2” may also be used to form the concatenated string and indicate the corresponding success criterion is not met. It should be noted that an application may include multiple pages (e.g., multiple HTML pages). The PagesWithUnkownAccessibiltyLevel element in Table 6 enables pages with unknown levels of accessibility to be signaled. It should be noted that in other examples, any suitable data structure (e.g., a combination of binary values) may be used to communicate the accessibility information for one or more pages associated with an application.
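For illustration, the following is a minimal sketch, in Python, of the concatenated per-criterion string variant of @Level described above, using the convention in which “1” indicates a success criterion is met and “0” indicates it is unknown whether the criterion is met. The criterion ordering and the helper names are assumptions for illustration; a normative ordering would be fixed by the full set of 61 success criteria in Table 5.

# Sketch: encode/decode the per-criterion string variant of @Level,
# one character per success criterion in a fixed (assumed) ordering.
CRITERIA_ORDER = ["1.1.1", "1.2.1", "1.2.2"]   # ... 61 entries in full

def encode_level(satisfied):
    return "".join("1" if c in satisfied else "0" for c in CRITERIA_ORDER)

def decode_level(level_string):
    return {c for c, flag in zip(CRITERIA_ORDER, level_string) if flag == "1"}

s = encode_level({"1.1.1", "1.2.2"})
print(s)                  # 101
print(decode_level(s))    # {'1.1.1', '1.2.2'}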
FIG. 4A is a computer program listing illustrating an example of signalling a level of accessibility according to one or more techniques of this disclosure. In the example illustrated in FIG. 4A, an application signaling table specifies that a corresponding application has a level of accessibility corresponding to a Level AA level of conformance, as provided in Table 5, with the exception that the application does not satisfy the 2.1.1 Keyboard criterion or the 2.1.2 No Keyboard Trap criterion, as provided in WCAG 2.0. As described in detail below, a receiver device may be configured to compare the accessibility metadata (e.g., level of accessibility and exceptions) with preferences set by the user and may take some action (e.g., block an application from running, allow an application to run, etc.). With respect to the specific accessibility illustrated in the example of FIG. 4A, it may be acceptable for a fixed television receiver to run the application for the visually impaired or hearing impaired because it lacks a keyboard, but it may not be acceptable for a tablet device having a virtual keyboard to run the application. It should be noted that in some examples, the particular parent-child level structure in Table 6 may be modified (i.e., elements or attributes may be moved up or down a level). Further, in one example, one or more elements listed in Table 6 may instead be attributes, and one or more attributes listed in Table 6 above may instead be elements. Further, as described above, in one example, conformance may be indicated on a guideline-by-guideline basis. In this case, an @Guideline attribute may be included in Table 6 to specify a particular guideline. FIG. 4B is a computer program listing illustrating an example of signalling a level of accessibility according to one or more techniques of this disclosure. In the example illustrated in FIG. 4B, an @Guideline attribute is included in the document.
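Since FIG. 4A is reproduced only as a drawing, the following is a hypothetical Python sketch of what such a listing and its parsing might look like, using the @name, @Level, and Exception names described above. The exact XML shape, casing, and namespaces of the published schema are assumptions here, not a reproduction of FIG. 4A.

import xml.etree.ElementTree as ET

# Hypothetical applicationAccessibility instance in the spirit of
# FIG. 4A: Level AA conformance with the keyboard-related success
# criteria signalled as exceptions.
doc = """
<applicationAccessibility name="WCAG 2.0" Level="AA">
  <Exception>2.1.1 Keyboard</Exception>
  <Exception>2.1.2 No Keyboard Trap</Exception>
</applicationAccessibility>
"""

root = ET.fromstring(doc)
print(root.get("Level"))                           # AA
print([e.text for e in root.findall("Exception")])
# ['2.1.1 Keyboard', '2.1.2 No Keyboard Trap']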
As described above, it may be useful for a service provider to classify an application as: accessible to visually impaired, accessible to color blind, accessible to blind, accessible to low vision, accessible to hearing impaired, accessible to deaf, accessible to hard of hearing, accessible to mobility impaired, and/or enables assistive technologies. Each of visually impaired, color blind, blind, low vision, hearing impaired, deaf, hard of hearing, mobility impaired, and requiring assistive technologies may be referred to as a defined group. In one example, accessibility information may be signalled based on defined groups. Table 7 represents an example of an XML representation of applicationAccessibility that may be used to signal accessibility information based on defined groups. It should be noted that in other examples, Table 7 may include fewer or more defined groups.
(Table 7 is reproduced as images in the original publication.)
Referring to Table 7, each of the elements VisuallyImpairedAccessible, HearingImpairedAccessible, and MobilityImpairedAccessible may be associated with an unsignedByte value indicating one of the following:
0: application is not accessible to the defined group;
1: application is accessible to the defined group; or
2: it is unknown if the application is accessible to the defined group.
Further, in one example, the default value inferred by a receiver when an element is not present may be 0. In one example, the default value inferred by a receiver when an element is not present may be 1. In one example, the default value inferred by a receiver when an element is not present may be 2. In a manner similar to that described above with respect to Table 6, accessibility for a defined group may be based on success criteria. For example, for VisuallyImpairedAccessible to indicate an application is accessible, one or more success criteria provided in Guideline 1.1 Text Alternatives in WCAG 2.0 may be required to be satisfied. It should be noted that accessibility with respect to a particular group may be based on any number of combinations of success criteria, including success criteria from multiple guidelines. In this manner, a device configured to transmit an application signaling table including one or more elements included in either Table 6 or Table 7 represents an example of a device configured to signal accessibility characteristics associated with an application associated with a video and/or audio presentation.
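For illustration, the following is a minimal Python sketch of the defined-group variant of Table 7, including the inference of a configurable default value when an element is absent. The XML shape and the helper names are assumptions for illustration.

import xml.etree.ElementTree as ET

# Hypothetical defined-group signaling per Table 7: 0 = not accessible,
# 1 = accessible, 2 = unknown; an absent element takes a default value.
doc = """
<applicationAccessibility>
  <VisuallyImpairedAccessible>1</VisuallyImpairedAccessible>
  <HearingImpairedAccessible>0</HearingImpairedAccessible>
</applicationAccessibility>
"""

GROUPS = ["VisuallyImpairedAccessible",
          "HearingImpairedAccessible",
          "MobilityImpairedAccessible"]

def group_accessibility(root, default=2):
    result = {}
    for group in GROUPS:
        element = root.find(group)
        result[group] = int(element.text) if element is not None else default
    return result

print(group_accessibility(ET.fromstring(doc)))
# {'VisuallyImpairedAccessible': 1, 'HearingImpairedAccessible': 0,
#  'MobilityImpairedAccessible': 2}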
In one example, the different application accessibility signaling (e.g., Table 6, Table 7) and the variants described herein may be included in metadata describing another file containing an application. For example, an application may be included in a future version of an International Organization for Standardization (ISO) Base Media File Format (BMFF) file (ISO/IEC 14496-12, ISO Base Media File Format) and the application accessibility signaling may be included in a Media Presentation Description (MPD). The MPD is described in ISO/IEC 23009-1:2014, “Information technology -- Dynamic adaptive streaming over HTTP (DASH) -- Part 1: Media presentation description and segment formats”, incorporated by reference herein in its entirety.
FIG. 5 is a block diagram illustrating an example of a receiver device that may implement one or more techniques of this disclosure. That is, receiver device 500 may be configured to parse a signal based on the semantics described above with respect to Table 6 or Table 7 and execute applications based on the parsed signal. Receiver device 500 is an example of a computing device that may be configured to receive data from a communications network and allow a user to access multimedia content. In the example illustrated in FIG. 5, receiver device 500 is configured to receive data via a television network, such as, for example, television service network 204 described above. Further, in the example illustrated in FIG. 5, receiver device 500 is configured to send and receive data via a wide area network. It should be noted that in other examples, receiver device 500 may be configured to simply receive data through television service network 204. The techniques described herein may be utilized by devices configured to communicate using any and all combinations of communications networks.
As illustrated in FIG. 5, receiver device 500 includes central processing unit(s) 502, system memory 504, system interface 510, data extractor 512, audio decoder 514, audio output system 516, video decoder 518, display system 520, I/O device(s) 522, and network interface 524. As illustrated in FIG. 5, system memory 504 includes operating system 506 and applications 508. Each of central processing unit(s) 502, system memory 504, system interface 510, data extractor 512, audio decoder 514, audio output system 516, video decoder 518, display system 520, I/O device(s) 522, and network interface 524 may be interconnected (physically, communicatively, and/or operatively) for inter-component communications and may be implemented as any of a variety of suitable circuitry, such as one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), discrete logic, software, hardware, firmware or any combinations thereof. It should be noted that although receiver device 500 is illustrated as having distinct functional blocks, such an illustration is for descriptive purposes and does not limit receiver device 500 to a particular hardware architecture. Functions of receiver device 500 may be realized using any combination of hardware, firmware and/or software implementations.
CPU(s) 502 may be configured to implement functionality and/or process instructions for execution in receiver device 500. CPU(s) 502 may include single and/or multi-core central processing units. CPU(s) 502 may be capable of retrieving and processing instructions, code, and/or data structures for implementing one or more of the techniques described herein. Instructions may be stored on a computer readable medium, such as system memory 504.
System memory 504 may be described as a non-transitory or tangible computer-readable storage medium. In some examples, system memory 504 may provide temporary and/or long-term storage. In some examples, system memory 504 or portions thereof may be described as non-volatile memory and in other examples portions of system memory 504 may be described as volatile memory. System memory 504 may be configured to store information that may be used by receiver device 500 during operation, including for example user preferences. System memory 504 may be used to store program instructions for execution by CPU(s) 502 and may be used by programs running on receiver device 500 to temporarily store information during program execution. Further, in the example where receiver device 500 is included as part of a digital video recorder, system memory 504 may be configured to store numerous video files.
Applications 508 may include applications implemented within or executed by receiver device 500 and may be implemented or contained within, operable by, executed by, and/or be operatively/communicatively coupled to components of receiver device 500. Applications 508 may include instructions that may cause CPU(s) 502 of receiver device 500 to perform particular functions. Applications 508 may include algorithms which are expressed in computer programming statements, such as, for-loops, while-loops, if-statements, do-loops, etc. Applications 508 may be developed using a specified programming language. Examples of programming languages include, JavaTM, JiniTM, C, C++, Objective C, Swift, Perl, Python, PhP, UNIX Shell, Visual Basic, and Visual Basic Script. In the example where receiver device 500 includes a smart television, applications may be developed by a television manufacturer or a broadcaster. As illustrated in FIG. 5, applications 508 may execute in conjunction with operating system 506. That is, operating system 506 may be configured to facilitate the interaction of applications 508 with CPUs(s) 502, and other hardware components of receiver device 500. Operating system 506 may be an operating system designed to be installed on set-top boxes, digital video recorders, televisions, and the like. It should be noted that techniques described herein may be utilized by devices configured to operate using any and all combinations of software architectures.
System interface 510 may be configured to enable communications between components of receiver device 500. In one example, system interface 510 comprises structures that enable data to be transferred from one peer device to another peer device or to a storage medium. For example, system interface 510 may include a chipset supporting Accelerated Graphics Port (AGP) based protocols, Peripheral Component Interconnect (PCI) bus based protocols, such as, for example, the PCI ExpressTM (PCIe) bus specification, which is maintained by the Peripheral Component Interconnect Special Interest Group, or any other form of structure that may be used to interconnect peer devices (e.g., proprietary bus protocols).
As described above, receiver device 500 is configured to receive and, optionally, send data via a television service network. As described above, a television service network may operate according to a telecommunications standard. A telecommunications standard may define communication properties (e.g., protocol layers), such as, for example, physical signaling, addressing, channel access control, packet properties, and data processing. In the example illustrated in FIG. 5, data extractor 512 may be configured to extract video, audio, and data from a signal. A signal may be defined according to, for example, aspects of DVB standards, ATSC standards, ISDB standards, DTMB standards, DMB standards, and DOCSIS standards.
Data extractor 512 may be configured to extract video, audio, and data, from a signal generated by service distribution engine 300 described above. That is, data extractor 512 may operate in a reciprocal manner to service distribution engine 300. Further, data extractor 512 may be configured to parse link layer packets based on any combination of one or more of the structures described above.
Data packets may be processed by CPU(s) 502, audio decoder 514, and video decoder 518. Audio decoder 514 may be configured to receive and process audio packets. For example, audio decoder 514 may include a combination of hardware and software configured to implement aspects of an audio codec. That is, audio decoder 514 may be configured to receive audio packets and provide audio data to audio output system 516 for rendering. Audio data may be coded using multi-channel formats such as those developed by Dolby and Digital Theater Systems. Audio data may be coded using an audio compression format. Examples of audio compression formats include Motion Picture Experts Group (MPEG) formats, Advanced Audio Coding (AAC) formats, DTS-HD formats, and Dolby Digital (AC-3) formats. Audio output system 516 may be configured to render audio data. For example, audio output system 516 may include an audio processor, a digital-to-analog converter, an amplifier, and a speaker system. A speaker system may include any of a variety of speaker systems, such as headphones, an integrated stereo speaker system, a multi-speaker system, or a surround sound system.
Video decoder 518 may be configured to receive and process video packets. For example, video decoder 518 may include a combination of hardware and software used to implement aspects of a video codec. In one example, video decoder 518 may be configured to decode video data encoded according to any number of video compression standards, such as ITU-T H.262 or ISO/IEC MPEG-2 Visual, ISO/IEC MPEG-4 Visual, ITU-T H.264 (also known as ISO/IEC MPEG-4 Advanced Video Coding (AVC)), and High-Efficiency Video Coding (HEVC). Display system 520 may be configured to retrieve and process video data for display. For example, display system 520 may receive pixel data from video decoder 518 and output data for visual presentation. Further, display system 520 may be configured to output graphics in conjunction with video data, e.g., graphical user interfaces. Display system 520 may comprise one of a variety of display devices such as a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device capable of presenting video data to a user. A display device may be configured to display standard definition content, high definition content, or ultra-high definition content.
I/O device(s) 522 may be configured to receive input and provide output during operation of receiver device 500. That is, I/O device(s) 522 may enable a user to select multimedia content to be rendered. Input may be generated from an input device, such as, for example, a push-button remote control, a device including a touch-sensitive screen, a motion-based input device, an audio-based input device, or any other type of device configured to receive user input. I/O device(s) 522 may be operatively coupled to receiver device 500 using a standardized communication protocol, such as for example, Universal Serial Bus protocol (USB), Bluetooth, ZigBee or a proprietary communications protocol, such as, for example, a proprietary infrared communications protocol.
Network interface 524 may be configured to enable receiver device 500 to send and receive data via a local area network and/or a wide area network. Network interface 524 may include a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device configured to send and receive information. Network interface 524 may be configured to perform physical signaling, addressing, and channel access control according to the physical and Media Access Control (MAC) layers utilized in a network. Receiver device 500 may be configured to parse a signal generated according to any of the techniques described above. Receiver device 500 may be configured to receive and parse any of the application signaling tables described above and further store user preferences, including for example accessibility preferences. Receiver device 500 may further cause an action associated with an application to occur based on accessibility information included in an application signaling table and user preference information. In this manner, receiver device 500 represents an example of a device configured to parse a syntax element indicating accessibility of the application, and perform an action based on the parsed syntax element.
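For illustration, the following is a minimal Python sketch of such preference matching. The preference keys and the blocking policy are hypothetical; an actual receiver may instead prompt the user, hide the application, or annotate it rather than block it outright.

# Sketch: compare parsed accessibility metadata against stored user
# preferences and decide whether to run the application.
LEVEL_RANK = {"A": 1, "AA": 2, "AAA": 3}

def should_run(signalled_level, exceptions, preferences):
    required = preferences.get("minimum_level")
    if required and LEVEL_RANK[signalled_level] < LEVEL_RANK[required]:
        return False
    # Block if a criterion the user relies on is signalled as an exception.
    if preferences.get("required_criteria", set()) & set(exceptions):
        return False
    return True

prefs = {"minimum_level": "AA", "required_criteria": {"2.1.1"}}
print(should_run("AA", {"2.1.1", "2.1.2"}, prefs))  # False (keyboard needed)
print(should_run("AA", set(), prefs))               # True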
In one or more examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media which is non-transitory or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but are instead directed to non-transitory, tangible storage media. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor,” as used herein may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described herein. In addition, in some aspects, the functionality described herein may be provided within dedicated hardware and/or software modules configured for encoding and decoding, or incorporated in a combined codec. Also, the techniques could be fully implemented in one or more circuits or logic elements.
The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a codec hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
Moreover, each functional block or various features of the base station device and the terminal device used in each of the aforementioned embodiments may be implemented or executed by circuitry, which is typically an integrated circuit or a plurality of integrated circuits. The circuitry designed to execute the functions described in the present specification may comprise a general-purpose processor, a digital signal processor (DSP), an application specific or general application integrated circuit (ASIC), a field programmable gate array (FPGA), or other programmable logic devices, discrete gates or transistor logic, or a discrete hardware component, or a combination thereof. The general-purpose processor may be a microprocessor, or alternatively, the processor may be a conventional processor, a controller, a microcontroller, or a state machine. The general-purpose processor or each circuit described above may be configured by a digital circuit or may be configured by an analogue circuit. Further, if an integrated circuit technology that supersedes present-day integrated circuits appears as a result of advances in semiconductor technology, an integrated circuit produced by that technology may also be used.
Various examples have been described. These and other examples are within the scope of the following claims.
According to one example of the disclosure, a method for signalling information associated with an application associated with a video or audio service, comprises signalling a syntax element indicating a level of conformance associated with the application, and signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
A method for signalling information associated with an application associated with a video or audio service comprises, for one or more defined groups, signalling a respective syntax element indicating accessibility information of the application with respect to the defined groups.
According to another example of the disclosure, a device for signalling information associated with an application associated with a video or audio service comprises one or more processors configured to signal a syntax element indicating a level of conformance associated with the application, and signal zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
According to another example of the disclosure, a device for signalling information associated with an application associated with a video or audio service comprises one or more processors configured to, for one or more defined groups, signal a respective syntax element indicating accessibility information of the application with respect to the defined groups.
According to another example of the disclosure, an apparatus for signalling information associated with an application associated with a video or audio service comprises means for signalling a syntax element indicating a level of conformance associated with the application, and means for signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
According to another example of the disclosure, an apparatus for signalling information associated with an application associated with a video or audio service comprises means for signalling, for one or more defined groups, a respective syntax element indicating accessibility information of the application with respect to the defined groups.
According to another example of the disclosure, a non-transitory computer-readable storage medium comprises instructions stored thereon that upon execution cause one or more processors of a device to signal a syntax element indicating a level of conformance associated with the application, and signal zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
According to another example of the disclosure, a non-transitory computer-readable storage medium comprises instructions stored thereon that upon execution cause one or more processors of a device to, for one or more defined groups, signal a respective syntax element indicating accessibility information of the application with respect to the defined groups.
According to one example of the disclosure, a method for parsing information associated with an application associated with a video or audio service, comprises parsing a syntax element indicating accessibility of the application, and performing an action based on the parsed syntax element.
According to another example of the disclosure, a device for parsing information associated with an application associated with a video or audio service comprises one or more processors configured to parse a syntax element indicating accessibility of the application, and perform an action based on the parsed syntax element.
According to another example of the disclosure, an apparatus for parsing information associated with an application associated with a video or audio service comprises means for parsing a syntax element indicating accessibility of the application, and means for performing an action based on the parsed syntax element.
According to another example of the disclosure, a non-transitory computer-readable storage medium comprises instructions stored thereon that upon execution cause one or more processors of a device to parse a syntax element indicating accessibility of the application, and perform an action based on the parsed syntax element.

Claims (29)

  1. A method for signalling information associated with an application associated with a video or audio service, the method comprising:
    signalling a syntax element indicating a level of conformance associated with the application, and
    signalling zero or more syntax elements identifying success criteria included in the level of conformance that the application does not satisfy.
  2. The method of claim 1, wherein signalling a syntax element indicating a level of conformance includes signalling one of a first level of conformance, a second level of conformance, and a third level of conformance.
  3. The method of claim 2, wherein each of the first level of conformance, the second level of conformance, and the third level of conformance include a set of success criteria.
  4. The method of claim 3, wherein the third level of conformance includes the set of success criteria included in the second level of conformance, and the second level of conformance includes the set of success criteria included in the first level of conformance.
  5. The method of any of claims 1-4, wherein a level of conformance is based on an accessibility guideline.
  6. The method of claim 5, wherein an accessibility guideline includes a web content accessibility guideline.
  7. The method of any of claims 1-6, wherein signalling a syntax element indicating a level of conformance associated with the application includes signalling the syntax element in an application signaling table.
  8. The method of claim 7, wherein signaling an application signaling table includes transmitting a signaling object file.
  9. The method of claim 8, wherein a signaling object file includes an eXtensible Markup Language document.
  10. A method for signalling information associated with an application associated with a video or audio service, the method comprising:
    for one or more defined groups, signalling a respective syntax element indicating accessibility information of the application with respect to the defined groups.
  11. The method of claim 10, wherein a syntax element indicates one of: the application is not accessible to the defined group, the application is accessible to the defined group, or it is unknown whether the application is accessible to the defined group.
  12. The method of any of claims 10 or 11, wherein the one or more defined groups are selected from the following defined groups: visually impaired, color blind, blind, low vision, hearing impaired, deaf, hard of hearing, mobility impaired, and requiring assistive technology.
  13. The method of any of claims 10 or 11, wherein the one or more defined groups include visually impaired, hearing impaired, and mobility impaired.
  14. The method of any of claims 10-13, wherein accessibility is based on satisfying a set of success criteria associated with a defined group.
  15. The method of claim 14, wherein a set of success criteria is based on an accessibility guideline.
  16. The method of claim 15, wherein an accessibility guideline includes a web content accessibility guideline.
  17. The method of any of claims 10-16, wherein signalling a respective syntax element indicating accessibility information of the application with respect to the defined groups includes signalling the syntax element in an application signaling table.
  18. The method of claim 17, wherein signaling an application signaling table includes transmitting a signaling object file.
  19. The method of claim 18, wherein a signaling object file includes an eXtensible Markup Language document.
  20. A device for signaling information associated with an application associated with a video or audio service, the device comprising one or more processors configured to perform any and all combinations of the steps included in claims 1-19.
  21. The device of claim 20, wherein the device includes a service distribution engine.
  22. The device of claim 20, wherein the device includes a server.
  23. A device for parsing information associated with an application associated with a video or audio service, the device comprising one or more processors configured to parse a signal generated according to any and all combinations of the steps included in claims 1-19.
  24. The device of claim 23, wherein the one or more processors are further configured to perform an action with respect to the application based on parsed information associated with the application and a user preference.
  25. The device of claim 24, wherein performing an action with respect to the application based on parsed information associated with the application and a user preference includes running the application if the application satisfies accessibility criteria provided by a user.
  26. The device of any of claims 23-25, wherein the device is selected from the group consisting of: a desktop or laptop computer, a mobile device, a smartphone, a cellular telephone, a personal data assistant (PDA), a television, a tablet device, and a personal gaming device.
  27. A system comprising:
    one of: the device of claim 20, the device of claim 21, or the device of claim 22; and
    one of: the device of claim 23, the device of claim 24, the device of claim 25, or the device of claim 26.
  28. An apparatus for signaling information associated with an application associated with a video or audio service, the apparatus comprising means for performing any and all combinations of the steps included in claims 1-19.
  29. A non-transitory computer-readable storage medium having instructions stored thereon that upon execution cause one or more processors of a device to perform any and all combinations of the steps included in claims 1-19.
PCT/JP2016/085118 2015-12-02 2016-11-28 Systems and methods for signalling application accessibility WO2017094645A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562262326P 2015-12-02 2015-12-02
US62/262,326 2015-12-02

Publications (1)

Publication Number Publication Date
WO2017094645A1 2017-06-08

Family

ID=58796674

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/085118 WO2017094645A1 (en) 2015-12-02 2016-11-28 Systems and methods for signalling application accessibility

Country Status (1)

Country Link
WO (1) WO2017094645A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012244404A (en) * 2011-05-19 2012-12-10 Nippon Hoso Kyokai <NHK> Cooperative broadcast and communication receiver device and server
JP2013242704A (en) * 2012-05-21 2013-12-05 Nippon Hoso Kyokai <NHK> Receiver, reception system and reception program
WO2015076178A1 (en) * 2013-11-21 2015-05-28 Sharp Kabushiki Kaisha Web application access control system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111919452A (en) * 2018-03-26 2020-11-10 Sharp Kabushiki Kaisha System and method for signaling camera parameter information

Similar Documents

Publication Publication Date Title
TWI708507B (en) Systems and methods for link layer signaling of upper layer information
US11025940B2 (en) Method for signalling caption asset information and device for signalling caption asset information
US11615778B2 (en) Method for receiving emergency information, method for signaling emergency information, and receiver for receiving emergency information
US10506302B2 (en) Method for signaling opaque user data
US11722750B2 (en) Systems and methods for communicating user settings in conjunction with execution of an application
WO2018030133A1 (en) Systems and methods for signaling of emergency alert messages
CA3021659C (en) Systems and methods for signaling of emergency alerts
TW201826806A (en) Systems and methods for signaling of emergency alert messages
WO2017183403A1 (en) Systems and methods for signaling of an identifier of a data channel
US20180109577A1 (en) Systems and methods for enabling communications associated with digital media distribution
CA2978534C (en) Systems and methods for content information message exchange
WO2017094645A1 (en) Systems and methods for signalling application accessibility
KR102151595B1 (en) Systems and methods for signaling emergency alert messages
WO2017213234A1 (en) Systems and methods for signaling of information associated with a visual language presentation
JP7073353B2 (en) Systems and methods to enable communication associated with digital media distribution

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 16870582
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 16870582
    Country of ref document: EP
    Kind code of ref document: A1