WO2008061372A1 - Digital signage system with wireless displays - Google Patents

Digital signage system with wireless displays

Info

Publication number
WO2008061372A1
Authority
WO
WIPO (PCT)
Prior art keywords
player
site controller
players
frame
content
Prior art date
Application number
PCT/CA2007/002116
Other languages
English (en)
Inventor
Marc Boscher
Original Assignee
Digicharm Communications Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Digicharm Communications Inc. filed Critical Digicharm Communications Inc.
Publication of WO2008061372A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 - Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/414 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41415 - Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/488 - Data services, e.g. news ticker
    • H04N21/4882 - Data services, e.g. news ticker for displaying messages, e.g. warnings, reminders
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 - End-user applications
    • H04N21/488 - Data services, e.g. news ticker
    • H04N21/4886 - Data services, e.g. news ticker for displaying a ticker, e.g. scrolling banner for news, stock exchange, weather data
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 - Monomedia components thereof
    • H04N21/812 - Monomedia components thereof involving advertisement data
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 - Specific applications
    • G09G2380/06 - Remotely controlled electronic signs other than labels

Definitions

  • the present invention relates to devices for displaying graphical content and, in particular, to large scale Digital Signage systems.
  • Conventional Digital Signage (DS) systems include large, full-color off-the-shelf displays capable of displaying visual content received from a local server or the Internet.
  • Conventional Digital Signage displays support most industry-standard content formats, such as MPEG, QuickTime, Flash, Web Pages, PAL, NTSC, and HDTV, decoding and adapting the content at playback time, which requires additional processing power.
  • the high cost of these large high-quality displays requires significant initial deployment investment and may prevent large scale deployments.
  • An object of the present invention is to overcome the shortcomings of the prior art and provide a cost-effective scalable DS system having multiple wireless displays.
  • the present invention relates to a system for rendering and displaying a plurality of frame sequences, the system comprising a core system comprising: a core instruction memory storing a core system instruction set for converting images into the plurality of frame sequences; a descriptor memory for storing one or more device descriptors; and a core system processing means for executing the core system instruction set and forming the plurality of frame sequences.
  • the system further comprises a plurality of RF transmitters, each comprising first communication means for communicating with the core system over a fiber, cable or wire link; and second communication means for wirelessly transmitting the plurality of frame sequences to a plurality of players; each player comprising: an RF receiver for receiving one of the plurality of frame sequences; a buffer for storing said frame sequence; and a display for displaying said frame sequence; wherein each player is associated with one of the one or more device descriptors; and each of the plurality of frame sequences received by said player has a format dependent on parameters of said device descriptor; so that said player does not perform any step of: resizing, padding, color conversion, and gamma-correction on said frame sequence.
  • Another aspect of the present invention relates to a system comprising memory storing design instructions for providing a dynamic image file including one or more placeholder fields for receiving values, query instructions for providing the values for the one or more placeholder fields, and a final converter instructions for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor, as well as processing means for executing these instructions.
  • Another aspect of the present invention relates to a system comprising a plurality of players, wherein each of the players comprises an IEEE 802.15.4 receiver, processing means consisting of an 8-bit microcontroller, and an OLED display with a resolution not higher than 200 x 100.
  • Figure 1 is a block scheme of the method of creating and displaying graphical content in accordance with the instant invention
  • Figure 2 is a schematic illustration of data transformations within one frame
  • Figure 3 is an illustration to a rendering process
  • Figure 4 is a block scheme of a system for creating and displaying a plurality of frame sequences in accordance with the instant invention
  • Figure 5 illustrates components of a core system instruction set
  • Figure 6 is a schematic representation of a distributed system in accordance with the instant invention.
  • FIG. 7 is a schematic representation of WLAN
  • FIG. 8 is a schematic representation of the gateway 320
  • FIG. 9 is a schematic diagram of communication protocols used in WLAN
  • Figure 10 is a block schema of a method of displaying content at a player in accordance with one embodiment of the instant invention.
  • Table 1 is a representation of the rendering process.
  • One application of the instant invention is a retail DS system for presenting customers with product related advertisement and price information, including at least 40 content players. Typically, it is a multi-store system having as many as 20,000 store shelf displays, or players, at a single retail location. This system will be used for illustration purposes throughout the instant application, however the invention is not limited to such systems.
  • FIG. 1 presents method steps
  • FIG. 2 illustrates transformations of data within one frame.
  • in a content choosing step 110, graphical content for displaying at a DS player is chosen in an iterative process which includes selecting variable content and permanent content, and displaying a resulting image for evaluation.
  • the permanent content includes such assets as images, video, animation, and text, as well as content meant for another media, such as television or the Internet.
  • the permanent content at least in part, is chosen from an asset store within the DS system or is imported.
  • the permanent graphic content or its part is created using any editor for the particular system, for example by a professional graphic artist.
  • the variable content includes dynamic variables, such as a product price.
  • the dynamic variables can be associated with products, product categories, customers, or entire networks.
  • a data source for providing value(s) for a dynamic variable, for example a retail backend database, may be identified at this step.
  • a dynamic variable can be parameterized by specifying its data type, maximum length, or dimensions of the space available on display.
  • a resulting frame or frame sequence, combining the permanent content and the variable content, is then displayed to simulate, as closely as possible, the experience of a user viewing the DS player, wherein default values are displayed for dynamic variables that are not yet known.
  • a device descriptor associated with the DS player is used at this step so that the frame or frame sequence is created and optimized specifically for the particular player model; the process of creating the frame sequence ready for viewing is described in more detail later in reference to steps 120 and 140.
  • the device descriptor includes a set of parameters related to a target player, such as display resolution, color depth, physical size, storage capacity, color palette, and supported functionalities. In the instance of two players having different parameters, more than one device descriptor may be employed.
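  • By way of illustration only (not taken from the patent text), such a device descriptor could be modelled as a small record; the Python sketch below uses assumed field names and example values:

      # Minimal sketch of a device descriptor; field names and values are assumptions.
      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class DeviceDescriptor:
          model: str                                   # target player model
          width: int                                   # display resolution in pixels
          height: int
          color_depth_bits: int                        # e.g. 1, 2, 4 or 8 bits per pixel
          palette: List[Tuple[int, int, int]] = field(default_factory=list)  # indexed RGB palette
          storage_bytes: int = 0                       # player storage capacity
          diagonal_inches: float = 0.0                 # physical size

      # Example: a 128 x 64 display with a 16-shade gray palette, as mentioned later in the text.
      oled = DeviceDescriptor(model="example-oled", width=128, height=64, color_depth_bits=4,
                              palette=[(17 * i, 17 * i, 17 * i) for i in range(16)],
                              storage_bytes=1 << 20, diagonal_inches=2.7)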
  • the chosen assets are preferably adapted to the right color depth and resolution, wherein the conversion can be optimized using visual tools to provide the best output.
  • One or more project files result from the content choosing step 110.
  • the project files include the chosen assets, or references to them, the chosen dynamic variables, and, optionally, the device descriptor and additional effects and transitions.
  • some of the chosen assets within the project file are adapted using the device descriptor as described hereinbefore.
  • in a dynamic image file creating step 120, all the frame data which is available, such as the permanent content and the additional effects and transitions, is rendered, or converted, into a format that can be played back by a player without decoding, also referred to herein as a player-specific format or raw data format.
  • the dynamic content is stored as placeholders with information on which dynamic variables to use and how to format them. Dependent on the parameters in the device descriptor, this partial formatting may significantly reduce content size by removing unused asset data.
  • the project file is transformed into a dynamic image file (DIG) including one or more placeholder fields for receiving values.
  • the dynamic image file is structured so to contain various information blocks describing the content and its location within a frame.
  • the DIG Format is structured in that it contains various information blocks describing the content and its frame timing.
  • any frame data within a DIG file is player-model specific.
  • raw frame data can be described as a sequence of bytes that can be sent directly from the player's storage unit to the video controller of the display without analysis or processing. More specifically, the data itself is: a set of color indexes representing pixels, a sequencing order for these indexes, and a way of packing this sequence of indexes into bytes, where the color indexes, sequencing order, and packing method are defined by and are specific to the display or its video controller.
  • a frame 220 illustrates adaptation of the image 200 using a player's device descriptor, which specifies, by way of example, a black and white M x N display.
  • the upper K rows of the frame 220 contain the image 200 converted to K x N pixels, and the bottom (M - K) rows are reserved for the value of the dynamic variable 215.
  • each black or white pixel is represented by a single bit; a first bit sequence 250_1 represents the first row in the frame 220, and the last bit sequence 250_M represents the last row in the frame 220.
  • the placeholder 240 contains information related to the dynamic variable 215.
  • the rendering process consists of 4 steps, each step working with one layer of the content.
  • a Screen Layer is a parent layer upon which other layers are drawn; it includes the background area within the bounds of the actual display. Any part of a layer that is outside the bounds of the screen layer will not be visible. The actual size of this layer in pixels is determined by the target display.
  • the screen layer is in screen space, which has coordinates (0,0) to (device resolution width - 1, device resolution height - 1).
  • An Image Layer contains the asset/source image/animation/video drawn onto the screen layer using an affine transformation defined by a view. Varying this view changes the position of the image on screen and can create animations.
  • the view is automatically generated at creation such that the image is scaled to fit completely within the screen space. This means the image can fill the screen horizontally and/or vertically, but always at least in one dimension, i.e. there may be empty borders around the image.
  • the image layer is in image space which has coordinates (0,0) to (image width - 1, image height - 1).
  • a Text Layer is a sequence of text labels drawn on top of the image layer but relative to the screen layer, using an affine transformation defined by a view. Varying this view changes the position of the text on screen and can create animations. Note that the (0,0) coordinate of a text label is the left point on the text's baseline (i.e. the bottom left corner of the bounding rectangle).
  • a View layer gives the set of parameters that specify the scaling and translation transformations that must be applied to a source, for example one chosen from the asset store 205, when placed in a target space. Scaling is applied to the source's dimensions directly, but the translation specified by the minX and minY properties is relative to the target space. In other words, the result of applying a View is typically to scale the source first, and then place it at point (minX, minY) in the target space.
  • by way of example, the result may be a scaled source image with size 80x60 positioned at (10, 20) in target space.
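  • By way of illustration only, the scale-then-translate semantics of a View, including the automatically generated fit-to-screen view, might look as follows in a Python sketch (function and field names are assumptions):

      from dataclasses import dataclass

      @dataclass
      class View:
          scale_x: float
          scale_y: float
          min_x: int        # translation, expressed in target-space coordinates
          min_y: int

      def fit_to_screen(src_w: int, src_h: int, screen_w: int, screen_h: int) -> View:
          """Auto-generated view: scale the source to fit entirely within screen space,
          filling at least one dimension and centering it (borders may remain)."""
          s = min(screen_w / src_w, screen_h / src_h)
          return View(s, s, (screen_w - int(src_w * s)) // 2, (screen_h - int(src_h * s)) // 2)

      def apply_view(view: View, src_w: int, src_h: int):
          """Scale the source first, then place it at (min_x, min_y) in target space."""
          return (view.min_x, view.min_y, int(src_w * view.scale_x), int(src_h * view.scale_y))

      # An 800 x 600 source with scale 0.1 placed at (10, 20) yields an 80 x 60 image there:
      print(apply_view(View(0.1, 0.1, 10, 20), 800, 600))   # (10, 20, 80, 60)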
  • FIG. 3 provides an example of Text Labels drawn into screen space.
  • Table 1 lists the steps of the rendering pipeline.
  • the creation of a dynamic image file from a project file includes color reduction of all project assets to a player-specific color palette, for example in the RGB or RGBa color space, and encoding individual frames or frame portions to the raw data format compatible with the video controller of the target display.
  • assets in the form of text, images or video are adapted from their original format and color depth to that of the player, using a standard color reduction technique, such as a nearest neighbor algorithm or an error diffusion algorithm with a Floyd-Steinberg filter disclosed, for example, in U.S. Patent Nos. 6,844,882 issued Jan. 18, 2005 to Clauson, 7,171,045 issued Jan. 30, 2007 to Hamilton, and 6,201,612 issued Mar. 13, 2001 to Matsushiro et al., and a player-dependent palette, by way of example, a palette with 16 shades of one color or an 8-bit gray scale palette.
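  • As a minimal illustration of only one of the techniques mentioned above (nearest-neighbor reduction, not the patent's own code), mapping pixels to a player palette could be sketched as:

      from typing import List, Tuple

      Palette = List[Tuple[int, int, int]]

      def nearest_index(rgb: Tuple[int, int, int], palette: Palette) -> int:
          """Index of the palette color closest to rgb (squared-distance metric, an assumption)."""
          r, g, b = rgb
          return min(range(len(palette)),
                     key=lambda i: (palette[i][0] - r) ** 2
                                   + (palette[i][1] - g) ** 2
                                   + (palette[i][2] - b) ** 2)

      def reduce_colors(pixels: List[Tuple[int, int, int]], palette: Palette) -> List[int]:
          """Map every RGB pixel to a player color index."""
          return [nearest_index(p, palette) for p in pixels]

      gray16 = [(17 * i,) * 3 for i in range(16)]        # a 16-shade gray palette
      print(reduce_colors([(0, 0, 0), (130, 128, 127), (255, 255, 255)], gray16))   # [0, 8, 15]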
  • the palette is indexed by attaching an index to a color, wherein the index is typically a 1-, 2-, 4- or 8-bit value.
  • the player is configured to exhibit a gamma curve as smooth as possible by plotting electrical characteristics of the display for each color index and adjusting the palette so that the difference in apparent intensity as perceived by the human eye between any two consecutive color indices is constant. Then each color of the palette is converted into the RGB metric.
  • this technique eliminates the need for gamma correction on the player
  • each pixel has an RGB value available in the palette of the target player.
  • encoding a frame converts the image data to the raw data format by converting RGB values into color indexes, followed by a byte packing algorithm that aligns the color indexes within a byte stream. The result is a raw byte sequence for each frame that can be streamed directly to the display's video controller.
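  • A minimal sketch of this encoding step for a 4-bit-per-pixel display follows; the packing order (two indexes per byte, high nibble first) is an assumption, since the real order is dictated by the target video controller:

      from typing import List

      def pack_4bit(indexes: List[int]) -> bytes:
          """Pack 4-bit color indexes two per byte, high nibble first (assumed order)."""
          if len(indexes) % 2:
              indexes = indexes + [0]                  # pad odd-length rows
          return bytes(((hi & 0x0F) << 4) | (lo & 0x0F)
                       for hi, lo in zip(indexes[0::2], indexes[1::2]))

      def encode_frame(index_rows: List[List[int]]) -> bytes:
          """Concatenate packed rows into one raw frame byte sequence, row-major."""
          return b"".join(pack_4bit(row) for row in index_rows)

      raw = encode_frame([[0, 15, 8, 8], [15, 0, 1, 2]])
      print(raw.hex())                                 # 0f88f012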
  • the same algorithm of encoding assets into DIG format is used in the content choosing step 110 for simulating the experience of a user viewing the DS player.
  • in a query step 130, values of dynamic variables to fill the placeholder fields in the dynamic image file are obtained, preferably by querying a database 245, which can be a part of the DS system or an external database such as a retail backend database.
  • the current values of dynamic variables are provided by a central distribution system or by a local backend system such as a store back office.
  • the values are provided manually.
  • the DS system synchronizes the displayed values with the database so as to change the displayed content as some of the dynamic variables change their values stored in the database.
  • in a creating static image file step 140, the dynamic image file and the values obtained in the query step 130 are combined into a static image file (SDIG) containing a frame sequence having the player-specific format dependent on the parameters of the device descriptor, so that the resulting static file can be played back by the player without decoding, as will be discussed further in this specification.
  • All placeholders are substituted with the values obtained in the query step 130 converted to pixel data, as described above in reference to the dynamic file creation step 120, and merged with pre-formatted frames. This process occurs every time a dynamic DIG file changes or every time a value of a dynamic variable changes. In the instance of a dynamic variable used by several DIG files, following the change of this variable all the files are updated and sent to the corresponding players.
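  • A simplified sketch of this merge step is given below; the structures shown are assumptions used only to convey the idea of substituting placeholder regions with rendered values, and a routine of this kind would be re-run whenever a dynamic variable changes:

      from dataclasses import dataclass
      from typing import Callable, Dict, List

      @dataclass
      class Placeholder:
          variable: str                           # dynamic variable name, e.g. "price"
          frame_index: int                        # frame containing the reserved region
          offset: int                             # byte offset of the region in that frame
          render: Callable[[str], bytes]          # converts a value to raw pixel bytes

      def build_sdig(frames: List[bytearray],
                     placeholders: List[Placeholder],
                     values: Dict[str, str]) -> List[bytes]:
          """Merge pre-formatted frames with rendered dynamic values into static frames."""
          for ph in placeholders:
              pixels = ph.render(values[ph.variable])
              frames[ph.frame_index][ph.offset:ph.offset + len(pixels)] = pixels
          return [bytes(f) for f in frames]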
  • the static image file contains frame data in a device-specific way, in a format dependent on parameters of the device descriptor, and does not require any decoding at the player. From the very beginning, content is created for a target player by a content creation tool described later.
  • the frame data is a sequence of bytes which can be sent directly from the player's memory to the video controller of the display without analysis or processing, such as decompression, resizing, padding, color conversion, gamma-correction, etc.
  • the data itself is a set of color indexes representing pixels, sequenced for being sent directly from a buffer to the video controller of the display, and packed into bytes, wherein the color indexes, sequencing order, and packing method are defined by and are specific to the player.
  • the static image file 260 includes bit sequences 250_1 - 250_K copied from the dynamic file 230, and bit sequences 250_K+1 - 250_M containing the value 247 of the dynamic variable 215 in pixel form.
  • a step of defining distribution parameters 150 defines playback schedule, target product, geographical and other parameters.
  • the defining distribution parameters step 150 is optional: in the instance of a DS system providing the same frame sequence to all the DS players having the same device descriptor, the step of defining distribution parameters 150 is not necessary. Choice of the DS distribution parameters denoted by step 150 can be performed at any time within the timeframe of the method until the moment these parameters are used in step 160.
  • the distribution step 160 is for distributing content and schedules over the Internet or any IP-based network to multiple individual site controllers, each managing multiple players at a single location, such as a store.
  • the site controllers will be discussed in more detail further in this specification.
  • the content is distributed in the form of dynamic image files including one or more placeholder fields for receiving values, whereas each site controller provides the values independently; relative to FIG. 1, the distribution step 160 is performed after the step of creating a dynamic file 120 and before the query step 130.
  • the content is distributed in the form of static image files wherein all the placeholders are already substituted with queried values, so that the distribution step 160 is performed after the creating static file step 140. In a very small configuration of the system including only one site controller performing the steps of content creation 110-140, the distribution step 160 is not performed.
  • the values of dynamic variables can also be transmitted as part of the distribution process if these values are imported, received, or synchronized by the distribution system.
  • the transmission of content and dynamic values is usually independent since they change separately and asynchronously.
  • a content delivery step 170 includes transmitting static image files over a wireless LAN (WLAN) from the site controllers to multiple players.
  • the static image files are compressed before the transmission.
  • the content is loaded in accordance with the campaign schedule defined at the step 150 or after the static image file has been updated. Content delivery, or load, is initiated by the site controller.
  • upon receiving a static image file containing a frame sequence over the WLAN, a player stores this content in local, non-volatile memory, in a content storing step 180. This process can include decompression, which occurs only once per transmission and is performed after the transmission has completed.
  • a microprocessor within the player reads its local nonvolatile memory and sends the frame data directly to the display's video controller. No processing is performed on the frame data, in particular, no steps of decompression, resizing, padding, color conversion, and gamma-correction are performed on the data.
  • FIG. 2 shows image 270 displayed by the player.
  • a system for creating and displaying a plurality of frame sequences shown in FIG. 4 includes a core system 300, a plurality of RF transmitters, and a plurality of players, wherein only one RF transmitter 320 and one player 301 are shown.
  • the core system 300 creates graphical content in the form of frame sequences and wirelessly loads it into the players 301 for displaying.
  • the core system 300 includes memory 305 having at least two parts: a core instruction memory 306 storing a core system instruction set for converting images into the plurality of frame sequences, the asset store 205, and a descriptor memory 307 for storing one or more device descriptors associated with the players 301 so that each player 301 is associated with one device descriptor.
  • the examples of memory components 305 and 307 include random access memory (RAM), non-volatile memory such as a read-only memory (ROM), flash memory, Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read Only Memory (EEPROM), etc., and a disk storage device.
  • a disk storage device can include any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), a DVD, a DVD+RW, and the like.
  • the core system 300 also includes a core system processing means 310 such as one or more processors, microprocessors, controllers, and the like, for executing the core system instruction set and forming the plurality of frame sequences.
  • the core system instruction set 400 stored in the core instruction memory 306 includes an image converter 401 for converting an image into a part of a frame for displaying on a screen of the player 301.
  • the image converter 401 executed by the processing means 310 is employed in the dynamic image file creating step 120 and the static image file creating step 140.
  • the image converter 401 converts the upper part of the screen 210 during the dynamic image file creating step 120, and the bottom part of the screen 220 during the static image file creating step 140.
  • the image converter 401 includes: first instructions 410 for providing frame pixels based on the image, wherein a number of the frame pixels is equal to a number of screen pixels in a corresponding part of the display screen; and second instructions 420, implementing one of the color reduction algorithms discussed in reference to step 120, for reducing the number of colors of the image not to exceed the color resolution of the player 301 and for color-indexing the image pixels; wherein the number of screen pixels, the number of colors, and the palette of the player 301 are obtained from the device descriptor associated with the player 301; so that the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction, when displaying the frame sequence.
  • the core system instruction set 400 includes design instructions 430 for providing a dynamic image file including one or more placeholder fields for receiving values, executed at the creating dynamic image file step 120; query instructions 440 for providing the values for the placeholder fields, executed at the query step 130; and a final converter 450 for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor, executed at the creating static image file step 140.
  • the aforedescribed image converter 401 is called by the design instructions 430 and the final converter 450.
  • the core system instruction set 400 includes campaign manager 460 for defining DS distribution parameters, and supporting scheduled delivery of frame sequences to the players.
  • the method illustrated by FIG. 1 may be simplified to disallow use of dynamic variables, thereby excluding the creating dynamic file step 120 and the query step 130.
  • the core instruction set 400 does not include the design instructions 430 and the query instructions 440.
  • the core system 300 can reside on one computer or be a distributed system, wherein parts of the system 300 communicate over a network.
  • the core system 300 is a distributed system 300_1 consisting of an animator 460, a distribution portal 470, and one or more site controllers 480 spatially separated from the rest of the system 300_1.
  • the animator 460 is a content creation tool for micro-advertisements and micro-promotions, designed as a creative tool for non-professionals and as an adaptation and conversion tool for content creation professionals.
  • the animator 460 is for executing at least a part of the design instructions to perform the content choosing step 110, and, optionally, the steps 120 -150 of creating the dynamic file, querying at least some of the dynamic variables, creating the static image file if all the dynamic variables are found, and defining the distribution parameters.
  • the animator 460 is the MicroAnimator software implemented by DigiCharm Inc.
  • the distribution portal 470 is for performing the distribution step 160 and providing centralized management of all players 301 on all sites, regrouped by customer, region, store, product, etc., as well as advertisement and promotion campaign management, including scheduling and regional targeting.
  • the distribution portal 470 includes the database 245.
  • the distribution portal 470 may be an external application integrated into the architecture shown in FIG. 6 by interfacing with the animator 460 and the site controllers 480. Most existing Content Distribution systems can be used, for example such as disclosed in U.S. Patent No. 6785704.
  • the animator 460 and the distribution portal 470 form together a design station, either distributed or residing in one location.
  • the design station has the design instructions 430 stored in a design station memory and a design station interface for providing the dynamic image file or the static image file over a network to the site controllers 480, such as an Ethernet card or chip, a wireless transceiver, or any other interface.
  • the query instructions and the final converter can be stored at the design station or at the site controller.
  • Each site controller 480 is an embedded PC or small server running a software application that manages an entire location, such as a store.
  • the site controller 480 interfaces with the distribution system 470 to receive and store content and schedules for a single site.
  • the site controller 480 has first communication means for communicating with the design station over the network, including an interface, such as an Ethernet card or chip, a wireless transceiver, or any other interface, and software such as a TCP/IP stack.
  • the site controller 480 also has second communication means for communicating with at least one of the RF transmitters, located separately from the site controller, over a fiber, cable, wire link, or the like, which is preferably a USB interface, and can be an Ethernet or any other interface.
  • different parts of the two communication means are software instructions providing different addressing and different treatment of received data.
  • the site controller 480 is responsible for finalizing the static image file, if it has not been done yet by other components of the system, and, for this purpose, receives the dynamic image file and the remaining dynamic variables from the distribution system 470.
  • the dynamic variables are queried from the database 245, or from a local backend system such as the store back office, and/or the site controller 480 provides an interface for manual input of values for the dynamic variables. After values for all the dynamic variables are provided, the site controller converts the dynamic image file into the static image file.
  • the site controller 480 distributes the static image file(s) to the players 301 based on the schedules received from the distribution system 470, manages and monitors the players 301 within the location.
  • for converting a plurality of frame sequences into a player-specific format, the site controller has the design instructions, the query instructions, and the final converter instructions stored in the memory of the site controller and processing means for executing these instructions, as well as a device descriptor memory for storing one or more device descriptors, so that each player is associated with one device descriptor; and each frame sequence received by the player 301 has a format dependent on parameters of the device descriptor associated with the player 301; so that the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction on said frame sequence.
  • the site controller 480 has at least one port for communicating with the animator 460 and/or the distribution portal 470 over a network, and at least one port for communicating with at least one of the RF transmitters, located separately from the site controller, over the fiber, cable, or wire link, by way of example a USB port.
  • the site controller 480 is connected to a conventional Digital Signage infrastructure and provides distribution of the content and schedules supplied by the infrastructure to the players 301 and reports playback statistics and errors to the central system.
  • the site controller 480 includes an RF transmitter for communication with the players 301.
  • the site controller 480 controls a wireless local area network (WLAN) 490 formed by the plurality of RF transmitters 320 for providing a wireless connection between the site controller 480 and the players 301.
  • the WLAN 490 has a two-layer star topology shown in FIG. 7, wherein one or more RF transmitters 320, also referred to as gateways 320, are connected to the site controller 480. Since each wireless hop has a high impact on total throughput, the WLAN 490 has wireless hops 485 only between the gateway 320 and the players 301.
  • Each gateway 320 is a combination of hardware and firmware supporting and bridging two communication protocols.
  • the gateway 320 includes two types of communication means: a first interface 710 for communicating with the site controller 480 which is a part of the core system 300, over a fiber, cable or wire link 495; and a second interface 720 for wirelessly transmitting the plurality of frame sequences to the players 301.
  • the first interface 710 can be a USB interface, or an Ethernet card or chip, or the like, and the second interface 720 is a wireless transceiver.
  • Processing means 730 are for protocol translation between wireline and wireless connections.
  • the WLAN 490 supports the IEEE 802.15.4 wireless standard and is segmented into sub-networks, or zones, each having a radius of approximately 10 meters and a number of players ranging between 20 and 1000, dependent on the geometry of the location and the size and renewal rate of the content, wherein the gateway 320 is disposed at the center of the zone.
  • the number of players in each zone should be balanced across the zones.
  • the gateway 320 supports the IEEE 802.15.4 wireless standard and transmits using one or more channels selected out of the 16 available channels. In order to reduce interference, the channels are selected so as to avoid having two adjacent or overlapping zones communicating on the same channel.
  • all the gateways 320 support IEEE 802.15.4 standard for communication with the plurality of players.
  • the site controller 480 includes means for dynamically managing sub-networks formed of the plurality of players in proximity to each of the RF transmitters connected to the site controller 480, implemented as a network management software.
  • the site controller 480 manages all the sub-networks, including discovery of devices in the network, maintains adjacency lists specifying communication distances between two devices on the network and dynamically forms sub-networks.
  • the site controller 480 monitors the local WLAN 490, collects playback statistics for the location, and provides the playback statistics and monitoring information including any detected errors to the distribution system 470.
  • the site controller 480 provides load balancing, distributing the content among the gateways 320 to achieve on the wireless links 485 as high as possible a "link quality" as defined by the IEEE 802.15.4 standard, so that the WLAN 490 is a network having the site controller 480, a plurality of gateways 320 each connected over the fiber, cable, or wire link 495 to the site controller 480, and a plurality of displays 301, each wirelessly connected using the IEEE 802.15.4 protocol to one of the gateways 320, wherein at least one of the displays 301 is a dual-homed display 301_1 so that it can receive wireless signals from two of the gateways 320_1 and 320_2, and wherein the site controller performs load balancing by sending messages to the dual-homed display 301_1 via the less busy of the two gateways.
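  • A minimal sketch of such a load-balancing choice is shown below; the data structures and the tie-breaking on link quality are assumptions, not details from the patent:

      from typing import Dict, List, Tuple

      def choose_gateway(player_id: str,
                         reachable: Dict[str, List[str]],           # player -> gateways in range
                         pending_bytes: Dict[str, int],             # gateway -> queued payload size
                         link_quality: Dict[Tuple[str, str], int]   # (gateway, player) -> LQI
                         ) -> str:
          """Pick the least busy gateway that can reach the player; break ties by link quality."""
          return min(reachable[player_id],
                     key=lambda gw: (pending_bytes.get(gw, 0),
                                     -link_quality.get((gw, player_id), 0)))

      gw = choose_gateway("player-301",
                          {"player-301": ["gw-320-1", "gw-320-2"]},
                          {"gw-320-1": 50_000, "gw-320-2": 12_000},
                          {("gw-320-1", "player-301"): 200, ("gw-320-2", "player-301"): 180})
      print(gw)                                        # gw-320-2, the less busy of the two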
  • the site controller 480 provides authentication of players using an Access Control List (ACL), message integrity, and sequential freshness to avoid replay attacks.
  • the site controller 480 implements a protocol stack 610 shown in FIG. 9.
  • the link 495 between the site controller 480 and the gateway 320 is governed by Universal Serial Bus (USB) standard.
  • the link 495 is governed by the RS-232 or RS-485 standards developed by the Electronic Industries Association, or by TCP/IP.
  • the link 495 supports Ethernet protocol or any other physical and link layer protocols satisfying throughput requirements of the system.
  • the protocol stack 610 employs a distributed network protocol (DNP) for providing end-to-end packet transmission from the site controller 480 to the player 301.
  • a distributed network control protocol (DNCP) is for providing management functionality for the network itself, including network formation, monitoring, error management and channel allocation. According to the instant invention, the DNCP is used for managing both the gateway 320 and the player 301.
  • the DNP and DNCP transmit messages between the Site Controller 480 and MicroPlayers 301, enabling the Site Controller 480 to discover Gateways 320 as they become connected or disconnected, and to query the gateway 320_1 for a list of connected MicroPlayers and for a list of neighboring Gateways within communication range of the gateway 320_1, thus providing the site controller 480 with the means for dynamically managing sub-networks. Gateways within communication distance from one another should use different channels to maximize bandwidth usage.
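  • A greedy channel assignment that keeps neighboring gateways on different channels could be sketched as follows; the strategy and data shapes are assumptions, and only the 16-channel constraint comes from IEEE 802.15.4:

      from typing import Dict, List

      CHANNELS = list(range(11, 27))        # the 16 channels of the 2.4 GHz IEEE 802.15.4 band

      def assign_channels(neighbors: Dict[str, List[str]]) -> Dict[str, int]:
          """Give each gateway the lowest channel not used by an already-assigned neighbor."""
          assignment: Dict[str, int] = {}
          for gw in sorted(neighbors):                          # deterministic order
              used = {assignment[n] for n in neighbors[gw] if n in assignment}
              assignment[gw] = next(ch for ch in CHANNELS if ch not in used)
          return assignment

      print(assign_channels({"gw-1": ["gw-2"], "gw-2": ["gw-1", "gw-3"], "gw-3": ["gw-2"]}))
      # {'gw-1': 11, 'gw-2': 12, 'gw-3': 11}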
  • the DNP and DNCP provide a reliable communication channel with authentication, message integrity, sequential freshness, and access control enabling the players 301 to communicate only with the gateway 320.
  • alternatively, the Simple Network Management Protocol (SNMP) may be used as a DNCP.
  • the TCP/IP stack is used as a DNP.
  • the protocol stack 610 includes a distributed application protocol (DAP) supporting management of the players 301 and transmission of content thereto, independently of the underlying communication technology.
  • the DAP is used by the site controller 480 to communicate with the players 301 to transport content and commands to the players 301.
  • Each of the players 301 has its own DIG Address, and a particular player 301_1 would only pick up messages that are meant for its specific address, allowing targeted messaging on shared communication channels.
  • Each of the players 301 may have different configuration parameters and support different operations.
  • the protocol uses a general block type to handle setting any name-value pairs.
  • Each name-value pair is called a property and has a unique key.
  • the player 301 has a list of property keys it supports and would receive Set Property operations for those keys only.
  • Each DAP operation is encapsulated in a block and all blocks have common structural elements, including a special type code identifying the operation.
  • the DAP blocks are built at the site controller 480 and are transmitted to the player 301 in messages of an underlying protocol, such as TCP or UDP over IP.
  • Each operation block includes an operation identifier; an address this block is meant for; a data size, which is the number of bytes in the data field of the block; and data having a structure defined by the operation type.
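  • Purely as an illustration of this block layout (the field widths, byte order, and type codes below are assumptions; the patent does not specify them), DAP blocks could be encoded and parsed as follows:

      import struct

      OP_PING, OP_LOAD_DIG, OP_ERASE, OP_RESET, OP_SET_PROPERTY = range(5)   # assumed type codes
      HEADER = ">BHI"    # 1-byte operation, 2-byte address, 4-byte data size (assumed widths)

      def build_block(op: int, address: int, data: bytes = b"") -> bytes:
          """Operation identifier, destination address, data size, then the data itself."""
          return struct.pack(HEADER, op, address, len(data)) + data

      def parse_block(raw: bytes):
          op, address, size = struct.unpack_from(HEADER, raw)
          start = struct.calcsize(HEADER)
          return op, address, raw[start:start + size]

      ping = build_block(OP_PING, address=0x0042)                       # no data, data size 0
      load = build_block(OP_LOAD_DIG, 0x0042, b"\x0f\x88\xf0\x12")      # raw frame bytes as payload
      print(parse_block(load))                                          # (1, 66, b'\x0f\x88\xf0\x12')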
  • the DAP supports the following operations:
  • Ping operation is used to determine the presence of the player 301 and to initialize the connection with it; the meaning of initializing the connection depends on the communication medium being used.
  • the block has no data and a data size of 0.
  • Load DIG Content operation sends a piece of playable content, i.e. a portion of the frame sequence, to the player 301 in the form of a DIG frame sequence for storing in the buffer 340.
  • the data type is raw binary data and the data size is the size of the frame sequence.
  • the player 301 loads the received portion of the frame sequence into the buffer 340.
  • Erase Content operation causes the player 301 to remove all content from the buffer 340.
  • the block has no data and a data size of 0.
  • Reset Device operation causes the player 301 to reset.
  • the block has no data and a data size of 0.
  • Set Property operation advises the player 301 to set a new value for a property specified by its key.
  • the data portion of the block consists of two fields: a Property Key field of a fixed length, and a Value field having length and type dependent on the Property Key, wherein the Property Key represents the property, and the Value provides a new value for the property.
  • a property can be a contrast parameter of the display 370.
  • the DAP supports a 1-way mode of operation, in which the player 301 can receive data but cannot send anything back as a response.
  • in this mode, the player 301 may display different visual indications to the user. For example, the player 301 provides a visual cue that it was pinged.
  • alternatively, the DAP supports a 2-way mode of operation, wherein the player 301 is capable of sending response messages indicating success or failure of a requested operation.
  • the gateway 320 implements a protocol stack 620, including USB or another standard for communication over the link 495, and IEEE 802.15.4 standard for wireless communication between the gateway 320 and players 301.
  • the players 301 implement a protocol stack 630 including the IEEE 802.15.4 standard.
  • the player 301 includes an RF receiver 330 for receiving one of the frame sequences; a buffer 340, also referred to as player memory 340, for storing the received frame sequence; a display 370 and a video controller 371 associated with the display 370; and processing means 360.
  • the RF receiver 330 is a single IEEE 802.15.4 transceiver, for example, MC13202FC commercially available from Freescale Semiconductor, Inc.
  • the buffer 340 may be flash memory, EPROM, or EEPROM.
  • the processing means 360 can be a general purpose processor, one or more microcontrollers, one or more FPGAs, and a combination thereof.
  • the processing means 360 is a single 8-bit microcontroller, also available from Freescale Semiconductor, Inc., and the player 301 has no other processing means besides the 8-bit microcontroller 360, the display microcontroller 371, and a transceiver controller.
  • the display 370 is a small Organic Light Emitting Diode (OLED) display or a Liquid Crystal Display (LCD), such as available from OSRAM Opto Semiconductors.
  • the display 370 is an OLED display having a 128 x 64 resolution and a 16-shade gray scale palette.
  • the players 301 of the instant invention have a small screen, with a diagonal size of less than 4 inches, a screen resolution, denoted as M x N in FIG. 2, not higher than 320 x 240, and a color depth of preferably 4 bits or less. By way of example, it takes less than half an hour to load 1 MB of media content to 10,000 players divided into 25 zones.
  • upon receiving a message containing a portion of the Load DIG Content block, the firmware running on the microcontroller writes the received portion of the frame sequence directly and linearly into the buffer 340 without any data manipulation, augmenting the previously written part of the frame sequence.
  • playback, the process of displaying a received frame sequence at the player 301, begins automatically if no DAP message is received within a predetermined short delay, such as 2 seconds.
  • the playback includes the following steps: setting the video controller 371 to the origin point of a display; positioning the player buffer 340 to the beginning of the frame sequence, wherein the firmware specifies the memory address of the beginning of the frame sequence in the player buffer; reading the frame sequence data one byte after another from the buffer 340, and sending each byte to the display 370.
  • in this embodiment, the display 370 has an 8-bit communication bus; otherwise, the frame data is read in portions of a different size dependent on the display communication bus.
  • the playback substantially consists of the above steps, meaning that only minor, less important steps are omitted, such as waiting for read or write operations to complete.
  • the playback as specified is made possible by two facts: (A) in response to the 'load' operation, the raw data of the frame sequence is written linearly into the buffer memory 340, so that two consecutively received data portions are written into sequential parts of the buffer 340, contrary to a conventional file system technique of decomposing data into blocks and storing these blocks in multiple locations not necessarily adjacent to each other; and (B) each frame in the frame sequence received by the player 301 has the format dependent on parameters of the device descriptor associated with the player 301; in particular, each frame contains the same number of pixels as the display 370. Accordingly, the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction on the received frame sequence.
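  • The playback loop described above amounts to streaming stored bytes straight to the video controller; a minimal sketch, with an assumed controller interface, is:

      from typing import Protocol

      class VideoController(Protocol):        # assumed interface of the display's controller
          def set_origin(self) -> None: ...
          def write_byte(self, b: int) -> None: ...

      def playback(buffer: bytes, start: int, length: int, vc: VideoController) -> None:
          """Send the frame sequence byte by byte; no decompression, resizing, padding,
          color conversion or gamma correction is performed."""
          vc.set_origin()                             # point the controller at the display origin
          for b in buffer[start:start + length]:      # read one byte after another from the buffer
              vc.write_byte(b)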
  • the core system 300 includes a mobile controller 481, which can be a PDA, tablet PC, or portable device specialized for retail, executing a software application for configuring the players 301 on site, by connecting to the site controller.
  • the mobile controller 481 connects to the site controller 480 using the WLAN 490 or a separate overlaid network such as a Wi-Fi network supporting IEEE 802.11 standard.
  • partial rendering of content is used to update only a part of the static image file already loaded into the player 301, for example if only the price variable 215 changes and the rest of the frame sequence stays the same.
  • the site controller 480 can transfer and replace only the changed frames at the player; therefore the site controller 480 manages the buffer 340 of the player 301 on a byte level. This is implemented by adding storage byte-addressing to the application protocol which handles loading, or by dividing SDIG files into multiple chunks and addressing by chunks.
  • the site controller 480 includes means for remote management of the buffer for replacing a portion of the frame sequence stored in the buffer: a memory map for the player 301, identifying the current SDIG file(s) loaded on the player 301 and the memory locations where these file(s) are stored at the player 301; and a map matching dynamic variables with frames in these SDIG files.
  • as it re-renders new SDIG files, the Site Controller identifies the frames or portions of frames that have actually changed. The Site Controller then transfers only the changed frames or portions to the appropriate players using an addressable load content command.
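  • A sketch of this differencing step is shown below; the fixed chunk size and the (offset, data) command shape are assumptions standing in for the addressable load content command:

      from typing import List, Tuple

      CHUNK = 256    # assumed chunk size in bytes

      def changed_chunks(on_player: bytes, re_rendered: bytes) -> List[Tuple[int, bytes]]:
          """Return (byte offset, data) pairs for chunks that differ after re-rendering."""
          updates = []
          for off in range(0, len(re_rendered), CHUNK):
              chunk = re_rendered[off:off + CHUNK]
              if on_player[off:off + CHUNK] != chunk:
                  updates.append((off, chunk))
          return updates

      old = bytes(1024)
      new = bytearray(old)
      new[300:304] = b"\x01\x02\x03\x04"               # e.g. only the price region changed
      print([off for off, _ in changed_chunks(old, bytes(new))])   # [256]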
  • the storage capacity of the player 301 is doubled, providing the ability to preload or buffer content on the player, without replacing existing content. Then, at a specific time, a simple and very fast command can be sent to players to switch to the new content.
  • the low price of storage allows this approach to be taken without greatly increasing the price or complexity of players.
  • the SDIG file is compressed at the site controller 480 before transmission to the player 301.
  • the player 301 stores the compressed content, and after the transfer is complete, the player 301 decompresses the loaded content into an unused portion of storage, retrieving the original static image file. The following playback is not affected by the compression.
  • in step 640, the site controller 480 loads a first frame sequence into the first buffer 340 of the player 301, and the player 301 starts playing the content of the first frame sequence, step 645. Then, the site controller 480 compresses a second frame sequence and sends it to the player 301, step 655.
  • in step 660, the player 301 receives and stores the compressed file in an unused portion of its storage, a second buffer not shown in FIG. 4, while the player 301 continues playback of its current content.
  • the site controller 480 sends a short command to the player, instructing it to switch to the new still compressed content, step 665.
  • in step 670, the player stops playback of the current content and decompresses the second SDIG file from the second buffer into the first buffer 340, on top of the current content, overwriting it.
  • in step 680, the player 301 starts playback of the newly decompressed content and now has a free buffer.
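  • The preload-and-switch sequence of steps 640-680 can be summarized by the following sketch; the Player class is an assumption, and zlib merely stands in for whatever compression the system actually uses:

      import zlib

      class Player:
          def __init__(self) -> None:
              self.active = b""                        # first buffer: content being played
              self.staged = b""                        # second buffer: compressed, preloaded content

          def load_compressed(self, data: bytes) -> None:      # step 660
              self.staged = data                               # playback of self.active continues

          def switch(self) -> None:                            # steps 665-680
              self.active = zlib.decompress(self.staged)       # decompress over the old content
              self.staged = b""                                # the second buffer is free again

      player = Player()
      player.active = b"\x00" * 4096                           # steps 640-645: first sequence playing
      player.load_compressed(zlib.compress(b"\x0f" * 4096))    # steps 655-660
      player.switch()                                          # steps 665-680
      print(len(player.active))                                # 4096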

Abstract

The present invention relates to a system for rendering and displaying graphical content in the form of a frame sequence, consisting of an animator, a distribution portal, one or more site controllers, and one or more wireless transceivers, each connected to one of the site controllers via a fiber, cable, or wire link. It also relates to a plurality of players wirelessly connected to the transceivers.
PCT/CA2007/002116 2006-11-23 2007-11-23 Système de signalisation numérique avec affichages sans-fil WO2008061372A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US86711806P 2006-11-23 2006-11-23
US60/867,118 2006-11-23

Publications (1)

Publication Number Publication Date
WO2008061372A1 true WO2008061372A1 (fr) 2008-05-29

Family

ID=39429351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2007/002116 WO2008061372A1 (fr) 2006-11-23 2007-11-23 Système de signalisation numérique avec affichages sans-fil

Country Status (1)

Country Link
WO (1) WO2008061372A1 (fr)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005038629A2 (fr) * 2003-10-17 2005-04-28 Park Media, Llc Systeme de presentation de support numerique
US20060015531A1 (en) * 2004-07-19 2006-01-19 Moshe Fraind Device and system for digital signage
CA2591305A1 (fr) * 2005-01-04 2006-07-13 Avocent California Corporation Systemes sans fil de radiodiffusion d'emissions en continu, dispositifs et procedes

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130076765A1 (en) * 2011-09-28 2013-03-28 Hakhyun Nam Image Data Displaying System and Method for Displaying Image Data
DE102015015542A1 (de) * 2015-11-16 2017-06-01 Infoscreen Gmbh Verfahren und digitales Beschilderungssystem zum Anzeigen von darstellbaren Zeichenketten auf verteilten Anzeigeeinrichtungen
DE102015015542B4 (de) * 2015-11-16 2017-10-19 Infoscreen Gmbh Verfahren und digitales Beschilderungssystem zum Anzeigen von darstellbaren Zeichenketten auf verteilten Anzeigeeinrichtungen


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07845581

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07845581

Country of ref document: EP

Kind code of ref document: A1