GB2408164A - Controlling a dynamic display apparatus - Google Patents

Controlling a dynamic display apparatus

Info

Publication number
GB2408164A
Authority
GB
United Kingdom
Prior art keywords
camera
audio signal
dab
view
display apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0326391A
Other versions
GB0326391D0 (en)
Inventor
Alastair Breward
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB0326391A priority Critical patent/GB2408164A/en
Publication of GB0326391D0 publication Critical patent/GB0326391D0/en
Priority to PCT/GB2004/004782 priority patent/WO2005048225A1/en
Publication of GB2408164A publication Critical patent/GB2408164A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 Advertisements
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00 Advertising or display means not otherwise provided for

Landscapes

  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Studio Devices (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Pan, tilt and zoom sensors are coupled to one or more television cameras at a live event in order to determine the field of view of the (or each) camera. This information is input to a computer containing data about the positions and orientation of one or more dynamic advertising billboards ("DABs") present at the live event, being used to display advertisements visible to cameras. The computer is programmed to detect occasions when a DAB is visible in the image being recorded by the (or any) camera. Subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to instruct each such DAB, when so visible, to change the advertisement it is displaying. The change occurs when the DAB is being viewed by the television camera, and hence (in most cases) by a far larger audience than might be viewing it but for its presence in the field of view of one or more cameras. It is generally established that the human eye is drawn to motion and therefore any advertisement which scrolls into view within a viewed scene is more effective than one already on display in that scene.

Description

Methods of, and Systems for, Controlling a Dynamic Display Apparatus
Field of the Invention
This invention relates to methods and systems for controlling a dynamic display apparatus for use at live events.
Background of the Invention
Live events, such as sporting events, frequently attract large audiences, both physically and by means of television broadcast. For this reason, still media such as picture advertisements are typically displayed at the venue, where they will be seen by these audiences. A proportion of these advertisements are not displayed continuously but are alternated mechanically or electronically. For example, several advertisements may be printed onto a single loop of material which is then scrolled so that individual advertisements are viewed in turn.
Alternatively, the multiple advertisements are electronically stored and displayed in turn on a single display device.
In part this allows sharing of the one physical venue, and in part the technique draws the attention of the audience, since it is well known that the human eye is drawn to moving elements in a relatively static background, and hence the advertisements are more noticed and therefore more effective. It is assumed that on average through the period of the event, a certain number of people will be looking sufficiently near to the display device for any motion on it to catch their attention and so cause them to see the advertisement.
It is an object of the invention to increase the effectiveness of display apparatus for advertisements, and other types of display apparatus, at live events.
Summary of the Invention
In accordance with one aspect of the invention, there is provided a method of controlling a display system at a live event, the display system comprising a dynamic display apparatus, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the dynamic display apparatus in response to the monitoring.
In accordance with a further aspect of the invention, there is provided a display system for use at a live event, the system comprising a dynamic display apparatus, means for receiving monitoring signals indicative of characteristics of a field of view of a camera at the live event, and means for controlling the dynamic display apparatus in response to the monitoring.
The present invention enables advertisers, or those providing advertising space to them, to increase the effectiveness of advertisements provided by means of the dynamic display apparatus. It does so by increasing the correlation between (i) the fact that a large number of people are, at a given moment, either seeing an image which contains the space where the advertisement is to be displayed (if viewing remotely via television) or looking in the general direction of that space (if present at the venue, on the assumption that, broadly, those present will look at the place where the action is, which is also where the cameras will be pointed), and (ii) the incidence of motion in that space which draws the eyes of those people, and/or the occurrence of other noticeable events (such as sound effects) related to the advertisement being, or about to be, displayed in that space.
In preferred embodiments of the invention, pan, tilt and zoom sensors are coupled to one or more television cameras at a live event in order to determine the field of view of the (or each) camera. Other methods to determine the field of view may be used. The field of view information is input to a computer containing data about the positions and orientation of one or more dynamic advertising billboards ("DABs") present at the live event, being used to display advertisements visible to cameras (and usually also to persons physically present at the event).

One example of a DAB is a billboard which contains two or more printed advertisements on a loop of material, and a device to scroll the material round periodically (or on some cue) so that a different advertisement is visible, thus alternating two or more advertisements in a single space. DABs may also use electronic display devices (either to display a sequence of individually still image advertisements, or to display a moving-image advertisement or a series of them in rotation). Such electronic display devices are capable of displaying different images in dependence upon an image signal input into the device.

The computer is programmed to detect occasions when a DAB is visible in the image being recorded by the (or any) camera. Subject to certain overriding logic to deal with various special circumstances, the computer is also programmed to instruct each such DAB, when so visible, to change the advertisement it is displaying. Examples of special circumstances include: (i) inhibiting or delaying instructions to take account of other events in the camera's field of view, such as key game incidents, or (ii) for multi-camera live events, polling a source of information about which camera is 'live' at any given moment.

The change occurs when the DAB is being viewed by the television camera, and hence (in most cases) by a far larger audience than might be viewing it but for its presence in the field of view of one or more cameras. It is generally established that the human eye is drawn to motion, and therefore any advertisement which scrolls into view within a viewed scene is more effective than one already on display in that scene. Where a DAB is displaying a moving-image advertisement, or a series of them, the computer could, instead of effecting a change to another advertisement, be programmed to initiate a change of state within the currently displayed advertisement. The computer may also output instructions to other systems, such as audio systems, so that an audio signal is triggered as well as or instead of a change in state of the DAB. Indeed, this latter kind of output may be combined with static as opposed to dynamic advertising billboards.
Brief Description of the Drawings
Figure 1 shows a perspective view of part of a sports arena, and certain physical elements relevant to use of the invention.
Figure 2 is a block diagram of a subset of the components which make up a camera sub-system arranged in accordance with an embodiment of the invention.
Figure 3 is a block diagram of a subset of the components which make up a dynamic advertising sub-system arranged in accordance with an embodiment of the invention.
Figure 4 is a flow chart describing the operation of television recording and dynamic apparatus arranged in accordance with an embodiment of the invention.
Figure 5 is a flow chart describing the operation of television recording and dynamic apparatus arranged in accordance with a further embodiment of the invention.
Detailed Description of Preferred Embodiments of the Invention
A first embodiment
A first embodiment of the invention will now be described. Figure 1 shows a typical live event venue, in this case a soccer stadium, including a camera and a DAB. For ease of illustration, it is assumed there are only one camera and only one DAB at the live event, but in practice there will generally be a number of each.
The camera is fitted with means repeatedly to detect which points in space are at any time visible through that camera. These points form an approximate cone ("view-cone"), whose axis is the line along which the camera is pointing, whose viewing angles (the angles made at the vertex of the view cone in horizontal and vertical planes, respectively, each such plane passing through the axis of the view-cone) are determined by the design of the camera and the degree of 'zoom' being used by the camera operator, and whose length may be either regarded as infinite or else arbitrarily set to a size greater than the longest distance the camera could view within the venue. Figure 1 shows the view-cone truncated by the ground and by the plane of the DAB only for ease of illustration. These truncations are shaded. In this embodiment of the invention, both the dimensions and orientation of the camera's view-cone are monitored continuously by frequent measurement of relevant data and re-computation.
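By way of illustration only, the following Python sketch shows one way such a view-cone might be derived from pan, tilt and zoom readings. It assumes a simplified camera model in which the half-angles scale inversely with zoom, and all names, default angles and the venue-sized length cap are invented for the example rather than taken from the description.

```python
import math
from dataclasses import dataclass


@dataclass
class ViewCone:
    apex: tuple          # camera position (x, y, z) in venue coordinates, metres
    axis: tuple          # unit vector along the line the camera is pointing
    half_angle_h: float  # horizontal half-angle of the cone, radians
    half_angle_v: float  # vertical half-angle of the cone, radians
    length: float        # arbitrary cap, longer than any sight line in the venue


def compute_view_cone(camera_pos, pan_deg, tilt_deg, zoom_factor,
                      base_fov_h_deg=54.0, base_fov_v_deg=40.0,
                      expander_2x=False, length=500.0):
    """Derive a view-cone from pan/tilt/zoom readings (illustrative only).

    base_fov_* stand for the camera's design angles of view at minimum zoom;
    zoom_factor and the 2X expander narrow them. The inverse-zoom scaling of
    the half-angles is a crude approximation, not a claim about real optics.
    """
    effective_zoom = zoom_factor * (2.0 if expander_2x else 1.0)
    half_h = math.radians(base_fov_h_deg / 2.0) / effective_zoom
    half_v = math.radians(base_fov_v_deg / 2.0) / effective_zoom

    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # Axis of the cone: pan rotates about the vertical axis, tilt raises or lowers it.
    axis = (math.cos(tilt) * math.cos(pan),
            math.cos(tilt) * math.sin(pan),
            math.sin(tilt))
    return ViewCone(camera_pos, axis, half_h, half_v, length)
```

In practice the base angles of view and the zoom-to-angle mapping would come from the camera manufacturer's data rather than the simple inverse-zoom approximation used here.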
Figure 2 is a block diagram of a subset of the components which make up the invention, namely one camera sub-system. Camera 140 has a zoom lens, including a 2X expander (range extender). Connected to camera 140 is a 2X expander/zoom/focus sensor 152 (collectively a "zoom sensor") which senses the zoom in the camera, the focal distance of the camera, and whether the 2X expander is being used. The analogue output of sensor 152 is sent to an analogue to digital converter 154, which converts the analogue signal to a digital signal and transmits the digital signal to processor 156. One alternative includes using a zoom sensor with a digital output, which would remove the need for analogue to digital conversion.
Camera 140 is mounted on tripod 144 which includes pan and tilt heads that enable broadcast camera 140 to pan and tilt. Attached to tripod 144 are pan sensor 146 and tilt sensor 148, both of which are connected to pan-tilt electronics 150. Alternatively camera 140 can include a built in pan and tilt unit. In either configuration, pan sensor 146, tilt sensor 148 and zoom sensor 152 are considered to be coupled to camera 140 because they can sense data representing the pan, tilt, zoom and focus of broadcast camera 140.
Processor 156 is an Intel Pentium processor with supporting electronics; however, various other processors could be substituted (provided the processor is powerful enough to compute view-cone data in time for the DAB control processor to issue update decisions in time for DABs to implement them while still being viewed in the associated cameras, as further explained below).
Processor 156 also includes memory and a disk drive to store data and software.
In addition to being in communication with pan-tilt electronics 150 and analogue to digital converter 154, processor 156 is in communication with a DAB control centre, which is described below (in relation to Figure 3).
In one embodiment, pan sensor 146 and tilt sensor 148 are optical encoders that output a signal, measured as a number of clicks, indicating the rotation of a shaft. Forty thousand (40,000) clicks represent a full 360° rotation.
Thus, a processor can divide the number of measured clicks by 40,000 and multiply by 360 to determine the pan or tilt angle in degrees. The pan and tilt sensors use standard technology known in the art and can be replaced by other suitable pan and tilt sensors known to those skilled in the relevant art. Pan-tilt electronics 150 receives the output of pan sensor 146 and tilt sensor 148, converts the output to a digital signal (representing pan and tilt) and transmits the digital signal to processor 156. The pan, tilt and zoom sensors are used to determine the view-cone of the camera. Thus one or more of the pan, tilt and zoom sensors can together be labelled as a view-cone sensor. For example, if a camera cannot tilt or zoom, the view-cone sensor would only include a pan sensor.
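As a minimal worked example of this click-to-angle conversion (assuming the 40,000 clicks per revolution figure given above), in Python:

```python
CLICKS_PER_REVOLUTION = 40_000  # 40,000 encoder clicks correspond to a full 360 degree rotation


def clicks_to_degrees(clicks: int) -> float:
    """Convert an optical-encoder click count into an angle in degrees."""
    return (clicks / CLICKS_PER_REVOLUTION) * 360.0


# e.g. 10,000 clicks correspond to 90 degrees of pan or tilt
assert clicks_to_degrees(10_000) == 90.0
```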
The pan, tilt and zoom sensors may be embodied as described above, but other embodiments are possible. Two examples are taught in International patent application number WO98/18261, pages 8-9 starting at line 21, namely (i) marking known locations in the venue such that each mark looks different and one will always be in view to a camera, so that an image thereof is detectable to the processor analysing the image recorded by the camera, or (ii) placing electromagnetic, infra-red or similar emitters around the stadium which are sensed by sensors on the cameras, again enabling computation of relative location and orientation.
Where multiple cameras are used, it may be practicable to reduce the number of processors used below the number of cameras, so that for example, a single processor 156 is linked to multiple cameras and computes the view-cones for all of them.
At some venues, cameras may be mobile, for example mounted on straight, delimited tracks parallel to the field of play. In such cases, it will be necessary to track the location of the camera mounting on the track, and take account of this when computing the location and orientation of the view-cone.
Figure 3 is a block diagram of a subset of the components which make up the invention, namely the DAB sub-system. Processor 200 is an Intel Pentium processor with supporting electronics; however, various other processors could be substituted (provided the processor is powerful enough to compute update decisions in time for DABs to implement them while still being viewed in the associated cameras, as further explained below). Processor 200 also includes memory 202, a disk drive 204 to store data and software, a VDU/keyboard/mouse 206, and a removable drive 208 or other means to load pre-created data, e.g. data defining different camera systems and different venue topographies. In more detail, the pre-created data would include the location and orientation in space of all DABs (in order to establish whether or not, at any given time, a DAB is in principle within a view-cone), the angle beyond which a DAB is no longer considered to be oriented to a camera, and any occlusions present. Certain DABs which are in principle 'in view' may be facing too obliquely to be seen effectively via the camera, or may be occluded.
For example, if a DAB is perpendicular to the line of view of a camera (and facing that way) it is clearly in view, and may be so when at an angle of 45 degrees to the perpendicular, but not when at an angle of 30 degrees, or behind a known obstacle.
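A possible visibility test along these lines is sketched below in Python. It treats the view-cone as a single circular cone for simplicity (the description uses separate horizontal and vertical viewing angles), applies a 45 degree obliqueness threshold as in the example above, and leaves occlusion by known obstacles as a separate check; all function and parameter names are illustrative.

```python
import math


def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def _norm(a):
    return math.sqrt(_dot(a, a))


def _unit(a):
    n = _norm(a)
    return tuple(x / n for x in a)


def dab_in_view(camera_pos, cone_axis, cone_half_angle, cone_length,
                dab_pos, dab_normal, max_obliqueness_deg=45.0):
    """Decide whether a DAB is usefully visible to a camera (sketch only).

    cone_half_angle is in radians (as in the ViewCone sketch above); the
    45 degree obliqueness threshold is user-configurable pre-created data.
    """
    to_dab = tuple(d - c for d, c in zip(dab_pos, camera_pos))
    distance = _norm(to_dab)
    if distance == 0 or distance > cone_length:
        return False

    # Inside the cone: angle between the cone axis and the camera-to-DAB line.
    cos_off = max(-1.0, min(1.0, _dot(_unit(to_dab), _unit(cone_axis))))
    if math.degrees(math.acos(cos_off)) > math.degrees(cone_half_angle):
        return False

    # Oriented towards the camera: the DAB's facing direction must not be
    # too oblique to the DAB-to-camera line.
    to_camera = tuple(-x for x in to_dab)
    cos_face = max(-1.0, min(1.0, _dot(_unit(dab_normal), _unit(to_camera))))
    return math.degrees(math.acos(cos_face)) <= max_obliqueness_deg
```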
Collectively, this processor and peripherals may be called the DAB control centre. Processor 200 is connected to one or more DABs 210 by means enabling the processor to instruct each DAB to update its display, and ideally also enabling the DAB to inform the processor whenever its display is updated.
The processor is also connected to the (or each) camera sub-system. The processor may also be connected to other systems, such as position sensing systems for specific objects of significance such as sports balls, or audiovisual editing studios, or systems such as joysticks or customised consoles for inputting manual change commands to over-ride those generated by the processor.
Figure 4 is a flow chart describing the operation of the invention, when simplified to comprise only one camera and one DAB. In step 302, pan, tilt and zoom data is sensed. In step 304, the view-cone (that is, its size, shape, location and orientation) is determined using the data sensed in step 302 together with other stored data, including the length of the view-cone, the position of the camera within the venue and the rest angle of view of the camera, and is communicated to the DAB control centre. In step 306, processor 200 determines whether the DAB is within the view-cone of the camera. In step 308, if the DAB is within the view-cone, the processor analyses whether there is any over-riding reason not to issue an instruction to the DAB to update its image. The most basic reason not to update would be that the processor's record of past updates and timings shows that the DAB has been instructed to update within the last few seconds (where the precise number of seconds is user determined). Another reason would be that the processor has received information from the DAB that it has self-updated recently, in which case at this step the processor updates its own records of past updates and timings. Another reason might be, if the relevant event were a soccer match and the soccer ball contained a position recording device whose output was available to the DAB control centre, that the soccer ball was in the view-cone, stationary and located on the penalty spot while the DAB was located behind the goal area, so that an update to the DAB would potentially interfere with the concentration of the player taking the penalty kick. Such object tracking may be accomplished by various technical means, including treating the ball or other object with spectral coatings so that it will reflect (or emit) a distinct signal which can be detected by the camera(s) with a filter or other sensor. In step 310, if the answer to whether to veto was negative, then processor 200 instructs the DAB to update, and also updates its own records of past updates and timings.
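The Figure 4 loop could be realised along the following lines. This is a sketch only: the camera, DAB and ball-tracker objects, their method names, and the ten-second threshold are all assumptions, and the injected compute_cone and dab_in_view callables stand in for computations such as those sketched earlier.

```python
import time


class SingleCameraController:
    """Illustrative rendering of the Figure 4 loop (steps 302-310)."""

    MIN_UPDATE_INTERVAL_S = 10.0  # the 'last few seconds' threshold of step 308 (user determined)

    def __init__(self, camera, dab, compute_cone, dab_in_view, ball_tracker=None):
        self.camera = camera              # assumed to expose read_pan_tilt_zoom() and .position
        self.dab = dab                    # assumed to expose update() and .behind_goal
        self.compute_cone = compute_cone  # view-cone computation, e.g. as sketched above
        self.dab_in_view = dab_in_view    # visibility test, e.g. as sketched above
        self.ball_tracker = ball_tracker  # optional ball position source
        self.last_update_time = 0.0       # record of past updates and timings

    def _veto(self):
        """Step 308: over-riding reasons not to issue an update instruction."""
        if time.time() - self.last_update_time < self.MIN_UPDATE_INTERVAL_S:
            return True
        if (self.ball_tracker is not None
                and self.ball_tracker.ball_stationary_on_penalty_spot()
                and self.dab.behind_goal):
            return True  # do not distract the player taking the penalty kick
        return False

    def step(self):
        pan, tilt, zoom = self.camera.read_pan_tilt_zoom()                  # step 302
        cone = self.compute_cone(self.camera.position, pan, tilt, zoom)     # step 304
        if self.dab_in_view(cone, self.dab):                                # step 306
            if not self._veto():                                            # step 308
                self.dab.update()                                           # step 310
                self.last_update_time = time.time()
```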
Variations and enhancements to the first embodiment
Figure 5 is a flow chart describing the operation of the invention, when applied to a system comprising multiple cameras and multiple DABs. In step 402, pan, tilt and zoom data is sensed for each camera, and the view-cone of each is determined in step 404 and communicated to the DAB control centre.
In step 405, processor 200 determines which DABs are within the view-cones of which cameras, and stores this data as an array, and checks that the array is not empty in step 406. If there are no DABs within any view-cones, then the processing ends and reiterates from step 402. In step 407, the processor analyses the array to reach initial decisions about which DABs to instruct to update, based on stored rules and data. This analysis may involve assigning values to the different cameras, possibly based on real-time data input from the audiovisual editing studio as to which cameras are 'live', and possibly on stored heuristic rules as further described in the next paragraph. The analysis can, if desired, also take account of recent DAB update history. Once a provisional decision is reached about which DAB(s), if any, to update, step 407 ends. In step 408, the processor analyses whether there is any over-riding reason not to issue image-update instructions to any of the DABs provisionally selected for update. In step 410, for those DABs for which the decision was not to veto, processor 200 instructs the DABs to update. In one embodiment, the processor limits the number of change commands issued, and therefore may suppress change commands to some DABs on the provisional list.
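One illustrative rendering of this Figure 5 cycle (steps 402-410) is given below. The relevance, veto and visibility callables, the cap of two simultaneous change commands, and the method names on the camera and DAB objects are all invented for the sketch.

```python
def multi_camera_cycle(cameras, dabs, dab_in_view, relevance, veto, max_changes=2):
    """One pass through the Figure 5 flow, with hypothetical helper callables.

    relevance(camera) returns a priority score (see the heuristics discussed
    below), veto(dab) applies the over-riding rules of step 408, and
    max_changes caps the number of change commands issued in one cycle.
    """
    # Steps 402-405: build the array of which DABs are in which view-cones.
    visibility = [(camera, dab)
                  for camera in cameras
                  for dab in dabs
                  if dab_in_view(camera.current_view_cone(), dab)]
    if not visibility:                      # step 406: nothing in view, reiterate
        return []

    # Step 407: provisional selection, DABs seen by the highest-priority cameras first.
    provisional = sorted({dab for _, dab in visibility},
                         key=lambda dab: max(relevance(cam)
                                             for cam, d in visibility if d is dab),
                         reverse=True)

    # Steps 408-410: apply vetoes and the cap on simultaneous change commands.
    updated = []
    for dab in provisional:
        if len(updated) >= max_changes:
            break
        if not veto(dab):
            dab.update()
            updated.append(dab)
    return updated
```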
Where multiple cameras are in use, it becomes desirable for the processor to have the best available information as to which cameras are currently 'relevant', meaning that they are recording events which are being viewed live, or which will or may be viewed later. In certain circumstances (such as events broadcast live, via a set of cameras under the unitary control of one editor, employing systems to switch between cameras which can output electronic data to the computer), it may be possible and practicable to supply the computer with precise data about which camera is 'relevant' at every moment throughout an event. Otherwise, it will be practicable to create heuristic rules for use by the computer in assessing how to prioritise cameras. In a simple form, these rules might assign higher relevance to cameras in specific (e.g. nearer, more central) locations, and/or to cameras which are moving (panning and tilting) at certain rates or in certain patterns. For example, if the event is a soccer match, certain events like corner kicks would generate camera movements which follow a discernible pattern (that is, several cameras are demonstrably tracking the same object, and that object is travelling in from the corner toward the goal-mouth). In such cases, it is possible to identify one camera as the most commonly televised, and assign higher priority to that camera. It might also be the case that such patterns indicate that a moderately high relevance should be assigned to numerous other cameras, because the recordings of those other cameras will be shown as well, as action replays. In more complex form, image analysis or other technology might be used to determine whether or not the camera is viewing a specific object (such as a soccer ball) and, if so, from how far away.
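A heuristic relevance score of this kind might, purely as an illustration, be combined as follows; every weight and attribute name here is an assumption rather than something specified in the description.

```python
def camera_relevance(camera, live_camera_id=None):
    """Illustrative heuristic priority score for a camera (all weights invented).

    Precise 'which camera is live' data dominates when available; otherwise
    location and movement-pattern heuristics of the kind described above apply.
    """
    score = 0.0
    if live_camera_id is not None and camera.id == live_camera_id:
        score += 100.0                      # the editor says this camera is on air
    score += {"central": 20.0, "near": 10.0}.get(camera.location_class, 0.0)
    if camera.is_tracking_shared_object:    # e.g. several cameras following the ball
        score += 15.0
    return score
```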
Although it is desirable in multi-camera systems to have good information about the relative (and absolute) priorities of the cameras over time, it is - in most configurations as to the number of cameras etc. - not fatal to the usefulness of the invention, because a positive effect on the noticeability of advertisements will still occur even if the computer cannot determine precisely which footage will eventually be seen by the audience. DAB changes will occur, and while many will be redundant, many others will be seen. In such cases, it might be appropriate to vary the other instructions given to the computer so as to raise in general the number of 'change' instructions issued to DABs.
It is feasible for the computer to take note of the speed with which a camera is moving (panning and tilting) and to inhibit change instructions to DABs in view of that camera, if it is predictable that the DAB will be out of view (or partly out of view, or about to be out of view) by the time the DAB actions the instruction to change, or if the relevant advertisements require more time for digestion by the audience.
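A rough, one-dimensional (pan-only) version of this prediction is sketched below; the parameter names and the linear extrapolation are assumptions made for the example.

```python
def should_inhibit_for_motion(pan_rate_deg_s, dab_bearing_deg, cone_half_angle_deg,
                              actuation_delay_s, min_dwell_s=0.0):
    """Predict whether the DAB would still be in view once it has actioned a change.

    Pan only, and it pessimistically assumes the camera is panning away from the
    DAB: if the DAB's bearing will have left the cone before the change completes
    (plus any required dwell time for the advertisement), inhibit the instruction.
    """
    margin_deg = cone_half_angle_deg - abs(dab_bearing_deg)  # angular distance to the cone edge
    if margin_deg <= 0:
        return True                                          # already (partly) out of view
    time_until_out = margin_deg / max(abs(pan_rate_deg_s), 1e-6)
    return time_until_out < actuation_delay_s + min_dwell_s
```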
It is feasible, when using this invention, to operate new forms of advertising, in which the message to be communicated to the audience relies on the fact that DAB-updates can be linked to events or otherwise controlled.
Advertising is a highly creative field, but examples of how this might occur include: (i) At a soccer event, there are scrolling DABs with 3 panels each around the ground. One sponsor of a soccer team reserves all panels on certain DABs near the goal areas, on two of which appears a neutral slogan, and on the third a message thought suitable for display when the sponsored team is moving forward in attack. (During each half, the 'attack' message near the team's own goal area is unused.) When the team surges forward, the audience is shown the 'attack' message, which is therefore additionally effective because it is synchronized with suitably reinforcing live events.
(ii) The computer is linked to an audio output system, which plays an audio track designed to work in concert with the change of certain displays.
This operates best when there is a single camera or definitive information as to which camera is live at any moment. As a particular visual message scrolls into view, the audio track is played in synchronization with it (by which we mean in time with it or, with a controlled time lag, before or after it); a sketch of such timed triggering is given after these examples. The track might be output as an audio signal and digitally overlaid onto an audio track being recorded or otherwise prepared for use with the visual images obtained from the television cameras, or it might be fed as an audio signal into loudspeakers at the venue so as to form part of the ambient sound and hence be captured in any such sound recording being so prepared. The track might be words, a jingle, or a sound effect. For example, every time (or at least on some occasions) a fizzy drink advertisement comes into view (whether scrolling or not), there is a background sound of a bottle opening and fizzing enticingly.
(iii) Where two DABs are frequently in view together (to the same camera) or are frequently or invariably seen in a certain sequence, their changes may be co-ordinated (i.e. synchronized, by which we mean in time with each other or with a controlled lag between them) to relay a particular message.
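The timed triggering mentioned in example (ii) might look something like the following sketch, where the audio_player and dab interfaces, and the sign convention for the lag, are assumptions of the example; the same call could equally route the track to venue loudspeakers or to a broadcast overlay.

```python
import threading


def trigger_audio_with_display(dab, audio_player, track, lag_seconds=0.0):
    """Play an audio track in synchronisation with a DAB change (sketch only).

    A negative lag starts the track before the change, a positive lag after it;
    audio_player.play(track) and dab.update() are assumed interfaces.
    """
    if lag_seconds < 0:
        audio_player.play(track)
        threading.Timer(-lag_seconds, dab.update).start()
    else:
        dab.update()
        threading.Timer(lag_seconds, audio_player.play, args=(track,)).start()
```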
When the invention is used, it will become possible to introduce new methods of pricing or quantifying advertising which more accurately reflect the benefits given or received. For example, it will be possible to count 'mass audience visibility' or even 'mass-audience displays', and sell these instead of seconds of 'in principle' visibility. Advertisements sharing one display device would no longer be displayed for fixed periods.
This invention involves information processing relating both to camera systems (which are typically owned and operated by television and broadcast entities) and DAB systems (which are typically owned and operated by sports clubs or venue operators). Given that much of the specific pre-programmed information required relates to the physical characteristics of the venue and the kinds of event which take place there, it would seem prima facie simpler for venue operators to control the computing functions, and receive feeds from cameras recording events on specific occasions, and this is the approach taken in this description, i.e. where multiple cameras and DABs exist, the cameras all singly feed data to a unitary DAB control system. There is no need for the invention to be exploited this way, and it could be operated either independently or with the central control unit perceived as an adjunct to the camera and recording technologies, from which messages are sent as required to individual DABs.
It is feasible that the computer-automated functions of determining when particular DABs should change display could be over-ridden manually for particular reasons. Thus a person watching the image could intervene and control the DABs directly. For example, where a DAB change behind a goal in a soccer game is deferred while a penalty kick is taken, and then permitted, it would be advantageous (as a further example of inventive advertising) to select between two or more alternative messages to be displayed in such situations.
While in principle the determination as to whether the penalty resulted in a goal or not could be automated (based on noise levels, image analysis, a data feed from the referee, etc.), it might be simpler to allow an individual to make an instant decision and activate a change to the DABs behind the goal which differs according to the penalty outcome. In an international match where an advertisement is primarily aimed at one nation, that advertisement will perhaps be more effective if linked to a successful event (score or save) than the reverse.
It is usual but not essential for each individual advertisement to be a still image. It is quite feasible that an individual element of advertising material could involve motion, for example it could be a short sequence of video footage (displayed on an electronic display device, or by projection on a screen), or a sequence of two or more still images, intended to be displayed in order.
It is feasible to use the invention with mobile DABs. In such a case, the processor 200 needs to receive information about the position and orientation of DABs as this data changes.
It is usual but not essential that the same DABs are visible to the physically present audience and to the television audience. It would be feasible in some circumstances to position DABs so that they are capable of being seen only by the cameras.
Note that, whilst the above description relates to dynamic advertising media, the invention may be applied to other types of dynamic media, such as information display boards, laser projection apparatus, public address systems, etc. Further features and advantages of the invention will become apparent from the following description of preferred embodiments of the invention, given by way of example only, which is made with reference to the accompanying drawings.

Claims (9)

  1. A method of controlling a display system at a live event, the system comprising a dynamic display apparatus, the method comprising monitoring signals indicative of characteristics of a field of view of a camera at the live event, and controlling the dynamic display apparatus in response to the monitoring.
  2. A method according to claim 1, wherein the dynamic display apparatus comprises two or more printed images, which are selectively displayable.
  3. A method according to claim 1, wherein the dynamic display apparatus comprises an electronic display device capable of displaying different images in dependence upon an image signal input into the device.
  4. A method according to any preceding claim, wherein the display apparatus comprises first and second dynamic display devices, and the method comprises controlling the first and second dynamic display devices so as to synchronise changes in displays provided by the respective devices.
  5. A method according to any preceding claim, wherein the system comprises an audio signal generator, and the method comprises controlling the audio signal generator so as to synchronise changes in an audio signal provided by the audio signal generator with changes in a display provided by the dynamic display apparatus.
  6. A method according to claim 5, wherein the method comprises recording an audio signal at the live event and combining the audio signal output by the audio signal generator with the recorded audio signal.
  7. A method according to claim 5, wherein the method comprises outputting the audio signal using a loudspeaker at the live event.
  8. A display system for use at a live event, the system comprising a dynamic display apparatus, means for receiving signals indicative of characteristics of a field of view of a camera at the live event, and means for controlling the dynamic display apparatus in response to the monitoring.
  9. A method of controlling a display system at a live event, the system comprising a dynamic display apparatus and an audio signal generator, the method comprising controlling the audio signal generator so as to synchronise changes in an audio signal provided by the audio signal generator with changes in a display provided by the dynamic display apparatus.
GB0326391A 2003-11-12 2003-11-12 Controlling a dynamic display apparatus Withdrawn GB2408164A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0326391A GB2408164A (en) 2003-11-12 2003-11-12 Controlling a dynamic display apparatus
PCT/GB2004/004782 WO2005048225A1 (en) 2003-11-12 2004-11-12 Methods of, and systems for, controlling a display apparatus and related audio output

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0326391A GB2408164A (en) 2003-11-12 2003-11-12 Controlling a dynamic display apparatus

Publications (2)

Publication Number Publication Date
GB0326391D0 GB0326391D0 (en) 2003-12-17
GB2408164A true GB2408164A (en) 2005-05-18

Family

ID=29726414

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0326391A Withdrawn GB2408164A (en) 2003-11-12 2003-11-12 Controlling a dynamic display apparatus

Country Status (2)

Country Link
GB (1) GB2408164A (en)
WO (1) WO2005048225A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5003293A (en) * 1989-10-02 1991-03-26 Compunic Electronics Co., Ltd. Billboard with audio message spreading function
US5510828A (en) * 1994-03-01 1996-04-23 Lutterbach; R. Steven Interactive video display system
JPH07312712A (en) * 1994-05-19 1995-11-28 Sanyo Electric Co Ltd Video camera and reproducing device
JPH08171413A (en) * 1994-12-16 1996-07-02 Fuji Facom Corp Monitor controller using video and audio
FR2730837B1 (en) * 1995-02-22 1997-05-30 Sciamma Dominique REAL OR DELAYED INSERTION SYSTEM FOR VIRTUAL ADVERTISING OR INFORMATION PANELS IN TELEVISION PROGRAMS
AU5457598A (en) * 1996-11-27 1998-06-22 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition
KR20010074218A (en) * 2001-04-11 2001-08-04 김도균 A Sign Board to say with Voice storeage system

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4806924A (en) * 1984-06-29 1989-02-21 Daniel Giraud Method and system for displaying information
US4814800A (en) * 1988-03-16 1989-03-21 Joshua F. Lavinsky Light show projector
JPH04129062A (en) * 1990-09-18 1992-04-30 Fujitsu Ltd Video and audio simultaneous output device
US6297853B1 (en) * 1993-02-14 2001-10-02 Orad Hi-Tech Systems Ltd. Apparatus and method for detecting, identifying and incorporating advertisements in a video image
US5461596A (en) * 1993-10-26 1995-10-24 Eastman Kodak Company Portfolio photo CD visual/audio display system
JPH07199879A (en) * 1993-12-28 1995-08-04 Hitachi Eng Co Ltd Device and method for sound corresponded image display
US5562459A (en) * 1994-01-07 1996-10-08 Durlach; David M. Dynamic three dimenional amusement and display device
US5731846A (en) * 1994-03-14 1998-03-24 Scidel Technologies Ltd. Method and system for perspectively distoring an image and implanting same into a video stream
WO1997034283A1 (en) * 1996-03-13 1997-09-18 Regen Howard W Improved audio-visual sign
JP2000341642A (en) * 1999-05-25 2000-12-08 Hitachi Ltd Synchronization processor for digital picture and sound data

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9047256B2 (en) 2009-12-30 2015-06-02 Iheartmedia Management Services, Inc. System and method for monitoring audience in response to signage
US9373123B2 (en) 2009-12-30 2016-06-21 Iheartmedia Management Services, Inc. Wearable advertising ratings methods and systems
FR2959339A1 (en) * 2010-04-26 2011-10-28 Citiled METHOD FOR CONTROLLING AT LEAST ONE DISPLAY PANEL OF VARIABLE IMAGES IN A PLACE SUCH AS A STAGE
WO2011135241A1 (en) * 2010-04-26 2011-11-03 Citiled Method of controlling at least one panel for displaying changing images in a location such as a stadium
KR20190067221A (en) * 2016-10-14 2019-06-14 우니크페에드 아게 A system that dynamically maximizes the contrast between the foreground and background of an image and / or image sequence
CN109844861A (en) * 2016-10-14 2019-06-04 尤尼克费伊德股份公司 For the maximized system of dynamic contrast between the foreground and background in image or/and image sequence
WO2018069219A1 (en) * 2016-10-14 2018-04-19 Uniqfeed Ag System for dynamically maximizing the contrast between the foreground and background in images and/or image sequences
JP2019534527A (en) * 2016-10-14 2019-11-28 ユニークフィード アーゲー System for dynamically maximizing contrast between foreground and background in an image and / or image sequence
US10740905B2 (en) 2016-10-14 2020-08-11 Uniqfeed Ag System for dynamically maximizing the contrast between the foreground and background in images and/or image sequences
US10805558B2 (en) 2016-10-14 2020-10-13 Uniqfeed Ag System for producing augmented images
US10832732B2 (en) 2016-10-14 2020-11-10 Uniqfeed Ag Television broadcast system for generating augmented images
CN109844861B (en) * 2016-10-14 2021-01-08 尤尼克费伊德股份公司 System for dynamic contrast maximization between foreground and background in images or/and image sequences
KR102208733B1 (en) 2016-10-14 2021-01-28 우니크페에드 아게 A system that dynamically maximizes the contrast between the foreground and background of an image and/or sequence of images

Also Published As

Publication number Publication date
WO2005048225A1 (en) 2005-05-26
GB0326391D0 (en) 2003-12-17

Similar Documents

Publication Publication Date Title
US10609308B2 (en) Overly non-video content on a mobile device
US11087135B2 (en) Virtual trading card and augmented reality movie system
CN110636324B (en) Interface display method and device, computer equipment and storage medium
US9762817B2 (en) Overlay non-video content on a mobile device
CN110249631B (en) Display control system and display control method
AU2001283437B2 (en) Method and system for measurement of the duration an area is included in an image stream
US6269173B1 (en) Instant response broadcast board system and method
JP4794453B2 (en) Method and system for managing an interactive video display system
US6750919B1 (en) Event linked insertion of indicia into video
US20100013738A1 (en) Image capture and display configuration
WO2015049810A1 (en) Multi-viewpoint moving image layout system
CN101324945A (en) Advertisement selection method and system for determining time quantity of player for consumer to view advertisement
CA2980501C (en) Method and system for presenting game-related information
US20100225643A1 (en) Controlling a Three-Dimensional Virtual Broadcast Presentation
WO2021187529A1 (en) Information processing device, display system, and display control method
US20230353717A1 (en) Image processing system, image processing method, and storage medium
US20060100930A1 (en) Method and system for advertising
US20110141359A1 (en) In-Program Trigger of Video Content
GB2408164A (en) Controlling a dynamic display apparatus
US8587667B2 (en) Beyond field-of-view tracked object positional indicators for television event directors and camera operators
KR102195053B1 (en) Television broadcast system for generating augmented images
US20050001920A1 (en) Methods and apparatuses for managing and presenting content through a spherical display device
JP2007049661A (en) Advertisement apparatus and method employing retrieval technique in moving image distribution service
NO320345B1 (en) An intelligent system for integrated television communication and advertising, especially related to events and socio-economic targeting of the viewers.
RU2371781C1 (en) Interactive projection information delivery system

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)